DHS Office on Accessible Systems and Technology

Guide for Accessible Software Applications

Version 1.0

October 2006

Introduction

Purpose

Instructions

Step 1 – Identify and Record the functional tests

Step 2 – Perform Accessibility Test

Step 3 – Scoring Results and Reporting

Assumptions

Possible test Outcomes

Testing methods

Basic Navigation

System Settings

Visual Display

Technical Assessments

Basic Navigation

1194.21 (a) – Keyboard Access

1194.21 (c) – Focus

1194.21 (d) – Interface Elements

1194.21 (e) – Bitmap Images

1194.21 (f) – Text Information

1194.21 (l) – Electronic Forms

System Settings

1194.21 (b) – Accessibility Features

1194.21 (g) – User Selected color and contrast settings

Visual Display

1194.21 (h) – Use of Animation

1194.21 (i) – Color Coding

1194.21 (j) – Color and Contrast Adjustments

1194.21 (k) – Screen Flicker

Template A and B Instructions

Attachment A – Functional tasks and scores

Attachment B – Screen Level Review

Attachment C – Keyboard Commands

Keyboard shortcuts for Windows*

Windows system key combinations

Windows program key combinations

Mouse click/keyboard modifier combinations for shell objects

General keyboard-only commands

Shell objects and general folder/Windows Explorer shortcuts

General folder/shortcut control

Windows Explorer tree control

Properties control

Accessibility shortcuts

Microsoft Natural Keyboard keys

Microsoft Natural Keyboard with IntelliType software installed

Dialog box keyboard commands

Microsoft Internet Explorer commands

Introduction

Section 508 of the Rehabilitation Act of 1973, as amended in 1998, requires that the Federal government provide Federal employees and members of the public with disabilities access to, and use of, information and data comparable to that provided to those without disabilities. Information and data are most often accessed or used through electronic and information technology, such as a web site, software application or operating system, telecommunications device, video or multimedia product, stand-alone electronic equipment, or desktop or portable computer. People with disabilities often rely on assistive technology products, such as screen readers for the blind, speech input applications for those with motor control or speech or communications disabilities, or teletypewriters (TTYs) for the deaf or hard-of-hearing, to access information and data via these electronic and information technology systems.

The U.S. Access Board developed functional performance criteria and supporting technical standards for electronic and information technologies to facilitate implementation of this law. These functional performance criteria and technical standards were adopted into the Federal Acquisition Regulation (FAR) and are codified at 36 CFR 1194.21–26, 1194.31, and 1194.41. The functional performance criteria require that electronic and information technologies be usable by people who may be blind or have low vision, are deaf or hard-of-hearing, have limited motor control or dexterity, or have speech or communications disabilities. These are the core requirements for achieving compliance with Section 508 of the Rehabilitation Act of 1973.

What requirements apply to Software Applications?

The functional performance criteria supporting users with disabilities, defined in 1194.31, are the core requirement for satisfying Section 508. Six specific technical standards were defined to help meet the functional performance requirements. Selecting the applicable technical standards is therefore critical to meeting Section 508. This guideline is intended to address both traditional “Software” applications and non-traditional “Web” applications that perform operations and functions similar to those of traditional software.

Because software is developed in a multitude of programming languages, specific code examples and techniques are not provided. The guide remains a good source for determining many accessibility features, but it is recommended that a subject matter expert use assistive technology to navigate and inspect the application or site during testing.

Assessing Web-based applications that use technical approaches such as Macromedia Flash, Asynchronous JavaScript and XML (AJAX), Dynamic Hypertext Markup Language (DHTML), or other dynamic interfaces requires the selection of both “Web” and “Software” technical standards to meet the functional performance criteria.

Purpose

This guide is intended to provide a process to determine the Section 508 compliance of software items, along with a risk-based impact assessment of any compliance gaps found for people with disabilities. Understanding the Section 508 compliance level is needed when evaluating items for purchase, deployment, usage, or maintenance. Understanding the impact of Section 508 compliance gaps is critical to identifying the disabilities affected by those gaps.

In further detail, this guide provides a process and documentation template to identify the applicable Section 508 standards and the item’s compliance with each of them. The guide also provides an impact assessment template to document which functional performance criteria are affected by gaps in technical compliance, and therefore which people with disabilities are most likely to lack access because of those gaps.

The testing process and best practices provided here are a guide, but they shall be applied to each software application screen or applicable Web page individually to ensure accurate results.

Instructions

Completing the testing in this guideline requires creating and executing functional tests using the technical assessments found later in this document. After completing the technical assessments, the template documents located in the appendix can be used to record results and determine the level of accessibility of a particular application. The process is separated into three steps: identifying and recording the functional tests, performing the technical assessments, and recording the results.

The following terms are used within this document and have the following meanings:

Functional test – A test case created by the auditor. Test cases are closely related to the actual requirements of an application; they are designed to determine whether the application’s intended use can be equally accessed by users with disabilities.

Technical assessments – The process described in this document that provides guidance on applying the requirements of Section 508 to the functional tests.

At the completion of the guided process, the following documents will have been produced:

·  One attachment “A” document listing the “Task ID”, “Task Description”, “Score”, and whether the task was determined to be “Critical” for each task created, as well as an impact summary.

·  Multiple attachment “B”s, one for each functional task (“Task ID”) recorded on attachment “A”.

Step 1 – Identify and Record the functional tests

Functional tasks shall be drafted by the auditor based on the “purpose” of the application. The purpose of the application can be found in procurement documents such as the “system requirements” or “acceptance test criteria”. When these documents are not available, other resources may exist within the application itself, such as the topics in a “help” system. When no documented resources are available, manually navigating the application and interpreting what it does is a method to help determine the functional tasks. In summary, this order is suggested when creating the functional tests:

·  Procurement documents (requirements, acceptance test criteria, use-case scenarios)

·  Application’s help system or pre-defined application map or site map

·  Manual review of the available features

In addition to the functional tasks identified, this document relies on the existing test methodologies of the group conducting the test. The test methodologies should, at a minimum, have a defined process to execute, record, and evaluate results so that they are measurable and repeatable.

Once functional tasks are identified, attachment “A” should be used to record each of these tasks with an assigned task ID. Examples of some functional tasks for a fictitious time & attendance application are as follows (a sketch of recording them appears after the list):

·  Ability to log-in to the application

·  Ability to enter daily, weekly or monthly time activity

·  Ability to record notes associated with each time entry

·  Ability to save the information entered into the system

·  Ability to electronically sign the time-sheet

·  Ability to forward the information to a supervisor for approval
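
Although this guide does not provide language-specific techniques, a minimal sketch of recording these tasks may help illustrate Step 1. The sketch below is in Python; the field names and structure are illustrative assumptions, not part of the official attachment “A” template.

    # Hypothetical sketch of attachment "A" rows; field names are
    # illustrative assumptions, not prescribed by this guide.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class FunctionalTask:
        task_id: str                   # unique task ID (Step 1)
        description: str               # functional task description (Step 1)
        critical: bool = False         # critical assessment checkbox (Step 1)
        score: Optional[float] = None  # transferred from attachment "B" (Step 3)

    tasks = [
        FunctionalTask("T01", "Log in to the application", critical=True),
        FunctionalTask("T02", "Enter daily, weekly, or monthly time activity"),
        FunctionalTask("T03", "Record notes associated with each time entry"),
        FunctionalTask("T04", "Save the information entered into the system"),
        FunctionalTask("T05", "Electronically sign the time-sheet"),
        FunctionalTask("T06", "Forward the time-sheet to a supervisor"),
    ]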

Once these tasks are recorded on attachment “A”, each functional task shall be identified as “critical” or “not critical”. The auditor shall determine whether the identified functional task is something that “must exist” to use the application by asking the question, “If I eliminated this particular feature, would the application still be useful to people in their daily work routine?” For instance, a log-in screen would almost always be critical, but the ability to add customized fonts to a report may not be. The following are examples of questions that are likely to indicate a critical task:

·  Is the function required to enter or exit the application?

·  Is the function an expected “daily task” (save, load, update, read, share, search etc.)?

·  Is the application being distributed or made public?

·  Is the application being deployed throughout DHS internally?

Identifying non-critical tasks often means identifying functions that are used infrequently or only under certain conditions. This does not mean that violations in such tasks cease to be violations; it simply sets the level of importance. The following list offers guidance on determining common “non-critical” status:

·  The application is used primarily by a group of users who are not permitted to have severe disabilities due to the nature of their work, such as sworn law enforcement personnel, military personnel, or inspectors monitoring cameras. When assessing this, a good threshold is that at least ninety percent (90%) of the normal users fall into this special category. As with most applications, support staff usually exist and pose a risk; law enforcement is not an exception to Section 508, but it carries a reduced level of risk for non-accessibility.

·  The feature that is not accessible is used for “advanced” configuration or settings, rather than being a feature normally used in the day-to-day operations of the application or specific to the user.

In summary, Step 1 will require the following two entries on Attachment A:

·  The functional task description and unique task ID

·  Critical assessment (Checkbox available for “critical” task)

Step 2 – Perform Accessibility Test

Each of the functional tasks identified shall be performed by the auditor while following the technical assessments described in this document. Each functional task shall have at least one associated attachment “B” used to record the results of the technical assessments. If multiple screens or pages are encountered during one functional task, the auditor may want to use several attachment “B”s to record and localize any problems found, which can assist remediation efforts.

Attachment “B” is organized according to the technical assessments identified in this document. The technical assessments are divided into three groups:

·  Basic navigation

·  System settings

·  Visual display

The results of each technical assessment shall be entered on attachment “B” as compliant, non-compliant, or not applicable.
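
As a hedged illustration of this record-keeping (the structure below is an assumption, not a prescribed format; only the three result values come from this guide), one attachment “B” per functional task might be captured as:

    # Hypothetical sketch: one attachment "B" per functional task, with one
    # result recorded per technical assessment.
    from enum import Enum

    class Result(Enum):
        COMPLIANT = "compliant"
        NON_COMPLIANT = "non-compliant"
        NOT_APPLICABLE = "not applicable"

    # Results for one functional task, keyed by the 1194.21 paragraph assessed.
    attachment_b = {
        "1194.21(a)": Result.COMPLIANT,       # keyboard access
        "1194.21(c)": Result.COMPLIANT,       # focus
        "1194.21(d)": Result.NON_COMPLIANT,   # interface elements
        "1194.21(l)": Result.COMPLIANT,       # electronic forms
        "1194.21(h)": Result.NOT_APPLICABLE,  # no animation encountered
    }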

In summary, step 2 requires the following:

·  Perform technical assessments on each functional test.

·  Record the results for each functional test on an individual attachment “B” as compliant, non-compliant, or not applicable.

·  Determine the impact of any failed critical task and update the impact summary on Attachment “A”, listing the status as “Red” and identifying the disabilities affected by the failure.

Step 3 – Scoring Results and Reporting

On attachment “B” there is an area designed to assist with calculating a “score” for the individual functional task. The score compares the number of assessments that were compliant against the number of applicable standards.
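
For example, a task with four applicable standards and three compliant findings scores 3/4, or 75%. The short sketch below illustrates the calculation; expressing the score as a percentage is an assumption about the template’s scoring area, not a requirement of this guide.

    # Hypothetical scoring sketch: compliant assessments divided by the
    # applicable standards ("not applicable" results are excluded).
    def score(results):
        applicable = [r for r in results.values() if r != "not applicable"]
        compliant = [r for r in applicable if r == "compliant"]
        return len(compliant) / len(applicable) if applicable else 1.0

    results = {
        "1194.21(a)": "compliant",
        "1194.21(c)": "compliant",
        "1194.21(d)": "non-compliant",
        "1194.21(l)": "compliant",
        "1194.21(h)": "not applicable",
    }
    print(f"{score(results):.0%}")  # -> 75%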

The score shall then be transferred to attachment “A” in the column “Score from Attachment B”. The final part of each assessment is the impact determination, which flags any critical failure and identifies the disabilities likely affected by it. Once all the functional tests are complete and recorded on the multiple attachment “B”s, the scores are transferred to attachment “A”, and the impact determinations are made, the test process is complete. If no critical failures were encountered across all the tests, the impact determination shall be completed on attachment “A” by selecting either full compliance (Green) or less than full compliance with no critical failures (Yellow). Attachment “A” is then ready for use as a snapshot of the Section 508 compliance of the application.
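
The Red/Yellow/Green determination described above can be summarized in a short sketch. The data shapes are assumptions; the color logic follows Steps 2 and 3.

    # Hypothetical sketch of the impact summary status. `tasks` maps a task
    # ID to a (score, critical) pair drawn from attachments "A" and "B".
    def overall_status(tasks):
        if any(critical and score < 1.0 for score, critical in tasks.values()):
            return "Red"     # a critical task failed (Step 2)
        if all(score == 1.0 for score, _ in tasks.values()):
            return "Green"   # full compliance
        return "Yellow"      # compliance gaps, but no critical failures

    print(overall_status({"T01": (1.0, True), "T02": (0.75, False)}))  # Yellow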

In summary, step 3 will require the following actions:

·  Calculate the score on attachment “B” by comparing the number of compliant findings against the number of applicable standards.

·  Transfer scores from attachment “B” to the attachment “A” column titled “Score from Attachment B”.

·  Update the impact summary on Attachment A.

The end deliverable consists of the following documents:

·  Attachment “A”, listing all functional tasks with their scores, “critical” task identification, and an impact summary for the affected disabilities.

·  Multiple attachment “B”s, with at least one for each functional task.

Assumptions

Special considerations should be identified when applying the Software standards to Web-based applications that use techniques such as Asynchronous JavaScript and XML (AJAX) or technologies such as Flash. Web-based applications like these are in the unique position of having two technical standards (Web and Software) that must be followed to satisfy the required Functional Performance Criteria (1194.31). This guideline relies on the Software standard as the primary method of evaluating these types of applications, and any similar standards should be mapped accordingly. Similar standards and mappings include the use of color, screen flicker, and input fields. The following table maps these requirements:

Standard                             Web          Software (prevails)
Use of color to convey information   1194.22(c)   1194.21(i)
Screen flicker                       1194.22(j)   1194.21(k)
Form fields and input controls       1194.22(n)   1194.21(l)
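
For record-keeping during an assessment, this mapping might be captured as a simple lookup table (an illustrative sketch; the dictionary form is an assumption):

    # The prevailing Software (1194.21) provision for each similar Web
    # (1194.22) provision, per the table above.
    WEB_TO_SOFTWARE = {
        "1194.22(c)": "1194.21(i)",  # use of color to convey information
        "1194.22(j)": "1194.21(k)",  # screen flicker
        "1194.22(n)": "1194.21(l)",  # form fields and input controls
    }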

In all instances where the term “Software Application” is used, it applies equally to traditional software applications and to “Web” applications that have been determined to be dynamic, to use AJAX, or to rely on proprietary technology such as Flash.