
Accessibility Testing in AMY

Automated Testing

The automated tests aim to cover one page for each template in amy/templates (including base templates and the includes subfolder). New templates should have a corresponding URL added to the .pa11yci config file; templates added since the last update to .pa11yci are therefore not tested.
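For reference, an entry in .pa11yci might look like the sketch below. The URLs and defaults shown here are illustrative assumptions, not copied from the real config file:

```json
{
  "defaults": {
    "timeout": 30000
  },
  "urls": [
    "http://127.0.0.1:8000/workshops/admin-dashboard/",
    "http://127.0.0.1:8000/your/new/template/url/"
  ]
}
```

Adding a new template's URL to the "urls" array is enough for it to be picked up on the next run.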

Workflow

The GitHub Actions workflow includes options for testing with both pa11y and Google Lighthouse. In both cases, the report can be found in the artifacts attached to the workflow run. The report contains one HTML file for each page covered by the tests, plus a summary file (something like index.html).

Local testing

See the AMY README for instructions on running the tests on your own machine.
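The README is the authoritative reference, but a local run can be sketched roughly as follows (assuming Node.js is installed and the dev server is already running on port 8000; the example page URL is an assumption):

```shell
# Run the whole pa11y-ci suite using the repository's config file
npx pa11y-ci --config .pa11yci

# Or spot-check a single page with pa11y or Lighthouse
npx pa11y http://127.0.0.1:8000/workshops/admin-dashboard/
npx lighthouse http://127.0.0.1:8000/workshops/admin-dashboard/ \
  --output html --output-path ./report.html
```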

Limitations

The following pages cannot be effectively tested automatically at the moment. Problems include:

  • page inaccessible to a logged-in admin (`/terms/action_required/`)
  • required objects unavailable (e.g. no open signups for instructors to view)
  • file upload required (bulk uploads)
  • domain/slug required in the URL, but organizations and events are randomly generated
```
"http://127.0.0.1:8000/terms/action_required/",
"http://127.0.0.1:8000/dashboard/instructor/teaching_opportunities/<int:recruitment_pk>/signup",
"http://127.0.0.1:8000/requests/bulk_upload_training_request_scores/",
"http://127.0.0.1:8000/requests/bulk_upload_training_request_scores/confirm/",
"http://127.0.0.1:8000/fiscal/organization/<str:org_domain>/",
"http://127.0.0.1:8000/workshops/event/<slug:slug>/",
"http://127.0.0.1:8000/workshops/event/<slug:slug>/review_metadata_changes/",
"http://127.0.0.1:8000/workshops/event/<slug:slug>/delete/",
"http://127.0.0.1:8000/workshops/event/<slug:slug>/edit/",
"http://127.0.0.1:8000/workshops/persons/bulk_upload/confirm/",
"http://127.0.0.1:8000/workshops/event/<slug:slug>/validate/",
```

Manual Testing Workflow

This assumes you are testing in Google Chrome on a Windows computer. Testing in other browsers and operating systems is also possible (and encouraged), but some tools and features may differ (e.g. Device Mode). The AMY team are not accessibility experts, so this workflow is likely to evolve over time.

  1. Bookmark the Web Content Accessibility Guidelines 2.1, which we'll be using as a reference throughout.
  2. Create a spreadsheet which lists all the criteria, so you can mark each one passed or failed. Alternatively, use the WCAG-EM Report Tool, but note that it won't save your data if you close the page.
  3. Install the WAVE and axe DevTools browser extensions (the Pro version of axe DevTools is not needed).
  4. Run WAVE and axe DevTools on the target page. Where there are failures, the extensions will state the associated WCAG criterion; mark that criterion as failed.
  5. Run Lighthouse and/or Pa11y on a local version of the page (see Run accessibility tests locally), and note any failures. These should broadly match the output of axe DevTools, although each tool differs slightly in the information it provides.
  6. Step through the A11y Project Checklist and note any failures. Each checklist item is matched to a WCAG criterion.
  7. Use the Nu Html Checker to validate the HTML. You can do this by setting the input to 'Check by text input' and copy-pasting the test page's source code into it. Note: don't do this with the DJDT panel enabled, as it makes the source code much larger and the checker will crash. (WCAG criterion: 4.1.1 Parsing)
  8. Use the Device Mode features in Chrome DevTools to simulate a mobile viewport. Use the 'Mobile S - 320px' width preset (on the bar just below the dimensions). Check if the page scrolls in two dimensions (bad) and if any content or functionality is lost. (WCAG criteria: 1.3.4 Orientation and 1.4.10 Reflow)
  9. Use a screen reader (e.g. NVDA, which is free) and keyboard-only navigation to operate the page. Is anything confusing or inaccessible? Map it to the appropriate WCAG criterion. (This step requires some practice with NVDA to effectively model how full-time screen reader users navigate pages. See Further Reading below.)
  10. Manually study the page to fill in any gaps that need some human assessment (e.g. WCAG 1.3.5 Identify Input Purpose, 2.4.5 Multiple Ways, 2.5.x Input Modalities, 3.2.3 Consistent Navigation, 3.2.4 Consistent Identification, 3.3.x Input Assistance)
  11. If the page has multiple states (e.g. a dropdown expanded, date picker activated, content appearing on hover/focus, etc.), repeat the process with the page in each state.
  12. If unsure whether a pattern fails a particular criterion X, read the 'Understanding X' and 'How to Meet X' pages, linked from X in the WCAG 2.1 document. If still unsure, make a note and move on.
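Step 7 can also be done from the command line instead of the web form, using the standalone v.Nu checker jar (the same engine behind the Nu Html Checker). This is a sketch; the page URL and the location of vnu.jar are illustrative assumptions:

```shell
# Fetch the rendered page from the local dev server (example URL)
curl -s http://127.0.0.1:8000/workshops/admin-dashboard/ -o page.html

# Validate the saved page with the v.Nu checker
# (assumes vnu.jar has been downloaded into the current directory)
java -jar vnu.jar page.html
```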

Further Reading