Expert Answers to 5 Common Digital Accessibility Questions

[Image: icon of a checklist of accessibility features, including sound, visuals, and closed captioning]

Across Canada, dedicated public servants are working hard to create digital public services that meet the needs of the public.

But when it comes to ensuring services are accessible to people with disabilities, those same public servants often struggle with a lack of resources, training and tools. 

Last month, Code for Canada brought together three experts for a conversation about what governments can do to create truly accessible services. Attendees heard from: 

The hour-long conversation only scratched the surface of this important topic, so we followed up with them to get their answers to questions posed by public servants across Canada.

To what extent should automated accessibility tools be used versus manual checking? Is there a best practice on this?

Sheldon Bauld: Automated tools should be used as much as possible — ideally on every release, as part of your QA process alongside everything else you’d typically check. Manual checking should also be done as much as possible, but a good place to start is with major releases. If a significant change to the experience is ready for launch, it’s worth checking it manually with assistive technologies — even better, with real end users.

Scott Herron: Automated testing is one of the tools you can use, but it should never be used on its own, as it can’t identify every accessibility violation or error. If you’re extremely lucky, a tool may pick out between 30% and 50% of the errors. This is the time to advocate for user testing with all different groups of humans, to ensure you include people with disabilities as part of your process to build a great experience.

Pina D'Intino: Automated testing can be the first step in accessibility testing, especially if you are reviewing many webpages and looking to address compliance with standards such as WCAG 2.1. However, automated testing should never be used in isolation. When you are developing websites, automated testing with some of the more common tools, such as Axe Pro, A-Checker, Site Improve and many others, can help identify defects early on.

Test webpages with automated scanning tools combined with the use of assistive technology such as screen readers, and complete your testing with people with different lived experiences or disabilities. This is the best way to ensure that you end up with a website that meets the needs of the broadest number of users.
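To illustrate the limits of automation the experts describe, here is a minimal, hypothetical sketch of the kind of rule an automated scanner runs — flagging `<img>` elements with no `alt` attribute. It uses only Python’s standard library and is not one of the tools named above; real scanners test hundreds of rules, and even they catch only a fraction of defects.

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Illustrative automated check: flag <img> tags missing an alt attribute."""

    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            if "alt" not in attr_names:
                self.violations.append(self.getpos())


def find_missing_alt(html: str) -> list:
    """Return positions of <img> tags that have no alt attribute at all."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations


page = '<p>Quarterly report</p><img src="chart.png"><img src="logo.png" alt="Logo">'
print(find_missing_alt(page))  # flags only the chart image
```

A rule like this reliably catches the structural miss, but it cannot tell you whether `alt="Logo"` actually makes sense to a screen reader user in context — that judgement is exactly what manual testing with assistive technology and people with lived experience provides.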

I struggle with stakeholders who want to be WCAG-compliant but deprioritize or eliminate very necessary fixes as "just usability." How do you change their minds?

Sheldon Bauld: You don’t. I would encourage the designers and developers involved to find ways to sneak the work in. Incorporate it into quality assurance processes. Include it in your definition of done if you’re working Agile. Find ways to simply make it part of the work.

Some people never come around. It also helps to find anyone internally who can champion corporate policy around accessibility. In my experience, teams that get their first usability test in usually don’t have to fight that hard afterwards. The key is breaking through with that first test and sharing the experience and results as broadly as possible.

Scott Herron: This is always a challenge, because organizations are at different points in their accessibility maturity. Validating your user flow is absolutely critical to show that gaps exist in the experience. There are a number of ways to share this — if you have disabled folks available, have them review the proposed experience and share video clips, or present the gaps during a lunch and learn.

You may also want to remind leaders that a miss now only creates more work later, once the deprioritized piece of work is missing from your build. If the gap is truly large — for example, a poorly built form or payment option — it can lead to complaints and brand harm.

Pina D'Intino: Remind them that automated tools only pick up 35% to 50% of defects. Would they leave 50% of their users unable to use or purchase their products? Accessibility testing is part of usability. Something can be usable but not accessible, and something accessible may not be usable.

Lastly, remediation can be very costly — estimates range from 20% to 80% of the original development cost. These are just some ways to get leadership on board. Another good practice is to let them watch someone with a disability use their website; it is an eye-opening experience for many. Leverage your ERGs (employee resource groups) to help if possible.

What accessibility-oriented research, design or other activities do you think should be done before a product gets to the usability testing stage? 

Sheldon Bauld: Usability testing is the research method I recommend most for any project. I recommend doing it at least once in one of these phases of development:

Scott Herron: Ideally, you should incorporate this fully into your processes, both from a product development and software development standpoint, in your daily work. Consider having a person with a disability partner with both design and engineering to ensure all accessibility and usability needs are met prior to any release. The same applies to research: I highly recommend including a cross-section of people with different disabilities in all user research, to ensure you obtain diverse perspectives before completing any user flows.

When recruiting for usability testing, should we be including as many accessibility groups as possible, or should we be focusing on individual areas in one study?

Sheldon Bauld: Ideally, the users recruited for usability tests cover the broadest spectrum of potential users of your product: age, gender, relevant physical and cognitive limitations, assistive technology experience, location, and experience with the problem. Popular opinion in usability circles suggests that 5–7 users will uncover 70% of usability issues. In my experience, this holds true as long as a) you’re not trying to test too many things at once, and b) the segmentation of users is done with a fair amount of rigour.

Pina D'Intino: It would be impossible to recruit for all disabilities. Frequently, testers will focus on screen reader users, people with low vision, and people with cognitive or learning disabilities. I would argue, however, that your priority group of testers depends on your product and who it is intended for. I would also recommend varying your testers if possible, because many people experience disability differently.

How has AI impacted your personal experiences with assistive technology? How do you envision its role evolving to further enhance accessibility in digital design?

Scott Herron: To be honest, AI is more of a hindrance at this point, with so many modals and popups interfering with my day-to-day activity. The current builds are quite intrusive, and many users struggle with the cognitive load and frustration. A couple of examples are the new Google Gemini release and, of course, good old Windows Cortana, both of which seem to slow down the experience. As for future growth, this will be interesting to watch: AI is not a high focus in the accessibility space currently, though that can change at any moment.

Pina D'Intino: AI can be both a good thing and a bad one. It is yet another tool to have in your toolkit: it can improve efficiency, reduce rework and identify fixes. But given that the algorithms used to train AI can be biased, it must be used with caution. Every disability is experienced differently, so replicating those experiences in AI models built on repeatable patterns is something we need to be cautious about. I often use the example of spell check: it may come back with no errors, yet it can easily confuse words that look similar but have very different meanings or interpretations.

Interested in learning more?

Our Inclusive User Research service connects UX and development teams with diverse communities across Canada to test public digital services. Interested in learning how we can help? Get in touch!