Editor’s Note: Justin Alexander is the Web Accessibility Lead at ITHAKA, where he focuses on improving the experience for all users. He came to ITHAKA with a background in both user experience and accessibility, having previously worked to advance accessibility within Michigan State University’s IT department.

Roughly one in five adults in the U.S. reports having a disability, and on the whole, people with disabilities face greater challenges using digital media. In the scholarly publishing sector, ensuring that platforms and systems are accessible to people with disabilities is of paramount importance. And yet, creating a platform that everyone can use can be difficult. At ITHAKA, we are responsible for the accessibility of a number of services, including JSTOR, a widely used platform for research and teaching. Our experience suggests that conducting a major annual accessibility audit, the approach taken by many organizations, may not be the best method. Instead, we have developed a different solution that has proven both more efficient and more effective.


ITHAKA is a not-for-profit with a mission to improve access to knowledge and education for people around the world. Our largest platform, JSTOR, provides access to more than 12 million academic journal articles, books, and primary sources in 75 disciplines, with over 190 million visits last year. We take accessibility seriously as a core part of our mission. We also try to learn continuously and improve processes whenever possible.

Like many other organizations, our original process was to run a cross-team accessibility audit once a year. Multiple engineering teams reviewed our JSTOR platform against the Web Content Accessibility Guidelines (WCAG). For those unfamiliar with these guidelines, one example requirement is that our platform must be fully operable using only a keyboard, so that every user action and workflow works as expected without a mouse.

We conducted these annual audits with an Accessibility team of two people with backgrounds in user experience and accessibility, supporting an engineering organization of more than 50 members. After the Accessibility team demonstrated how to test against particular guidelines, the review took several weeks, during which issues were identified. Engineering team members would then make any necessary adjustments to improve platform accessibility.

This process had several drawbacks:

  1. The method is not agile. It took multiple months to complete, and work on other priorities stalled so that these important issues could be addressed. Working in an agile environment, we want our work to keep progressing, not to jump from priority to priority.
  2. The method allows information about JSTOR’s accessibility to go out of date. Changes to our platform are made continuously throughout the year; if updates are not checked for accessibility, violations can arise between audits.
  3. Uncertainty about the standards. It takes time to learn a new topic, and that learning has to happen anew when the work occurs only once a year. In our environment, this means training 50+ engineering team members in a few sessions, which understandably leaves some outstanding questions and uncertainties.
  4. Some parts of the site are used more than others. As with every platform, pages vary in both usage and update frequency, so an accessibility issue on a high-traffic page hinders far more users. Updates to those pages also need to be accessible to ensure an exceptional experience for all. Reviewing all pages at the same frequency leaves a window for issues to arise on the pages that matter most.

With these lessons in mind, our Accessibility team set to work optimizing the audit process. Gathering feedback from audit participants, the team began to iterate. Our first thought was simply to run similar full-scale reviews of JSTOR more often. That would catch problems sooner, but it would not address all of the gaps we identified.

Our solution instead is an ongoing process, which we call a continuous systems check (CSC). The CSC consists of small audits done much more frequently — twice a week. Small, frequent reviews have been a large improvement over our initial process. We now use usage data to weight pages by how heavily they are visited. All pages are still audited at least once a year, with more heavily trafficked pages reviewed twice a year or even quarterly.
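The tiered cadence described above can be sketched in a few lines of code. The thresholds, page paths, and visit counts below are purely hypothetical, invented to illustrate the idea of mapping usage data to a review frequency; they are not ITHAKA's actual figures.

```python
def review_frequency(annual_visits: int) -> str:
    """Map a page's annual visit count to a review cadence.

    Thresholds are illustrative: every page is audited at least
    once a year, and heavier traffic earns a faster cadence.
    """
    if annual_visits >= 1_000_000:
        return "quarterly"
    if annual_visits >= 100_000:
        return "twice a year"
    return "yearly"


# Hypothetical pages and visit counts, for demonstration only.
pages = {
    "/search": 5_000_000,
    "/stable/article": 250_000,
    "/about": 40_000,
}

schedule = {path: review_frequency(visits) for path, visits in pages.items()}
print(schedule)
```

A scheme like this keeps the weighting explicit and auditable: when traffic patterns shift, rerunning it over fresh usage data regenerates the review calendar.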

The CSC solves all four previous issues:

  1. Teams receive smaller sets of issues to remediate more often. Fixes are incorporated into regular engineering “sprints,” which helps teams balance accessibility improvements against other priorities and make progress on all of their work. It also lets us focus on working with teams to ensure updates are accessible when they reach the platform; we have had success helping teams understand accessibility topics and incorporate them earlier in the development process.
  2. Reviewing most pages more often makes our information more accurate. Legal documents, such as our Voluntary Product Accessibility Template (VPAT), can be updated more frequently. With current information, there is a more transparent understanding of JSTOR’s accessibility, both internally and externally.
  3. The Accessibility team now handles the whole auditing process, which keeps our audits accurate and consistent. With teams continuously improving our platforms, we can build accessibility into the entire development process rather than treating it as a final check.
  4. Our most viewed pages are reviewed most frequently, so the pages our users rely on most are checked most often and remain accessible.

ITHAKA launched the CSC at the beginning of 2021. Over the course of the year, we avoided building up “technical debt” associated with accessibility, incorporated accessibility into our regular processes, and as a result we improved both the efficiency and the outcomes of our accessibility work. Additionally, we updated our VPAT multiple times and remediated issues throughout the year, rather than just once.

Our CSC process improved upon our old process and also helped to advance awareness of accessibility within our organization. Implementing a process that is easier to manage has allowed us to act in advance of new deployments, ensuring platform updates are accessible when released. We believe this model is unusual and may benefit others in creating products that are accessible to all. It is more efficient and more manageable, and it facilitates ongoing accessibility improvements. It shifts the focus of our day-to-day work to partnering with engineering teams to build accessibility in as we develop the product, rather than facilitating fixes after the fact. And the greatest benefit, of course, goes to the users we are working to serve.

In the spirit of sharing, we would love to go more in depth or answer any questions others may have. Please feel free to reach out to our Accessibility team at accessibility@ithaka.org.

Justin Alexander



2 Thoughts on "Guest Post — Shifting Away from Yearly Accessibility Audits: How Can a Better System be Implemented?"

This is a really thought-provoking piece – thanks for sharing. It also addresses the point that accessibility all too often gets pushed aside because of other priorities. I like the innovative approach you are taking and the honesty with which you are addressing some of the challenges. The ‘not agile’ piece particularly resonated with me.

Thank you for your response, Simon. I appreciate your input on the post as well as our new process. Please let me know if you would like to discuss further.

