Category Archives: Blog

Accessibility Metadata Project: Final Report

UPDATE (November 09, 2021)

This project is no longer maintained here. Its work has moved to the W3C, where it has been taken up by a Community Group. For more information, please visit the new Accessibility Discoverability Vocabulary for Schema.org Community Group.

To review or report issues with the Accessibility Properties for Discoverability Vocabulary, please refer to the vocabulary’s GitHub issue tracker.

Original Report (Outdated)

Submitted by Madeleine Rothberg, NCAM, WGBH

As the Accessibility Metadata Project funding with the Bill & Melinda Gates Foundation has concluded, we wish to thank them very much for their support of our work on the discoverability of accessible educational resources, and for their recognition of the importance of accessibility in furthering inclusive education.

Thanks to the Gates funding, we have made tremendous strides in our goals to develop standards for accessibility metadata and to have these standards accepted by schema.org, the organization that maintains the list of agreed-upon tags that all major search engines use in common, so that users of those search engines can refine their searches to find exactly what they are looking for. Now that the standard vocabulary for tagging the properties of online educational resources includes accessibility metadata, these properties have been picked up by the Internet Archive’s Open Library initiative, the HathiTrust Digital Library, and the Learning Registry, a leading metadata aggregation platform for online learning resources. We have also added accessibility metadata tags both to Bookshare and to the payload of metadata Bookshare submits to the Learning Registry; Bookshare now automatically submits accessibility metadata for its titles to the registry. Because of our reference implementation, Bookshare’s accessible content is more easily discoverable via online search, and others can better understand how to add accessibility metadata to their own content.

Beyond our grant commitment, we developed additional reference implementations and tools:

  • Searching for videos with closed captions: Before this project, it was not possible to search for captioned videos beyond the YouTube domain. Collaborating with the creator of the “WP YouTube Lyte” plug-in, we contributed code that lets WordPress site administrators automatically add accessibility properties to videos that have closed captions, enabling search based on those properties. Now people who need captioned videos can easily find them on any WordPress site that uses the plug-in, via tools such as Google’s Custom Search Engine.
  • Described video tagging: Smith-Kettlewell Eye Research Institute has developed a web-based video description product called YouDescribe that enables anyone to describe YouTube videos on the web. To help people with visual impairments easily discover videos described with the YouDescribe platform, Smith-Kettlewell is automatically tagging their videos with accessibility properties. Now search engines such as Google Custom Search Engine can index those properties.
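Both implementations above boil down to the same pattern: ordinary schema.org microdata on the page. A minimal sketch of what a tagged video embed might look like (the element structure, title, and video URL are illustrative, not taken from either product):

```html
<!-- Illustrative sketch: a video embed tagged as a schema.org VideoObject
     so that search tools can filter on its accessibility properties -->
<div itemscope itemtype="http://schema.org/VideoObject">
  <h2 itemprop="name">Intro to Algebra</h2>
  <!-- "captions" for closed-captioned video, "audioDescription" for described video -->
  <meta itemprop="accessibilityFeature" content="captions"/>
  <meta itemprop="accessibilityFeature" content="audioDescription"/>
  <iframe src="https://www.youtube.com/embed/VIDEO_ID"></iframe>
</div>
```

Because the properties live in invisible meta tags, adding them changes nothing about how the page looks to sighted visitors.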

The adoption of our project’s proposed set of accessibility metadata tags and the implementation successes I just listed are a tremendous milestone in the collaborative journey towards our vision of a “Born Accessible” world: a world in which all content born digital is made accessible—and discoverable—from the outset. Our implementations demonstrate that a broad adoption of accessibility metadata is possible.

Now that this groundwork has been laid, what is next? We must now encourage content management systems, publishers, search engines, and sites like Wikimedia to start using metadata in their sites, so that one day everyone will be able to find the great accessible content that is out there.  And, we have elements that we would still love to add – see the “properties under consideration” section of the specification. If you are interested in seeing this work progress, and would like to discuss your ideas for projects and next steps, please contact us by submitting a contact request in the project update sign-up form on this page.

WordPress “WP YouTube Lyte” Plug-in Now Supports Accessibility Properties

At Benetech, one of the Seven Truths we live by is, “partnership over going alone.” Our collaboration with Frank Goossens is a fantastic example of the effectiveness of that principle. Recently, Benetech created a patch for Frank’s very popular “WP YouTube Lyte” plug-in that enables WordPress site administrators to automatically add accessibility properties to videos that have closed captions. As Frank announced in his February 3rd blog post, that patch has now been incorporated into the plug-in. “If you have microdata enabled, WP YouTube Lyte now will automatically check if captions are available and if so, adds the accessibilityFeature property with value ‘captions’ to the HTML-embedded microdata.”  Translation: users of this plug-in for their WordPress sites can now make their captioned videos more easily discoverable by people who need them.

Compare Results: With and Without Accessibility Properties

To see what this looks like in practice, compare the following two search results:

  1. Run a search for closed-captioned videos on the Predictive Analytics Today website using Google’s closed-captions filter, which does not yet leverage accessibility properties. The result: “Your search – – did not match any video results.”
  2. Next, run the same search using Google Custom Search Engine (CSE), which does allow you to filter results by arbitrary properties, in this case the accessibilityFeature property. This time, the search correctly returns a link to the web site that includes the closed-captioned video you were looking for.

Try It Yourself: Captioned Video Search

If you would like to experiment with other Google CSE searches for closed-captioned videos, check out the Captioned Video Search link under the “Implementations” menu on this site. Be sure to always include the filter “more:p:videoobject-accessibilityfeature:captions” (without the quotes) in the search box.
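Programmatically, that filter is just a refinement token appended to the ordinary search terms. A small sketch (the helper function is our own illustration; the filter string itself is the one quoted above):

```python
# Sketch: compose a Google CSE search-box query restricted to captioned
# videos. The refinement label is taken from the article above; the
# helper function is illustrative, not part of any Google API.
CAPTIONS_FILTER = "more:p:videoobject-accessibilityfeature:captions"

def captioned_video_query(terms):
    """Append the captions refinement to ordinary search terms."""
    return f"{terms} {CAPTIONS_FILTER}"

print(captioned_video_query("predictive analytics"))
# prints "predictive analytics more:p:videoobject-accessibilityfeature:captions"
```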

Encourage Adoption of Accessibility Metadata

The good news is that, with an accessibility metadata standard in place and successful implementations like the WP YouTube Lyte plug-in enhancement, we know how to create a search function for accessible content, and we know that it works. The next step: encourage other content management systems, publishers, and sites like the Internet Archive and Wikimedia to start using metadata on their sites so that one day everyone will be able to find the great accessible content that is out there now but can’t yet be found by those who need it.

Pay It Forward: The Rewards of Collaboration

I encourage everyone to engage in collaborations with their favorite content and service providers. Said Frank in his blog post about the project, “This was the first time someone actively submitted code changes to add functionality to a project of mine, actually. Working with Benetech was a breeze and GitHub is a great platform to share, review and comment code. I for one am looking forward to more high-quality contributions like this one!”

We agree with you, Frank!

Learning Registry Blazes New Trails with Accessibility Metadata

As I announced earlier this month, schema.org recently adopted our proposal for accessibility metadata tagging, which will make it possible for anyone with access to the Internet and a search engine to more easily find accessible content and applications on the Web. (See “Schema.org Accepts Our Proposal!”, January 2014.) Exciting work is already underway that leverages the power such tagging provides. In the meantime, we are working on reducing the barriers to entry so that anyone can tag content with accessibility metadata.

Fostering universal adoption of accessibility metadata

In order for users to feel confident that everyone can really find all (or even most) accessible resources using a search engine, there are two hurdles to clear:

  1. All digital content and applications with accessibility features must be tagged with accessibility metadata whenever such features are present (e.g., image descriptions, tactile images, video captioning, support for screen readers, and the like).
  2. Major search engines, such as Google, Yahoo, Bing, and Yandex, and vertical search products, such as the Federal Registry for Educational Excellence, must support accessibility metadata and display the associated information to the user in search results.

This is a chicken-and-egg problem – there is no reason for search engines to support tagging if the tagging isn’t in the content and there is no reason for people to tag content if the search engines don’t support it. How to break that impasse? The first step is to have free, easy tools available that anyone can use to attach metadata to digital content.

EasyPublish: First publicly available tool to support accessibility metadata

The Learning Registry, a leading metadata aggregation platform for online learning resources, has recently enhanced its EasyPublish tool to allow anyone to tag any digital content with accessibility information. This marks the first time that a tool has been made freely available for anyone – including teachers, publishers, content creators, parents, or students – to take advantage of accessibility metadata tagging, which in turn will make it possible for the average person to easily discover accessible materials using online search engines. Jim Klo’s note to the developers’ Google group for this project includes a link to the sandbox area for this tool. There you can explore how it works and enter sample data without making your entry live. Or, if you have a resource that you would like to tag for real, you can visit the production site to register content and attach accessibility metadata tags to it. One exciting feature of the Learning Registry is that anyone can describe the accessibility of a resource, even if they are not the original publisher of that resource.
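Under the hood, the Learning Registry accepts JSON “resource data” envelopes, either from EasyPublish or from its publish API. The sketch below shows roughly what an inline payload carrying an accessibility property might look like; the envelope field names reflect our reading of the Learning Registry document format and should be verified against its publish API documentation, and the resource URL and feature values are illustrative.

```python
import json

# Sketch only: envelope field names are based on the Learning Registry
# "resource_data" document format as we understand it; verify against
# the Learning Registry publish API docs before using this for real.
def make_envelope(resource_url, accessibility_features):
    payload = {
        "url": resource_url,
        # schema.org accessibility property attached to the resource
        "accessibilityFeature": accessibility_features,
    }
    return {
        "doc_type": "resource_data",
        "resource_data_type": "metadata",
        "resource_locator": resource_url,
        "payload_placement": "inline",
        "resource_data": payload,
    }

envelope = make_envelope(
    "https://www.example.org/book/123",  # illustrative resource URL
    ["alternativeText", "longDescription"],
)
# A publish request wraps one or more envelopes in a "documents" list.
print(json.dumps({"documents": [envelope]}, indent=2))
```

Note that nothing in the envelope requires the publisher to own the resource, which is what lets third parties describe the accessibility of content they did not create.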

Bookshare automatically puts accessibility metadata into Learning Registry-powered sites

Anyone can put metadata into the Learning Registry for any piece of digital content by using the EasyPublish interface described above – either manually or automatically, using the API (Application Programming Interface) that the Learning Registry provides. Using the Learning Registry API, Bookshare now automatically submits accessibility metadata for Bookshare books into the Learning Registry. This lays the groundwork for users of Learning Registry-powered sites, such as the Federal Registry for Educational Excellence, to filter search results for Bookshare titles based on specific accessibility features, such as described images or MathML.

Smith-Kettlewell demonstrates described video tagging

Smith-Kettlewell Eye Research Institute has developed a web-based video description product called YouDescribe that enables anyone to describe YouTube videos on the web. To enable people with visual impairments to easily discover videos described with the YouDescribe platform, they have tagged their videos with accessibility properties. Now those properties can be indexed by search engines such as Google Custom Search Engine – for example, try a search for tutorials about description.

Spread the word

We continue to be in dialogue with organizations such as Google and schema.org about the importance of supporting accessibility metadata tagging, but we can use all the help we can get. If you feel strongly about the importance of this initiative, please let your favorite search engine, publishers, and websites know how important it is to you to be able to easily find accessible materials on the web, including captioned videos, described images, and more.

Erlend Øverby of ISO/IEC JTC 1/SC 36 sends support of Accessibility Metadata to schema.org

[Image: the letter from ISO/IEC, including logo and signature]

As work on the accessibility metadata vocabulary finalizes and our submission to schema.org receives more attention, notable individuals, companies, and standards groups are expressing their support. This paper letter from the chair of the technical committee for information technology for learning, education and training at ISO/IEC was sent to one of our working group members, Jutta Treviranus, to forward to the appropriate W3C mailing list. This blog post gives this information a place to be held and referenced in the email to the public vocabs list on behalf of Mr. Øverby. The letter follows.

Dear W3C members,

As Chair of ISO/IEC JTC 1/SC 36, I wish to express my strong support for the proposal to include accessibility metadata based on the AccessForAll or ISO/IEC 24751 standard within schema.org. This highly successful multi-part standard is currently undergoing its first revision. The metadata expressed in the standard fills a critical gap for resource discovery and identification by anyone requiring alternative access systems or alternative resource presentations. The fact that the proposal is based on an established international standard supports interoperability and avoids fragmentation. We look forward to working with W3C to ensure that the proposal is successful and that the proposed additions meet their important goal.

Erlend Øverby
Chair ISO/IEC JTC 1/SC 36

The scanned copy of this letter, as we received it, is available in PDF format.
The ISO/IEC 24751 standard can be found at the Fluid project website.

This letter can be seen in the W3C public vocabs list as well.

Bookshare tags over 195,000 titles with accessibility metadata

Bookshare, the world’s largest online library of accessible eBooks for people with print disabilities, has just added accessibility metadata to its full library of over 195,000 titles. “We’ve had the information on content type, age ranges and available media forms locked away in our database, only useful to the most skilled internet searchers or through the search via our website or API.

Now, with the draft accessibility metadata semantic tags added, powered by microdata and schema.org, we can make this available in a manner that search engines can more easily find this information,” said Gerardo Capiel, VP of Engineering at Benetech. For example, the proof-of-concept Google Custom Search query below for history textbooks can be constrained to titles whose images have been enhanced with rich descriptions by Bookshare volunteers:

[Screenshot: Google Custom Search Engine results]
You can do this yourself with two searches. The first searches Bookshare for any titles involving history; it returns over 7,000 hits. The second is constrained by the mediaFeature tags to just the titles that have alternative text for images or, better yet, long descriptions; it returns just 14 higher-value items.

It was easy to scale this metadata conversion across all of Bookshare’s titles. Because these are database-generated web pages, no editorial work was required on each individual page; the task was simply adding a few more tags or attributes around existing content. For example, for “World History: Ancient Civilizations” above, elements like the title went from

<h1>World History: Ancient Civilizations</h1>

to

<h1 itemprop="name">World History: Ancient Civilizations</h1>

Other tags, such as

<meta itemprop="accessMode" content="textual"/>
<meta itemprop="mediaFeature" content="structuralNavigation"/>
<meta itemprop="mediaFeature" content="alternativeText"/>

were also added to indicate the specific accessibility features of the book. More information on how to add accessibility metadata to your accessible content can be found on the resources page.
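For the itemprop attributes above to take effect, they must sit inside an itemscope container on the page. An assembled sketch (the Book itemtype is standard schema.org; the surrounding div is illustrative):

```html
<!-- Illustrative assembly of the snippets above inside one itemscope -->
<div itemscope itemtype="http://schema.org/Book">
  <h1 itemprop="name">World History: Ancient Civilizations</h1>
  <meta itemprop="accessMode" content="textual"/>
  <meta itemprop="mediaFeature" content="structuralNavigation"/>
  <meta itemprop="mediaFeature" content="alternativeText"/>
</div>
```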