In 2012, NCAM and IBM Research Tokyo launched a joint project to study the implementation and usefulness of text-to-speech audio descriptions: descriptions generated by computers rather than recorded by humans. The results of this research helped inform new tools and approaches for creating audio descriptions for online videos.
Where accessibility solutions already exist, NCAM can help you determine the best approach for implementing available techniques to make digital materials of all kinds as accessible as possible. But NCAM also has more than 25 years of experience in research and development of custom approaches to accessibility. Whether your accessibility needs involve websites, online assessments, custom media players, captions and audio descriptions, or accessibility in public spaces, NCAM can put those years of experience to work in creating a custom solution for you. Here is just a handful of the things we've invented:
A tool designed for broadcast-level use, ccWebCaster reuses existing closed captions (both real-time and post-production) for online streaming videos. ccWebCaster takes the captions from a video-broadcast signal at the time of air, then converts them into a format that can be used to accompany video that is streamed to a Web site or any video-player application. You can see ccWebCaster in action on WTJX's live-streaming video page.
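ccWebCaster's actual pipeline and formats are not described here, but the general idea of repurposing timed broadcast captions for web playback can be sketched as converting caption cues into WebVTT, a format web video players understand. The cue data below is hypothetical.

```python
# Illustrative sketch only: this is not ccWebCaster's implementation, just the
# general technique of turning timed caption cues (as might be captured from a
# broadcast signal) into a WebVTT document for a web video player.

def seconds_to_timestamp(seconds):
    """Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm)."""
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"

def cues_to_webvtt(cues):
    """Convert (start_seconds, end_seconds, text) tuples into a WebVTT string."""
    lines = ["WEBVTT", ""]
    for start, end, text in cues:
        lines.append(f"{seconds_to_timestamp(start)} --> {seconds_to_timestamp(end)}")
        lines.append(text)
        lines.append("")  # blank line separates cues
    return "\n".join(lines)

# Hypothetical cues captured from a live broadcast:
cues = [(0.0, 2.5, "Good evening."), (2.5, 6.0, "Tonight's top story...")]
print(cues_to_webvtt(cues))
```

A real-time workflow would emit cues continuously as they arrive from the broadcast signal rather than writing one finished file, but the per-cue conversion is the same.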
Media Access Mobile solves the problem of providing captions and audio descriptions to users in large public spaces. MAM provides synchronized text (captions, subtitles or audio descriptions) in any combination of languages, all delivered simultaneously over a WiFi network for display on handheld devices (phones or tablets). Rather than requiring users to watch captions or listen to audio descriptions on large monitors or televisions, which may be difficult to see and hear in public arenas, MAM puts these accessibility features in the palm of users' hands, on familiar devices.
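MAM's protocol is not described here, but the core idea of one synchronized timeline serving cues in several languages at once can be sketched as a shared cue store that each device queries for its own language at the current playhead time. The class, cue text, and language codes below are all hypothetical.

```python
# Illustrative sketch only: not MAM's implementation. It models a single
# synchronized timeline holding parallel caption tracks in several languages,
# where each handheld device looks up the current cue in its chosen language.

import bisect

class CaptionTimeline:
    """Timed cues per language; each track is a sorted list of (start, text)."""

    def __init__(self):
        self.tracks = {}  # language code -> sorted [(start_seconds, text), ...]

    def add_cue(self, language, start, text):
        bisect.insort(self.tracks.setdefault(language, []), (start, text))

    def cue_at(self, language, playhead):
        """Return the most recent cue at the given playhead time, or None."""
        track = self.tracks.get(language, [])
        # Find the last cue whose start time is <= playhead.
        i = bisect.bisect_right(track, (playhead, chr(0x10FFFF))) - 1
        return track[i][1] if i >= 0 else None

timeline = CaptionTimeline()
timeline.add_cue("en", 0.0, "Welcome to the show.")
timeline.add_cue("es", 0.0, "Bienvenidos al espectáculo.")
timeline.add_cue("en", 4.0, "Please silence your phones.")

# Two devices at the same playhead, each reading its own language track:
print(timeline.cue_at("en", 5.0))  # prints "Please silence your phones."
print(timeline.cue_at("es", 5.0))  # prints "Bienvenidos al espectáculo."
```

In a deployed system the timeline would live on a venue server and devices would fetch cues over WiFi; the synchronization logic, however it is actually implemented, reduces to this kind of shared-clock lookup.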
NCAM partnered with NASA’s Heliophysics Education Consortium and the Harvard-Smithsonian Center for Astrophysics on Eclipse Soundscapes, an accessible smartphone app for the “Great American Eclipse” of August 21, 2017. The Eclipse Soundscapes app included a real-time narration of the eclipse’s progression using specialized imagery-description techniques developed by NCAM. NCAM also conducted a review of the app to enhance its accessibility, including unique features such as a “rumble map,” which provides a haptic, audio, and visual tour of high-resolution photographs of various stages of a full eclipse.
Learn more about this project from WGBH News.
This comprehensive and unique resource can be used to train item writers on accessibility basics with examples and instructions. Focusing on item construction and the use of images, these guidelines will help reduce the need for additional work to make items accessible for students who use assistive technology. By addressing the decisions that test item writers face while they are actually creating the test, the NCAM Item Writer Guidelines explain how to create items that minimize the need for text alternatives. The Guidelines also provide examples of small changes that will help ensure clarity for students using assistive technology and, potentially, produce better items for all students.
"Our successes in implementing accessibility would not have been possible without the partnership and guidance of NCAM. They partnered with us to work through issues both technical and philosophical, with flexibility and sensitivity to our organizational desires and limitations."
Bryan Goodwin, Analytics and Process Manager, Content Solutions, NWEA