Danielle Bragg
Aug. 22, 2018, 15:00
MTA SZTAKI (Lágymányosi u. 11, Budapest) Room 306
Computer scientists and linguists have greatly improved our understanding of, and access to, spoken and written language: curating large datasets, analyzing language structure, developing machine learning methods, and designing powerful interfaces. However, many modern technologies still do not work well for people with disabilities, who number a billion worldwide (and nearly one in five in the US). For example, small text may exclude people with visual impairments, and text-based resources like search engines and text editors may not fully support people who use sign languages, which do not have a standard written form.
In this talk, I will present three systems that expand and enrich access to language: 1) Smartfonts and Livefonts, scripts that reimagine the alphabet's appearance to improve legibility and enrich text displays; 2) Animated Si5s, the first animated reading/writing system for American Sign Language (ASL); and 3) ASL-Search, an ASL dictionary trained on feature data crowdsourced from volunteer ASL students. These systems employ quantitative methods, using large-scale data collected through existing and novel crowdsourcing platforms to address data scarcity and explore design spaces. They also use human-centered approaches to better understand and address usability problems.
Danielle Bragg is a computer scientist and postdoctoral researcher at Microsoft Research. Her research focuses on building systems that improve access to information by leveraging modern computing capabilities in innovative ways. Her research interests combine human-computer interaction, accessibility, and applied machine learning. Her diverse past projects span computer music, data visualization, computational biology, applied mathematics, and network protocols. She holds a PhD and MS in Computer Science from the University of Washington, where she was advised by Richard Ladner, and an AB in Applied Mathematics from Harvard University.