I am a PhD candidate in Computer Science at the University of Washington advised by Richard Ladner. My research interests combine Accessibility, Human-Computer Interaction, and Applied Machine Learning. In my research, I take data-driven approaches to address accessibility problems, helping to make the world a more equitable place for people with disabilities.
My diverse past research projects have spanned data visualization, computational biology, computer music, applied mathematics, and network protocols.
SMARTFONTS & LIVEFONTS
By redesigning English letterforms, smartfonts and livefonts challenge our assumption that text should be rendered in traditional letterforms on modern screens. Personal devices have made this challenge possible by allowing users to adopt new character systems without language reform or mass adoption. Smartfonts can be installed and integrated into existing software systems, e.g., as font files, allowing individuals to change their text displays without impacting anybody else's reading experience. While smartfont designs leverage the color, shape, and spacing capabilities of modern fonts, livefonts add animation to the design space. Potential benefits include increased legibility, increased privacy, aesthetics, and fun. Try it out by downloading a smartfont and uploading it to the Font Changer Chrome browser extension, to render all your browser content in a smartfont!
ASL-SEARCH - (releasing soon)
ASL-Search is an American Sign Language (ASL) dictionary that lets students look up the English meanings of signs. Looking up the meaning of a sign is difficult because a sign is a 3D movement not easily described with written words. Our dictionary lets users describe a sign by entering a set of features, including handshape, location, and movement, and then look it up. The dictionary learns from the features that previous users enter to improve results for future users.
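To illustrate the idea, here is a minimal sketch of one way feature-based lookup with user feedback could work. The signs, feature names, counts, and scoring function below are all made up for illustration; they are not the actual ASL-Search implementation.

```python
from collections import Counter

# Hypothetical sign database: each sign maps to the features past users
# have reported for it, with counts (all values are illustrative).
sign_features = {
    "HELLO": Counter({"handshape:flat": 5, "location:forehead": 4, "movement:outward": 3}),
    "THANK-YOU": Counter({"handshape:flat": 4, "location:chin": 5, "movement:outward": 4}),
    "MOTHER": Counter({"handshape:open-5": 5, "location:chin": 4, "movement:tap": 3}),
}

def lookup(query_features):
    """Rank signs by overlap with the query, weighting each matching
    feature by how often past users reported it for that sign."""
    scores = {}
    for sign, counts in sign_features.items():
        total = sum(counts.values())
        scores[sign] = sum(counts[f] for f in query_features) / total
    return sorted(scores, key=scores.get, reverse=True)

def record_feedback(sign, query_features):
    """When a user confirms a match, fold their features back into the
    database so future lookups improve."""
    sign_features[sign].update(query_features)

results = lookup({"handshape:flat", "location:chin"})
```

Here the learning step is just incrementing feature counts; the real system aggregates crowd data far more carefully, but the sketch shows how user queries can double as training data.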
ASL-FLASH - (www.aslflash.org)
ASL-Flash is a site that both helps people learn American Sign Language and provides featural descriptions of signs. The site provides "flashcards" of signs, showing visitors videos of signs and quizzing them on the English meanings and compositional features (e.g., handshape and hand location). The data that users provide helps us build the ASL-Search dictionary. Check it out at www.aslflash.org and learn some signs!
LISTENING RATES
We provide the first inclusive, large-scale study of human listening rates. As conversational agents and digital assistants become increasingly pervasive, understanding their synthetic speech becomes increasingly important. Speech synthesis is also becoming more sophisticated, providing the opportunity to optimize speech rate to save users time. Our study ran on LabintheWild with volunteer participants and was fully accessible. Our results inform synthetic speech rate optimization and future inclusive crowdsourced studies.
SOUND DETECTOR
The sound detector is a trainable app that alerts users to sounds of interest (e.g., a door knock, appliance running, or alarm ringing). Sounds provide important information, and non-auditory cues are not always available. In these situations, a sound detector can be useful to deaf or hard-of-hearing people. Our mobile app design provides personalized sound awareness through a ubiquitous device. The user records examples of sounds, and the app notifies the user when they occur.
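The record-then-recognize loop above can be sketched as a simple nearest-neighbor matcher. This is an illustrative toy, not the app's actual recognizer: feature extraction from audio (e.g., spectral features) is assumed to happen upstream, and the 3-dimensional vectors and distance threshold are invented for the example.

```python
import math

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class SoundDetector:
    def __init__(self, threshold=1.0):
        self.examples = []        # (label, feature_vector) pairs recorded by the user
        self.threshold = threshold  # max distance to still count as a match

    def record_example(self, label, features):
        """User records an example of a sound of interest."""
        self.examples.append((label, features))

    def classify(self, features):
        """Return the label of the closest recorded example,
        or None if nothing is close enough."""
        if not self.examples:
            return None
        label, best = min(self.examples, key=lambda e: distance(e[1], features))
        return label if distance(best, features) <= self.threshold else None

detector = SoundDetector(threshold=0.5)
detector.record_example("door knock", [0.9, 0.1, 0.2])
detector.record_example("alarm", [0.1, 0.95, 0.8])
print(detector.classify([0.85, 0.15, 0.25]))  # matches "door knock"
```

The threshold keeps the detector from firing on unfamiliar sounds, which matters for an alerting app where false positives are costly.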
PUBLICATIONS (Google Scholar)
- D. Bragg, C. Bennett, K. Reinecke, R. Ladner. "A Large Inclusive Study of Human Listening Rates." Proc. CHI 2018. (paper, to appear)
- D. Bragg, S. Azenkot, K. Larson, A. Bessemans, A. Kalai. "Designing and Evaluating Livefonts." Proc. UIST. Quebec, Canada. October 2017. (paper) NOTE: Open with Adobe Reader to see animations in the PDF itself!
- D. Bragg, S. Azenkot, A. Kalai. "Reading and Learning Smartfonts." Proc. UIST. Tokyo, Japan. October 2016. (paper)
- D. Bragg, N. Huynh, R. Ladner. "A Personalizable Mobile Sound Detector App Design for Deaf and Hard-of-Hearing Users." Proc. ASSETS. Reno, NV. October 2016. (paper)
- D. Bragg, K. Rector, R. Ladner. "A User-Powered American Sign Language Dictionary." Proc. CSCW. Vancouver, Canada. March 2015. (paper)
- D. Bragg. “Synchronous Data Flow Modeling for DMIs.” Proc. NIME. Daejeon and Seoul, Republic of Korea. May 2013. (paper)
- M. Yun, D. Bragg, A. Arora, and H.A. Choi. “Battle Event Detection Using Sensor Networks and Distributed Query Processing.” Proc. Computer Communications Workshops (IEEE INFOCOM). Pages 750-755. Shanghai, China. April 2011. (paper)
- Y. Zhou, D. Bragg, M. Yun, and H.A. Choi. “On Data Transmission Scheduling considering Switching Penalty in Mobile Sensor Networks.” Proc. Computer Communications Workshops (IEEE INFOCOM). Pages 774-779. Shanghai, China. April 2011. (paper)
- D. Bragg, “Quantification and Display of Emotions in Music.” Honors Senior Thesis. Harvard University Department of Applied Mathematics. June 2010. (undergraduate thesis)
- Y. Yang, A. Chow, L. Golubchik, and D. Bragg. “Improving QoS in BitTorrent-like VoD Systems.” Proc. IEEE INFOCOM. Pages 1-9. San Diego, California. March 2010. (paper)