
Danielle Bragg

PhD Student
Computer Science & Engineering
University of Washington
dkbragg [at] cs.washington.edu


ABOUT ME


I am a PhD candidate in Computer Science at the University of Washington, advised by Richard Ladner. My research interests combine Accessibility, Human-Computer Interaction, and Applied Machine Learning. I take data-driven approaches to accessibility problems, helping to make the world a more equitable place for people with disabilities.

My past research projects have spanned data visualization, computational biology, computer music, applied mathematics, and network protocols.

EXPERIENCE

Current PhD Student, UW CSE
2015, 2016 Intern, Microsoft Research New England
2014 Intern, Microsoft Bing
2011-2012 PhD Student, Princeton
2010-2011 Research Assistant, George Washington University
2010 AB, Applied Mathematics, cum laude, Harvard

For more details, please see my CV.

RESEARCH


My thesis work makes visual language more available to sign language users and low-vision readers, allowing marginalized groups to "plug into" existing communication infrastructure. As interactive technologies become increasingly rich, multi-modal, and pervasive, ensuring accessibility becomes increasingly important and challenging. At the same time, new technologies offer new capabilities that can improve accessibility, which I capitalize on in my work. In particular, I make use of crowdsourced perceptual data to power solutions to accessibility challenges.

[Image: the word "livefonts" in several scripts, including an animated script]

SMARTFONTS & LIVEFONTS

By redesigning English letterforms, smartfonts and livefonts challenge the assumption that text on modern screens should be rendered in traditional letterforms. Personal devices make this possible by allowing individual users to adopt new character systems without language reform or mass adoption. Smartfonts can be installed and integrated into existing software systems, e.g., as font files, letting individuals change their text displays without affecting anybody else's reading experience. While smartfont designs leverage the color, shape, and spacing capabilities of modern fonts, livefonts add animation to the design space. Potential benefits include increased legibility, increased privacy, aesthetics, and fun. Try it out: download a smartfont and upload it to the Font Changer Chrome browser extension to render all your browser content in a smartfont!
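The core idea can be sketched in a few lines: familiar text, unfamiliar letterforms, and nothing about the underlying language changes. A real smartfont ships as a font file that swaps glyph shapes at render time; the toy below simulates the effect with a per-letter substitution table whose glyph choices are arbitrary placeholders, not an actual smartfont design.

```python
# Toy sketch of the smartfont idea: the same English text, displayed in an
# alternate character system. The glyphs below are arbitrary stand-ins; a
# real smartfont redesigns glyph shapes inside a font file instead.
SMARTFONT = {
    'a': 'ᗩ', 'e': 'Ǝ', 'i': 'ᛁ', 'o': 'Ø', 'u': 'ᑌ',
}

def render(text: str) -> str:
    """Map each letter through the substitution table; pass others through."""
    return ''.join(SMARTFONT.get(ch, ch) for ch in text.lower())

def decode(text: str) -> str:
    """Invert the mapping, recovering the original English letters."""
    inverse = {v: k for k, v in SMARTFONT.items()}
    return ''.join(inverse.get(ch, ch) for ch in text)

print(render("audio"))  # → ᗩᑌdᛁØ
```

Because the mapping is invertible, the "smartfont" text carries exactly the information of the original: only the reader's display changes, which is why adoption can be individual rather than society-wide.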
[Image: workflow of ASL-Search]

ASL-SEARCH - (coming soon)

ASL-Search is an American Sign Language (ASL) dictionary that lets students look up the English meanings of signs. Looking up the meaning of a sign is difficult because a sign is a 3D movement that is not easily described in written words. Our dictionary lets users describe a sign by entering a set of features, including handshape, location, and movement, and then look it up. The dictionary learns from the features that previous users enter to improve results for future users.
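A minimal sketch of this kind of feature-based, user-powered lookup follows. The sign entries, feature names, and scoring rule are made up for illustration; the actual ASL-Search features and ranking are not specified here. The idea is simply that signs accumulate crowdsourced feature tallies, and a query is ranked against those tallies.

```python
from collections import Counter

# Hypothetical crowdsourced tallies: how many past users described each
# sign with each feature. Signs and feature names are illustrative only.
feature_counts = {
    "HELLO":     Counter({"handshape:flat": 9, "location:forehead": 8, "movement:outward": 7}),
    "THANK-YOU": Counter({"handshape:flat": 6, "location:chin": 9, "movement:outward": 8}),
    "MOTHER":    Counter({"handshape:open-5": 7, "location:chin": 8, "movement:tap": 6}),
}

def record(sign, features):
    """Learning step: fold one user's feature description into the tallies."""
    feature_counts.setdefault(sign, Counter()).update(features)

def lookup(query_features, k=3):
    """Rank signs by the crowdsourced weight of the queried features."""
    scores = {
        sign: sum(counts[f] for f in query_features)
        for sign, counts in feature_counts.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(lookup({"handshape:flat", "location:chin"}))  # → ['THANK-YOU', 'HELLO', 'MOTHER']
```

Each `record` call makes future rankings reflect how real users actually describe signs, which is the "learns from previous users" loop described above.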
[Image: workflow of ASL-Search]

ASL-FLASH - (www.aslflash.org)

ASL-Flash is a site that both helps people learn American Sign Language and collects featural descriptions of signs. The site provides "flashcards" of signs: visitors watch a video of a sign and are quizzed on its English meaning and compositional features (e.g., handshape and hand location). The data that users provide helps us build the ASL-Search dictionary. Check it out at www.aslflash.org and learn some signs!
[Image: screenshot of the study, with the words 'Press play when you are ready' and a large play button]

LISTENING RATES

We provide the first large-scale, inclusive study of human listening rates. As conversational agents and digital assistants become increasingly pervasive, understanding their synthetic speech becomes increasingly important. Speech synthesis is also becoming more sophisticated, providing the opportunity to optimize speech rate to save users time. Our study ran on LabintheWild with volunteer participants and was fully accessible. Our results inform synthetic speech rate optimization and the design of future inclusive crowdsourced studies.
[Image: screenshot of the sound detector app, listing sounds of interest to the user]

SOUND DETECTOR

The sound detector is a trainable app that alerts users to sounds of interest (e.g., a knock at the door, an appliance running, or an alarm ringing). Sounds provide important information, but non-auditory cues are not always available, so a sound detector can be useful to people who are deaf or hard of hearing. Our mobile app design provides personalized sound awareness through a ubiquitous device: the user records examples of sounds, and the app notifies the user when they occur.
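The record-then-notify loop can be sketched as a nearest-neighbor match against the user's own recordings. The fixed-length "feature vectors" and the distance threshold below are placeholders; a real detector would extract richer audio features and likely use a trained classifier rather than raw nearest-neighbor matching.

```python
import math

# Sketch of personalized sound detection: match incoming audio features
# against user-recorded examples. Feature vectors here are stand-ins for
# real audio features (e.g., spectral summaries).
examples = {}  # sound name -> list of feature vectors from user recordings

def train(name, feature_vector):
    """Store one user-recorded example of a sound of interest."""
    examples.setdefault(name, []).append(feature_vector)

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def detect(feature_vector, threshold=1.0):
    """Return the best-matching trained sound, or None if nothing is close
    enough to trigger a notification."""
    best = min(
        ((distance(feature_vector, v), name)
         for name, vecs in examples.items() for v in vecs),
        default=(float("inf"), None),
    )
    return best[1] if best[0] <= threshold else None

train("door knock", [0.9, 0.1, 0.2])
train("alarm", [0.1, 0.9, 0.8])
print(detect([0.85, 0.15, 0.25]))  # → door knock
```

Because the examples come from the user's own environment, the matching is personalized: the app recognizes *this* doorbell and *this* appliance, rather than generic sound classes.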

HONORS


PUBLICATIONS (Google Scholar)


  1. D. Bragg, C. Bennett, K. Reinecke, R. Ladner. "A Large Inclusive Study of Human Listening Rates." Proc. CHI 2018. (paper, to appear)
  2. D. Bragg, S. Azenkot, K. Larson, A. Bessemans, A. Kalai. "Designing and Evaluating Livefonts." Proc. UIST. Quebec, Canada. October 2017. (paper; open with Adobe Reader to see the animations in the PDF itself!)
  3. D. Bragg, S. Azenkot, A. Kalai. "Reading and Learning Smartfonts." Proc. UIST. Tokyo, Japan. October 2016. (paper)
  4. D. Bragg, N. Huynh, R. Ladner. "A Personalizable Mobile Sound Detector App Design for Deaf and Hard-of-Hearing Users." Proc. ASSETS. Reno, NV. October 2016. (paper)
  5. D. Bragg, K. Rector, R. Ladner. "A User-Powered American Sign Language Dictionary." Proc. CSCW. Vancouver, Canada. March 2015. (paper)
  6. D. Bragg. "Synchronous Data Flow Modeling for DMIs." Proc. NIME. Daejeon and Seoul, Republic of Korea. May 2013. (paper)
  7. M. Yun, D. Bragg, A. Arora, H.A. Choi. "Battle Event Detection Using Sensor Networks and Distributed Query Processing." Proc. Computer Communications Workshops (IEEE INFOCOM). Pages 750-755. Shanghai, China. April 2011. (paper)
  8. Y. Zhou, D. Bragg, M. Yun, H.A. Choi. "On Data Transmission Scheduling Considering Switching Penalty in Mobile Sensor Networks." Proc. Computer Communications Workshops (IEEE INFOCOM). Pages 774-779. Shanghai, China. April 2011. (paper)
  9. D. Bragg. "Quantification and Display of Emotions in Music." Honors Senior Thesis. Harvard University Department of Applied Mathematics. June 2010. (undergraduate thesis)
  10. Y. Yang, A. Chow, L. Golubchik, D. Bragg. "Improving QoS in BitTorrent-like VoD Systems." Proc. IEEE INFOCOM. Pages 1-9. San Diego, California. March 2010. (paper)

CONTACT DANIELLE

Click to email me.