Leah Perlmutter

About me

I am a PhD student studying Human-Robot Interaction at the University of Washington's Paul G. Allen School of Computer Science and Engineering. I started my PhD work in 2016. My advisor is Maya Cakmak and I work in the Human-Centered Robotics Lab. In 2017 I was awarded an NSF Graduate Research Fellowship.

I am passionate about building communities of women in computer science. In 2018 the UW Society of Women Engineers recognized me with the Outstanding Female Engineer Award for my work in organizing women's events for undergraduate and graduate students.

In 2020, I passed my Qualifying Examination, a major milestone towards my PhD, and earned my Master's degree in Computer Science and Engineering.

I am a proud member of UAW 4121, where I serve as a steward and volunteer organizer.

Teaching

TA for CSE 374: Intermediate Programming Concepts and Tools, University of Washington Allen School of Computer Science and Engineering, Autumn 2020

Instructor for CSE 331: Software Design and Implementation, University of Washington Allen School of Computer Science and Engineering, Summer 2018

TA for CSE 331: Software Design and Implementation, University of Washington Allen School of Computer Science and Engineering, Spring 2018

Lab TA, Colby College Computer Science Department, September 2011 - May 2012

Contact

leahperl AT uw DOT edu

Publications

Bindita Chaudhuri, Leah Perlmutter, Justin Petelka, Philip Garrison, James Fogarty, Jacob O. Wobbrock, and Richard E. Ladner. “GestureCalc: An Eyes-Free Calculator for Touch Screens.” In ASSETS 2019, 112–23. Pittsburgh, PA: ACM, 2019. doi:10.1145/3308561.3353783. [PDF] [Tutorials: Digits (visual), Operators (visual), Digits and Operators (text-based)]

Leah Perlmutter, Bindita Chaudhuri, Justin Petelka, Philip Garrison, James Fogarty, Jacob O. Wobbrock, and Richard E. Ladner. “Demonstration of GestureCalc: An Eyes-Free Calculator for Touch Screens.” In ASSETS 2019, 667–69. Pittsburgh, PA: ACM, 2019. doi:10.1145/3308561.3354595. [PDF] [Tutorials: Digits (visual), Operators (visual), Digits and Operators (text-based)]

Thomas Weng, Leah Perlmutter, Stefanos Nikolaidis, Siddhartha Srinivasa, and Maya Cakmak. “Robot Object Referencing through Legible Situated Projections.” In ICRA 2019, 8004–10. IEEE, 2019. doi:10.1109/ICRA.2019.8793638. [PDF]

Leah Perlmutter, Alex Fiannaca, Eric Kernfeld, Sahil Anand, Lindsey Arnold, and Maya Cakmak. “Automatic Adaptation of Online Language Lessons for Robot Tutoring.” In ICSR 2016. Kansas City, MO: Springer, 2016. [PDF]

Leah Perlmutter, Eric Kernfeld, and Maya Cakmak. “Situated Language Understanding with Human-like and Visualization-Based Transparency.” In Proceedings of Robotics: Science and Systems. Ann Arbor, MI, 2016. doi:10.15607/RSS.2016.XII.040. [PDF]

Daniel A. Lazewatsky, Bogo Giertler, Martha Witick, Leah Perlmutter, Bruce A. Maxwell, and William D. Smart. “Context-Aware Video Compression for Mobile Robots.” In IROS 2011, 4115–20. IEEE, 2011. doi:10.1109/IROS.2011.6094996. [HTML] [PDF]

Bruce A. Maxwell, Brian M. Leighton, and Leah R. Perlmutter. “A Responsive Vision System to Support Human-Robot Interaction.” Presented at the Humanoids Design Architecture and Human Robot Interaction Workshop, US-Korea Conference, 2009. [HTML] [PDF]

Research

EMAR

This project explores using social robots to help teens manage stress. My contribution involves using the EMAR robot to help teens develop their emotional clarity skills. More info is available on my collaborator's website.

Two high school girls and a researcher interacting with Robot EMAR that sits on the table in front of them in a high school classroom. Robot EMAR is a boxlike tabletop robot with two screens, one showing EMAR's face and the other showing a blue button with text too small to read. Students use computers in the background.

GestureCalc

An eyes-free, target-free touch screen calculator.

Publication: GestureCalc: An Eyes-Free Calculator for Touch Screens

Gesture Set Tutorials: Digits (visual), Operators (visual), Digits and Operators (text-based).

Below is a video demonstration of performing arithmetic calculations in GestureCalc. Input is entered using taps and swipes that can be performed at any location on the screen.

The next video demonstrates a typical touch screen calculator used with a screen reader. This is the baseline we compared GestureCalc against in our study.
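To make the tap-and-swipe input scheme concrete, here is a minimal sketch of location-independent stroke classification, assuming each stroke is reduced to its start and end points. This is an illustration in the spirit of GestureCalc, not the published implementation; the threshold and direction labels are hypothetical.

    import math

    # Illustrative sketch only, not the published GestureCalc code.
    # A stroke is reduced to its displacement, so where it begins on
    # the screen is deliberately ignored.

    SWIPE_THRESHOLD_PX = 40  # assumed cutoff: shorter strokes are taps

    def classify_stroke(start, end):
        """Map a touch stroke to 'tap' or a swipe direction."""
        dx = end[0] - start[0]
        dy = end[1] - start[1]
        if math.hypot(dx, dy) < SWIPE_THRESHOLD_PX:
            return "tap"
        # Screen y grows downward, so negate dy for a standard angle.
        angle = math.degrees(math.atan2(-dy, dx)) % 360
        # Snap to the nearest of eight compass directions.
        directions = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]
        return "swipe-" + directions[round(angle / 45) % 8]

    print(classify_stroke((100, 100), (102, 98)))   # -> tap
    print(classify_stroke((300, 400), (300, 300)))  # -> swipe-N

Because classification depends only on displacement, the same gesture means the same thing anywhere on the screen, which is what makes the interaction eyes-free and target-free.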

Transparency for Human-Robot Interaction

Publication: Situated Language Understanding with Human-like and Visualization-Based Transparency

Kubi (Language Teaching Robot)

Publication: Automatic Adaptation of Online Language Lessons for Robot Tutoring

Deictic Gesture Understanding

This project was an attempt to get robots to understand pointing (deictic) gestures. I worked on it for the better part of two years, but in the end I did not accomplish that goal, and I was not able to publish a manuscript about the dataset I collected. I've shared the dataset on my website as a reminder that failed research projects are really common! People just don't tend to talk about them much.

A Nao robot sits on a cart with a Kinect sensor above its head. A man points at a bookshelf while looking at the robot.
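For a sense of the technical problem, here is a minimal sketch of one common heuristic for deictic gesture understanding: cast a ray through two tracked arm joints (e.g., from a Kinect skeleton) and pick the candidate object that best aligns with it. The joint positions and object coordinates below are made up for illustration; this is not the approach or the data from my project.

    import numpy as np

    # Illustrative pointing-ray heuristic, not code from this project.
    # Cast a ray from elbow through wrist and pick the object whose
    # direction from the wrist has the largest cosine similarity.

    def pointed_object(elbow, wrist, objects):
        """Return the name of the object best aligned with the arm ray."""
        elbow = np.asarray(elbow, dtype=float)
        wrist = np.asarray(wrist, dtype=float)
        ray = wrist - elbow
        ray /= np.linalg.norm(ray)
        best_name, best_cos = None, -1.0
        for name, center in objects.items():
            to_obj = np.asarray(center, dtype=float) - wrist
            cos = float(np.dot(ray, to_obj / np.linalg.norm(to_obj)))
            if cos > best_cos:
                best_name, best_cos = name, cos
        return best_name

    # Hypothetical joint positions and object centers, in meters.
    objects = {"bookshelf": (2.0, 0.5, 1.2), "door": (-1.5, 0.0, 1.0)}
    print(pointed_object(elbow=(0.0, 0.0, 1.2),
                         wrist=(0.3, 0.1, 1.2),
                         objects=objects))  # -> bookshelf

In practice this simple geometry is fragile: joint tracking is noisy, people point with their head and gaze as much as their arm, and nearby objects are easily confused, which is part of why the problem is harder than it looks.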