Leah Perlmutter (she/her)

About me

I am a PhD student studying Human-Robot Interaction at the University of Washington's Paul G. Allen School of Computer Science and Engineering. I am currently on the job market looking for a one-year teaching faculty or teaching postdoc role beginning in autumn 2022. Next year I will be searching for a permanent teaching role in conjunction with my partner's faculty job search.

I started my PhD work in 2016. My advisor is Maya Cakmak and I work in the Human-Centered Robotics Lab. In 2017 I was awarded an NSF Graduate Research Fellowship.

I am passionate about building communities of people of diverse genders in computer science. In 2018 the UW Society of Women Engineers recognized me with the Outstanding Female Engineer Award for my work in organizing women's events for undergraduate and graduate students.

In 2020, I passed my Qualifying Examination, a major milestone towards my PhD, and earned my Master's degree in Computer Science and Engineering.

In 2021, I was accepted into Cohort 2 of the Cultural Competence in Computing (3C) Fellows Program, run by Dr. Nicki Washington and others in the Duke University Identity in Computing Lab. 3C is a two-year program in which I am learning how to address systemic racism in undergraduate CS education during my future career as a teaching professor.

I am a proud member of UAW 4121, where I serve as an elected steward and volunteer organizer.

Teaching

TA for CSE 374: Intermediate Programming Concepts and Tools, University of Washington Allen School of Computer Science and Engineering. Autumn 2021 (Kasey Champion), Spring 2021 (Megan Hazen), Autumn 2020 (Kasey Champion)

Instructor for CSE 331: Software Design and Implementation, University of Washington Allen School of Computer Science and Engineering. Summer 2018

TA for CSE 331: Software Design and Implementation, University of Washington Allen School of Computer Science and Engineering. Summer 2021 (Ardi Madadi), Spring 2018 (Zach Tatlock)

TA for CSE 143: Introduction to Computer Programming II (CS 1.5), University of Washington Allen School of Computer Science and Engineering. Winter 2021 (Brett Wortzman)

Lab TA, Colby Computer Science Department, September 2011 - May 2012

Contact

leahperl AT uw DOT edu

Publications

Bindita Chaudhuri, Leah Perlmutter, Justin Petelka, Philip Garrison, James Fogarty, Jacob O. Wobbrock, and Richard E. Ladner. “GestureCalc: An Eyes-Free Calculator for Touch Screens.” In ASSETS 2019, 112–23. Pittsburgh, PA: ACM, 2019. doi:10.1145/3308561.3353783. [PDF] [Tutorials: Digits (visual), Operators (visual), Digits and Operators (text-based)]

Leah Perlmutter, Bindita Chaudhuri, Justin Petelka, Philip Garrison, James Fogarty, Jacob O. Wobbrock, and Richard E. Ladner. “Demonstration of GestureCalc: An Eyes-Free Calculator for Touch Screens.” In ASSETS 2019, 667–69. Pittsburgh, PA: ACM, 2019. doi:10.1145/3308561.3354595. [PDF] [Tutorials: Digits (visual), Operators (visual), Digits and Operators (text-based)]

Thomas Weng, Leah Perlmutter, Stefanos Nikolaidis, Siddhartha Srinivasa, and Maya Cakmak. “Robot Object Referencing through Legible Situated Projections.” In ICRA 2019, 8004–10. doi:10.1109/ICRA.2019.8793638. [PDF]

Leah Perlmutter, Alex Fiannaca, Eric Kernfeld, Sahil Anand, Lindsey Arnold, and Maya Cakmak. “Automatic Adaptation of Online Language Lessons for Robot Tutoring.” In ICSR 2016. Kansas City: Springer, 2016. [PDF]

Leah Perlmutter, Eric Kernfeld, and Maya Cakmak. “Situated Language Understanding with Human-like and Visualization-Based Transparency.” In Proceedings of Robotics: Science and Systems. Ann Arbor, Michigan, 2016. doi:10.15607/RSS.2016.XII.040. [PDF]

Daniel A. Lazewatsky, Bogo Giertler, Martha Witick, Leah Perlmutter, Bruce A. Maxwell, and William D. Smart. “Context-Aware Video Compression for Mobile Robots.” In IROS 2011, 4115–20. doi:10.1109/IROS.2011.6094996. [HTML] [PDF]

Bruce A. Maxwell, Brian M. Leighton, and Leah R. Perlmutter. “A Responsive Vision System to Support Human-Robot Interaction.” Presented at the Humanoids Design Architecture and Human Robot Interaction Workshop, US-Korea Conference, 2009. [HTML] [PDF]

Research

Computer Science Education

I qualitatively study the impact of resubmission opportunities on students taking introductory computer science courses at UW. I see resubmission opportunities as a way to focus on what students learn rather than on the mistakes they make while learning. My collaborators and I presented a poster at the 2021 UW Teaching and Learning Symposium. If you want to know more about my work on resubmissions, ask to see my (as yet) unpublished manuscript!

I am also interested in justice-centered approaches to teaching.

[Image: text reading "CSE 143"]

EMAR

This project is about using social robots to help teens manage stress. My contribution involves using the EMAR robot to help teens develop their emotional clarity skills. More information is available on my collaborator's website.

[Photo: Robot EMAR is a boxlike tabletop robot with two screens; the top one shows EMAR's face and the bottom one shows a blue button. Two high school girls and a researcher interact with EMAR, which sits on a table in front of them in a high school classroom.]

GestureCalc

An eyes-free, target-free touch screen calculator.

Publication: GestureCalc: An Eyes-Free Calculator for Touch Screens

Gesture Set Tutorials: Digits (visual), Operators (visual), Digits and Operators (text-based).

Below is a video demonstration of performing arithmetic calculations in GestureCalc. Input is entered with taps and swipes that can be performed anywhere on the screen.
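For readers curious about the mechanics, here is a minimal sketch, in TypeScript, of how a target-free input scheme like this can distinguish a tap from a directional swipe. The names and the threshold are illustrative assumptions of mine, not code from the published GestureCalc system.

    // Hypothetical sketch of location-independent touch classification.
    // A gesture counts as a tap if the finger barely moves; otherwise it
    // is a swipe whose direction is the dominant axis of motion.

    type Point = { x: number; y: number };
    type Gesture =
      | { kind: "tap" }
      | { kind: "swipe"; direction: "up" | "down" | "left" | "right" };

    const SWIPE_THRESHOLD_PX = 30; // assumed cutoff between tap and swipe

    function classify(start: Point, end: Point): Gesture {
      const dx = end.x - start.x;
      const dy = end.y - start.y;
      if (Math.hypot(dx, dy) < SWIPE_THRESHOLD_PX) {
        return { kind: "tap" };
      }
      if (Math.abs(dx) > Math.abs(dy)) {
        return { kind: "swipe", direction: dx > 0 ? "right" : "left" };
      }
      return { kind: "swipe", direction: dy > 0 ? "down" : "up" };
    }

    // A 50 px rightward drag is a swipe no matter where it starts;
    // a 5 px wiggle is a tap.
    console.log(classify({ x: 10, y: 10 }, { x: 60, y: 12 }));     // swipe right
    console.log(classify({ x: 200, y: 400 }, { x: 205, y: 398 })); // tap

Because classification depends only on relative motion, the user never has to locate an on-screen target, which is what makes this style of input usable eyes-free.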

The next video demonstrates using a typical touch screen calculator with a screen reader; this is the baseline we compared GestureCalc against in our study.

Transparency for Human-Robot Interaction

Publication: Situated Language Understanding with Human-like and Visualization-Based Transparency

Kubi (Language Teaching Robot)

Publication: Automatic Adaptation of Online Language Lessons for Robot Tutoring

Deictic Gesture Understanding

This project was about trying to get robots to understand pointing (deictic) gestures. I worked on it for the better part of two years, but in the end I failed to accomplish that goal, and I also failed to publish a manuscript about the dataset I collected. I've shared the project on my website as a reminder that failed research projects are really common! People just don't tend to talk about them much.

[Photo: A cat-sized humanoid Nao robot sits on a cart with a Kinect sensor above its head. A man points at a bookshelf while looking at the robot.]