I am a fourth-year Ph.D. student in Computer Science at the University of Washington, advised by Andy Ko and James Fogarty. In Summer 2018, I was a research intern at Google, working with Yang Li. Before that, I interned at the Adobe Research Creative Technologies Lab, working with Wilmot Li, Mira Dontcheva, Joel Brandt, and Morgan Dixon on Rewire, an interactive design tool that helps UX designers reuse example screenshots. I also spent three years as a software engineer and SDET at Microsoft, where I worked on a web interface framework for Dynamics AX.
I am interested in data-driven design, in creating tools that help designers adapt examples and explore variations, and in ways to enhance, repair, and improve the accessibility of interfaces without access to the original application source code. My projects have involved building interactive systems toward these interests, applying techniques including program analysis and synthesis, computer vision, machine learning, and pixel-based reverse engineering. Here is a link to my current CV.
Although the exploration of variations is a key part of interface design, current processes for creating variations are mostly manual. Scout is a system that helps designers explore many variations rapidly through mixed-initiative interaction with high-level constraints and design feedback. Scout allows designers to specify high-level constraints based on design concepts (e.g., emphasis). We have formalized several of these high-level constraints into their corresponding low-level spatial constraints, enabling many designs to be generated rapidly through constraint solving and program synthesis.
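To make the idea of lowering high-level constraints concrete, here is a toy sketch (not Scout's actual code; the element names, sizes, and the brute-force search are all illustrative assumptions). A hypothetical high-level "emphasize X" constraint is translated into two low-level spatial constraints, and layout variations satisfying them are enumerated:

```python
from itertools import permutations, product

# Hypothetical design elements and size scale (illustrative only).
ELEMENTS = ["headline", "image", "button"]
SIZES = ["small", "medium", "large"]

def satisfies(order, sizes, emphasized):
    """Lower the high-level constraint "emphasize X" into two
    low-level spatial constraints:
      (1) X is placed above all other elements;
      (2) no other element is rendered larger than X."""
    on_top = order[0] == emphasized
    largest = all(SIZES.index(sizes[emphasized]) >= SIZES.index(sizes[e])
                  for e in ELEMENTS)
    return on_top and largest

def variations(emphasized):
    """Enumerate all (ordering, sizing) combinations and keep the ones
    that satisfy the lowered constraints -- a brute-force stand-in for
    the constraint solver a real system would use."""
    results = []
    for order in permutations(ELEMENTS):
        for assignment in product(SIZES, repeat=len(ELEMENTS)):
            sizes = dict(zip(ELEMENTS, assignment))
            if satisfies(order, sizes, emphasized):
                results.append((order, sizes))
    return results
```

In a real tool the enumeration would be replaced by a solver, but the shape of the translation is the same: one designer-facing concept expands into several checkable spatial predicates.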
Tapping is an immensely important gesture in mobile touchscreen interfaces, yet people still frequently must learn which elements are tappable through trial and error. Predicting human behavior for this everyday gesture can help mobile app designers understand an important aspect of their apps' usability without having to run a user study. TapShoe is a deep learning model and approach for modeling the tappability of mobile interfaces at scale. For this project, we collected tappability annotations at large scale using crowdsourcing and computationally investigated the signifiers people use to distinguish tappable from not-tappable elements. We built a deep learning model to predict which interface elements people will perceive as tappable, and created an interface that identifies mismatches between an element's predicted tappability and its actual tappable state in code.
Interface designers often use screenshot images of example designs as building blocks for new designs. Since images are unstructured and hard to edit, designers typically reconstruct screenshots with vector graphics tools in order to reuse or edit parts of the design. Unfortunately, this reconstruction process is tedious and slow. Rewire is an interactive system that helps designers leverage example screenshots. Rewire automatically infers a vector representation of screenshots where each UI component is a separate object with editable shape and style properties. Rewire provides three design assistance modes that help designers reuse or redraw components of the example design.
I built a system called CogTool-Helper that automatically infers a model of an interface and generates storyboards and cognitive models, allowing UI designers to estimate human task performance in an interface. The system combines a software engineering tool for GUI testing (GUITAR) with CogTool, a system for human performance modeling.
You can contact me at email@example.com, or find me in the Paul Allen Center for CSE, Room 605.