I am a third-year Ph.D. student in Computer Science at the University of Washington, advised by Andy Ko and James Fogarty. I recently completed a second internship at the Adobe Research Creative Technologies Lab, where I worked with Wilmot Li, Mira Dontcheva, Joel Brandt, and Morgan Dixon on interactive tools to help UX designers reuse example screenshots. Before graduate school, I spent 3 years as a software engineer and SDET at Microsoft, where I helped build a new cloud-based web client for Dynamics AX.

I am interested in data-driven design and in ways to enhance, improve, and increase the accessibility of interfaces without access to the original application source code. Here is a link to my current CV.

Current Research

Rewire: Interface Design Assistance from Examples

Interface designers often use screenshot images of example designs as building blocks for new designs. Since images are unstructured and hard to edit, designers typically reconstruct screenshots with vector graphics tools in order to reuse or edit parts of the design. Unfortunately, this reconstruction process is tedious and slow. Rewire is an interactive system that helps designers leverage example screenshots. Rewire automatically infers a vector representation of screenshots where each UI component is a separate object with editable shape and style properties. Rewire provides three design assistance modes that help designers reuse or redraw components of the example design.

Genie: Input Retargeting on the Web through Command Reverse Engineering

I created an abstract model of a command and a set of methods for reverse engineering commands and command metadata from arbitrary web applications, for the purposes of command monitoring and retargeting inputs to alternate modalities (e.g., retargeting a web application built only for mouse input so its commands can be controlled by voice). The system uses JavaScript static and dynamic program analysis to discover commands and monitor their status, and is built as a Chrome extension. I am working on creating an open source version of the tool and will be making it available soon (TBA).
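To give a flavor of the idea, here is a minimal, hypothetical sketch of a command abstraction and a voice-retargeting step. All names and fields below are illustrative only; Genie's actual model is recovered automatically from the page via program analysis rather than declared by hand.

```javascript
// Hypothetical sketch: a command with metadata and a handler, plus a
// retargeting function that maps a spoken word onto a mouse-only command.
// These names are illustrative and are not Genie's actual API.
class Command {
  constructor(label, trigger) {
    this.label = label;     // command metadata (e.g., the menu item's text)
    this.trigger = trigger; // handler discovered for the original input modality
    this.enabled = true;    // command state, kept current by monitoring
  }

  invoke() {
    if (this.enabled) this.trigger();
  }
}

// Retargeting: fire the command whose label matches a recognized spoken word.
function retargetToVoice(commands, spokenWord) {
  const match = commands.find(
    c => c.label.toLowerCase() === spokenWord.toLowerCase()
  );
  if (match) match.invoke();
  return Boolean(match); // true if some command handled the utterance
}
```

For example, a "Save" command originally bound to a toolbar button click could be invoked by the spoken word "save" without modifying the application's source.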

Past Research

CogTool-Helper: Generating Predictive Human Performance Models from Interfaces

I built a system called CogTool-Helper that automatically infers a model of an interface and generates storyboards and cognitive models that allow UI designers to estimate human task performance in an interface. This system combines tools from software engineering for GUI testing (GUITAR) with CogTool, a system for human performance modeling.


Conference Publications

Contact Me

You can contact me at amaswea@cs.washington.edu, or find me in the Paul Allen Center for CSE, Room 510.