The Everyday Sensing and Perception (ESP) project sought to develop
technology that can
infer a user’s context with 90% accuracy over 90% of their day. ESP
focused on the perception of
everyday situations that many context-aware applications depend on.
Specifically, ESP developed algorithms to infer:
Where is the user, in both absolute (latitude, longitude) and
symbolic (grocery store) terms?
What is the user doing right now in terms of physical (standing) and
object-based (washing dishes) activity?
What gesture is the user making with their hands, how are they standing,
and what are they pointing at?
Who is the user interacting with, and what role are they acting in?
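To make these inference targets concrete, the sketch below shows one way a single inferred context sample answering all four questions might be represented in Python. Every field name and example value is an illustrative assumption, not ESP's actual data model.

```python
# A minimal sketch (illustrative only, not ESP's data model) of one
# inferred context sample covering the four questions above.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ContextSample:
    lat_lon: Tuple[float, float]   # where, absolute: e.g., (47.61, -122.33)
    place: Optional[str]           # where, symbolic: e.g., "grocery store"
    posture: str                   # what, physical: e.g., "standing"
    activity: Optional[str]        # what, object-based: e.g., "washing dishes"
    gesture: Optional[str]         # hand gesture: e.g., "pointing"
    pointing_at: Optional[str]     # pointing target: e.g., "shelf"
    partner: Optional[str]         # who: e.g., "coworker"
    role: Optional[str]            # interaction role: e.g., "listener"
    confidence: float = 0.0        # inference confidence in [0, 1]
```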
To reach a 90% level of coverage, the ESP research approach used
sensors integrated into a user’s mobile devices to sense their
environment and how they interact with it. ESP investigated
low-power, low-data-rate sensors (e.g., RFID tags, accelerometers, and
radios), as well as high-data-rate sensors (e.g., video cameras and
microphones). To achieve a 90% level of accuracy, ESP developed
algorithms employing joint modeling of video and audio data with
other worn sensors, on-the-fly refinement of user models with online
learning, parallelization of machine learning algorithms, and
compressive sensing and synopsis-based reasoning for mobile devices.
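As a concrete illustration of one of these techniques, the sketch below shows on-the-fly refinement of a user's activity model with online learning. It is a minimal sketch assuming synthetic accelerometer-window features and scikit-learn's SGDClassifier; the feature extraction, labels, and model choice are placeholders, not ESP's actual pipeline.

```python
# A minimal sketch of online refinement of a per-user activity model,
# assuming synthetic accelerometer windows; not ESP's actual pipeline.
import numpy as np
from sklearn.linear_model import SGDClassifier

ACTIVITIES = ["sitting", "standing", "walking"]

def window_features(accel_window: np.ndarray) -> np.ndarray:
    """Summarize an (n_samples, 3) accelerometer window with simple
    per-axis mean and standard deviation features."""
    return np.concatenate([accel_window.mean(axis=0),
                           accel_window.std(axis=0)])

# Start from a generic model and refine it one labeled window at a time.
model = SGDClassifier()

rng = np.random.default_rng(0)
for step in range(100):
    # Placeholder for a real labeled window from the user's device.
    accel_window = rng.normal(size=(50, 3))
    label = ACTIVITIES[step % 3]
    x = window_features(accel_window).reshape(1, -1)
    # partial_fit performs one incremental SGD update, so the model
    # adapts to the user without retraining from scratch.
    model.partial_fit(x, [label], classes=ACTIVITIES)
```

The same incremental-update pattern would apply to other linear models; the choice of scikit-learn here is purely illustrative.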
The ESP project also investigated a variety of applications and new
device form factors that are enabled by rich, continuous context
information. We specifically focused on systems that coupled
camera-based perception with projectors and tablet displays.