What would it mean if an electronic device knew more about your partner's state of mind than you do? Or could predict an incoming bout of misery through statistical analysis of accumulated data? At what point does technology become too invasive?
Happylife demonstrates a scary vision of what the collection and presentation of personal data can look like. The installation accumulates, processes and displays human feelings and, in doing so, learns to predict changes in mood and behavior. It is a research project by the EPSRC, the Royal College of Art and NESTA, and was part of the Impact exhibition, a collaboration between science and design that explores the importance of engineering and the physical sciences in all aspects of our lives.
How it works
The real-time dynamic passive profiling technique is based on modelling facial expressions, eye movement and pupil changes in both the visual and thermal domains, and on linking these to intent and to physiological processes such as blood flow, eye movement patterns and pupil dilation. To facilitate this, one of the initial aspects of the project is the collection, analysis and development of a dataset used to model a baseline of facial behaviour across the general population, against which the physiological behaviour of people with malicious intent can be detected. Both the baseline and the dynamic profiling are based on responses to a series of questions.
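The baseline-versus-deviation idea above can be sketched in a few lines. This is a minimal illustration, not the project's actual model: it assumes a single physiological signal (here, pupil dilation) summarised as a population mean and standard deviation, with atypical responses flagged by their distance from that baseline.

```python
# Hypothetical sketch of baseline modelling and deviation detection.
# The signal choice, sample values and z-score approach are illustrative
# assumptions, not details of the actual Happylife research.
from statistics import mean, stdev

def build_baseline(samples):
    """Model the general-population baseline for one signal
    as a mean and standard deviation."""
    return mean(samples), stdev(samples)

def deviation_score(baseline, reading):
    """How many standard deviations a new reading sits from the
    baseline; a large score would flag an atypical response."""
    mu, sigma = baseline
    return abs(reading - mu) / sigma

# Population readings of pupil dilation (mm) while answering a question.
population = [3.1, 3.4, 2.9, 3.2, 3.3, 3.0, 3.5, 3.2]
baseline = build_baseline(population)
print(deviation_score(baseline, 4.6))  # a reading far outside the baseline
```

In this toy setup the baseline works out to a mean of 3.2 mm with a standard deviation of 0.2 mm, so a 4.6 mm reading scores seven standard deviations out; a real system would of course model many signals jointly and account for individual variation.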
The data is represented on a visual display linked to a thermal imaging camera, which employs facial recognition to differentiate between members of the family. Each member has one rotary dial and one RGB LED display, effectively acting as an emotional barometer. These show the current state and a predicted state, the prediction being based on years of accumulated statistical data.
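The barometer mapping could work along these lines. This is a hypothetical sketch: the mood scale of [-1, 1], the 270° dial sweep and the blue-to-red colour scheme are all assumptions for illustration, not the installation's documented behaviour.

```python
# Illustrative 'emotional barometer': mapping a mood value in [-1, 1]
# onto a rotary-dial angle and an RGB LED colour.
# Scale, sweep and colours are assumptions, not the actual design.

def dial_angle(mood, sweep=270):
    """Map mood in [-1, 1] to a dial angle in degrees across the sweep."""
    mood = max(-1.0, min(1.0, mood))
    return (mood + 1) / 2 * sweep

def led_colour(mood):
    """Blend the LED from blue (low mood) to red (high mood)."""
    mood = max(-1.0, min(1.0, mood))
    t = (mood + 1) / 2
    return (round(255 * t), 0, round(255 * (1 - t)))

# One dial/LED pair per family member: current vs predicted state.
current, predicted = 0.4, -0.2
print(dial_angle(current), led_colour(predicted))
```

In the real installation the predicted value would come from the accumulated statistical model rather than a hand-set number, but the display logic reduces to a mapping of this kind.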