
emotIOn 2.0


2019

Sensor Fusion For Classification of Human Affective States

emotIOn 2.0 is the second version of the emotIOn system. The project's goal is to provide a framework for classifying human affective states using a wearable data collection and labeling system.

What makes up an affective state?

Affective states arise from a mixture of an effectively infinite range of variables that can be mapped into four broad categories: Environmental State, Memories, Social Context, and Biometrics. The presented model combines environmental and biometric inputs to classify specific affective states; incorporating more variables into the system would allow for a stronger model.
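
As a rough sketch of what this fusion can look like in code, the sample below combines a few illustrative environmental and biometric channels into a single feature vector. The channel names are assumptions chosen for illustration, not the project's actual sensor list:

```python
from dataclasses import dataclass

@dataclass
class AffectiveSample:
    """One fused observation. Channel names are illustrative placeholders."""
    temperature_c: float    # environmental
    light_lux: float        # environmental
    noise_db: float         # environmental
    heart_rate_bpm: float   # biometric

    def to_feature_vector(self) -> list[float]:
        # Flatten the fused channels into the model's input vector.
        return [self.temperature_c, self.light_lux,
                self.noise_db, self.heart_rate_bpm]
```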

Data Acquisition System

One of the main design goals for the project was to develop an acquisition system that users could wear on a normal day-to-day basis, enabling continuous data collection. Affective states vary naturally over the course of a day, so experimentation had to follow a “living laboratory” approach, in which the system collected a subject’s data over long time spans (4-8 hours).
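
A minimal sketch of such a long-running logger is shown below. The `read_sensors` callable, file format, and sampling interval are assumptions for illustration, not the project's actual acquisition code:

```python
import csv
import time
from datetime import datetime, timezone

def log_session(read_sensors, path="session.csv",
                interval_s=1.0, duration_s=4 * 3600):
    """Append timestamped readings for a long "living laboratory" session.

    `read_sensors` is a hypothetical callable returning a dict of
    channel name -> value for one sample.
    """
    t_end = time.monotonic() + duration_s
    with open(path, "a", newline="") as f:
        writer = None
        while time.monotonic() < t_end:
            reading = read_sensors()
            row = {"timestamp": datetime.now(timezone.utc).isoformat(),
                   **reading}
            if writer is None:
                # Write the header once, based on the first sample's channels.
                writer = csv.DictWriter(f, fieldnames=list(row))
                writer.writeheader()
            writer.writerow(row)
            time.sleep(interval_s)
```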

How do users guide themselves to label their emotions?

It is difficult for most people to understand and label their affective states. To overcome the difficulties of self-labeling, the mobile application interface was based on the Circumplex Model of Affect (a model widely used by researchers studying affective states), which allows users to locate and discretize their current affective state by answering two simple questions: Are you calm or agitated? Is your feeling pleasant or unpleasant?

Answers were recorded by the mobile app and used to label incoming data from the environmental sensor for future processing.
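
The discretization step can be sketched as a simple mapping from the two answers onto a quadrant of the circumplex. The quadrant names below are common circumplex labels used here for illustration, not necessarily the labels used in the study:

```python
def circumplex_quadrant(agitated: bool, pleasant: bool) -> str:
    """Map the two app questions onto a circumplex quadrant.

    Arousal axis: calm vs. agitated; valence axis: unpleasant vs. pleasant.
    """
    if agitated and pleasant:
        return "excited"      # high arousal, positive valence
    if agitated and not pleasant:
        return "distressed"   # high arousal, negative valence
    if not agitated and pleasant:
        return "relaxed"      # low arousal, positive valence
    return "depressed"        # low arousal, negative valence
```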

Mobile App

Users recorded their affective states in the mobile app while wearing the environmental sensor, so the data was labeled live as it was collected. The app was built with Android Studio and is available on the Google Play Store.

emotIOn App Interface design for input of Affective states.

Feed Forward Neural Network Architecture
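
The exact layer sizes and hyperparameters of the published architecture are not reproduced here; the Keras sketch below only illustrates the general shape of such a feed-forward classifier over the fused feature vector (layer widths, optimizer, and the four-class output are assumptions):

```python
import tensorflow as tf

def build_classifier(n_features: int, n_classes: int = 4) -> tf.keras.Model:
    """Small fully connected network: fused sensor features -> affective-state label."""
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(n_features,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),  # one unit per label
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage with labeled session data:
# model = build_classifier(n_features=4)
# model.fit(X_train, y_train, epochs=50, validation_split=0.2)
```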

Results

Experiments were conducted with six different users of the system; each user wore the system continuously for a different time interval. The model is able to classify unseen affective states with an accuracy of up to 95%. Future improvements to the system include expanding the measured variables to cover more Biometric and Social Context signals.


Publication

Work done with emotIOn 1.0 and emotIOn 2.0 was presented at MICAI 2019 (Mexican International Conference on Artificial Intelligence) and published in Springer’s Lecture Notes in Computer Science volume Advances in Soft Computing.

Rico, A., Garrido, L. (2019). Feed Forward Classification Neural Network for Prediction of Human Affective States Using Continuous Multi Sensory Data Acquisition. In: Martínez-Villaseñor, L., Batyrshin, I., Marín-Hernández, A. (eds) Advances in Soft Computing. MICAI 2019. Lecture Notes in Computer Science, vol 11835. Springer, Cham.

If you are interested in the technical details of this work, click HERE.

Publication is available upon request.


emotIOn 1.0

emotIOn 1.0 was the first iteration of this project. Key differences between the versions are the distinction between emotional states and affective states, the integration of a mobile app for scaling user tests, and the incorporation of the Circumplex Model of Affect to evaluate an individual’s affective states as they use the system.

Learn more about emotIOn 1.0 by clicking HERE!


Credits and Acknowledgements

Carson Smuts (terMITe development) | Jason Nawyn (Project Critique) | Leonardo Garrido (Machine Intelligence Advisor)