
Automatized AI-based analysis of non-verbal communication signals in interpersonal communication

Reference number 2019-00309
Coordinator LENNART HÖGMAN AB
Funding from Vinnova SEK 298 133
Project duration March 2019 - November 2019
Status Completed
Venture Innovative Startups

Purpose and goal

The global market for emotion AI (E-AI) is expected to reach USD 25-50 billion by 2023. We have developed a new type of E-AI that specializes in the analysis of non-verbal interpersonal interaction. This type of AI has the potential to radically change how non-verbal signals are analyzed and evaluated. The product can be used in research, education, and healthcare, as well as in the entertainment industry. It has been trained to identify pro-social signals and signs, and can also be used to detect aspects such as dominance or the presence of masked emotional signals.

Expected results and effects

We expect the system to be used in a number of different areas: partly in basic research, but also, for example, in education, psychotherapy, psychiatric assessments, employment interviews, and negotiations, and in the longer term also in the computer-games and entertainment industry.

Planned approach and implementation

The system currently consists of four parts: (1) a camera and microphone system that enables synchronized recording of two or more interacting people; (2) a system for the analysis of emotional non-verbal signals; (3) a system for annotating the recordings and establishing ground truth; and (4) a system of neural networks trained to classify non-verbal interaction patterns. The network models are trained both on theoretical grounds and on the basis of annotations made by trained psychologists.
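The four-part pipeline above can be sketched in miniature as follows. This is a hypothetical illustration, not the project's actual implementation: all class names, feature names (`smile`, `gaze`), and labels are invented assumptions, and a toy nearest-centroid classifier stands in for the trained neural networks of part (4).

```python
# Illustrative sketch of the four-part pipeline; all names are assumptions.
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Frame:
    """One synchronized sample for a single speaker (parts 1 and 2):
    a timestamp plus non-verbal features extracted from audio/video."""
    timestamp: float
    speaker: str
    features: Dict[str, float]


@dataclass
class Annotation:
    """A ground-truth label assigned by a trained psychologist (part 3)."""
    timestamp: float
    label: str  # e.g. "pro-social", "dominant", "masked"


def build_training_set(frames: List[Frame],
                       annotations: List[Annotation],
                       tolerance: float = 0.5) -> List[Tuple[Dict[str, float], str]]:
    """Pair each frame with the nearest annotation in time to form
    (features, label) training examples."""
    pairs = []
    for frame in frames:
        closest = min(annotations, key=lambda a: abs(a.timestamp - frame.timestamp))
        if abs(closest.timestamp - frame.timestamp) <= tolerance:
            pairs.append((frame.features, closest.label))
    return pairs


class CentroidClassifier:
    """Toy stand-in for the neural-network component (part 4): classifies
    a frame by the nearest per-label centroid of its feature vector."""

    def fit(self, pairs: List[Tuple[Dict[str, float], str]]) -> "CentroidClassifier":
        sums: Dict[str, Dict[str, float]] = {}
        counts: Dict[str, int] = {}
        for features, label in pairs:
            counts[label] = counts.get(label, 0) + 1
            centroid = sums.setdefault(label, {})
            for key, value in features.items():
                centroid[key] = centroid.get(key, 0.0) + value
        self.centroids = {
            label: {k: v / counts[label] for k, v in centroid.items()}
            for label, centroid in sums.items()
        }
        return self

    def predict(self, features: Dict[str, float]) -> str:
        def sq_dist(centroid: Dict[str, float]) -> float:
            return sum((features.get(k, 0.0) - v) ** 2 for k, v in centroid.items())
        return min(self.centroids, key=lambda label: sq_dist(self.centroids[label]))
```

A usage sketch: fit the classifier on annotated frames, then label a new feature vector.

```python
frames = [Frame(0.0, "A", {"smile": 0.9, "gaze": 0.8}),
          Frame(1.0, "B", {"smile": 0.1, "gaze": 0.2})]
annotations = [Annotation(0.0, "pro-social"), Annotation(1.0, "dominant")]
clf = CentroidClassifier().fit(build_training_set(frames, annotations))
clf.predict({"smile": 0.85, "gaze": 0.75})  # -> "pro-social"
```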

The project description has been provided by the project members themselves, and the text has not been reviewed by our editors.

Last updated 24 January 2020
