
Detecting hate in digital environments

Reference number 2019-00934
Coordinator TOTALFÖRSVARETS FORSKNINGSINSTITUT, FOI - Avdelningen för Försvars- och säkerhetssystem
Funding from Vinnova SEK 413 500
Project duration April 2019 - January 2020
Status Completed
Venture Challenge-Driven Innovation – Stage 1 initiation
Call Challenge-Driven Innovation stage 1 initiation – 2019

Purpose and goal

The goal of the project has been to develop a digital tool that automatically recognizes hate speech in Swedish and helps reduce hate speech in digital environments. An initial model has been developed, but more tests and more annotated data are needed before a model with satisfactory performance can be obtained.

Expected results and effects

Within the project, we have increased our understanding of digital hate. Among other things, we launched the website 'hatomaten', where anyone could submit hateful texts to contribute to the development of our hate detection model. The aim of 'hatomaten' was also to provide information about the initiative against online hate. Initial tests of our hate detection model's ability to detect hate have been conducted by the Police Authority. The ambition is that the police will eventually be able to use similar techniques in their work to identify threats.

Planned approach and implementation

FOI and Uppsala University have worked on developing machine learning models trained on annotated data to recognize hate. The police have tested one model and provided feedback. In order to collect data that can be used to develop better models, the website hatomaten.se was also launched, where the public could assist the development of the machine learning models by contributing training data.
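The project description does not specify the model architecture or training setup. As a rough illustration of the general approach described here (supervised classification trained on annotated text), the sketch below fits a TF-IDF and logistic regression baseline with scikit-learn. All example strings and labels are placeholders, not project data, and the character n-gram choice is an assumption rather than a documented project decision.

# Minimal sketch of a hate-speech classifier trained on annotated Swedish text.
# This is an illustrative baseline, not the project's actual model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical annotated training data: (text, label) with 1 = hateful, 0 = not.
texts = [
    "exempel på hatiskt inlägg",        # placeholder, not real data
    "ett helt vanligt vänligt inlägg",  # placeholder, not real data
]
labels = [1, 0]

# Character n-grams often work reasonably for Swedish without language-specific
# preprocessing; this is an assumption, not a documented project choice.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 5)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

# Predict whether a new post is likely hateful (1) or not (0).
print(model.predict(["nytt inlägg att klassificera"]))

In practice, a model like this would be trained on the annotated data collected via hatomaten.se and evaluated against feedback such as the tests conducted by the Police Authority.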


The project description has been provided by the project members themselves and the text has not been reviewed by our editors.

Last updated 6 March 2020

