
Site Specific Forestry AI

Reference number 2020-01132
Coordinator Deep Forestry AB
Funding from Vinnova SEK 270 000
Project duration June 2020 - May 2021
Status Completed
Venture Innovative Startups
Call Innovative Startups step 1 spring 2020

Important results from the project

** This text has been machine translated ** The Site Specific Forestry AI project was carried out with results that exceeded expectations. A system was developed for annotating data with a handheld device and assigning attributes to objects of interest. Tests show good results, with attributes such as conservation values assigned to objects. The data was subsequently used to train Deep Forestry's SeMaFore AI algorithm, which with this enhanced training can be trained for new classes with a fraction of the data previously required.
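SeMaFore's actual architecture and training pipeline are not described in this summary, but one plausible reading of "trained for new classes with a fraction of the data" is fine-tuning a pretrained vision model on a small, newly annotated dataset. The sketch below illustrates that general idea with a generic torchvision backbone; the model choice, class count, and data loader are assumptions for illustration only, not Deep Forestry's implementation.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models

NUM_NEW_CLASSES = 4  # hypothetical number of new attribute classes, e.g. conservation values

# Start from a backbone pretrained on a large generic dataset.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so only a small head is learned;
# this is what allows training on a fraction of the usual amount of data.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head with one sized for the new classes.
model.fc = nn.Linear(model.fc.in_features, NUM_NEW_CLASSES)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def fine_tune(small_annotated_loader: DataLoader, epochs: int = 10) -> None:
    """Train only the new head on the small annotated dataset."""
    model.train()
    for _ in range(epochs):
        for images, labels in small_annotated_loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()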

Expected long term effects

** This text has been machine translated ** The outcome of the project exceeded expectations. The project's objective of facilitating data collection and annotation was achieved, and an operational "AI Training Data Acquisition Device" was developed. This is a groundbreaking way to collect and annotate data that will accelerate development many times over. Existing forestry expertise can now be used to develop new digitised tools and quantified methods, and the industry gains new opportunities to measure parameters that previously could only be estimated subjectively.

Approach and implementation

** This text has been machine translated ** The project was carried out in close collaboration with Deep Forestry's former partners in academia, combining cutting-edge research in computer vision and machine learning with Deep Forestry's expertise in the digitalisation of forestry.

The project description has been provided by the project members themselves and the text has not been reviewed by our editors.

Last updated 29 June 2021
