Multi Sensor Image-Based Navigation
Reference number |
Coordinator | Saab AB - SAAB Aktiebolag Aeronautics
Funding from Vinnova | SEK 2 400 000
Project duration | November 2020 - April 2024
Status | Completed
Venture | Sweden Brazil Innovation Initiative
Important results from the project
The goal of the project was to develop methods and algorithms to support inertial navigation with camera information. The camera information, in this case optical flow, is produced using machine learning (ML) methods in the form of deep neural networks (DNNs). The developed methods were planned to be evaluated on real flight test data.
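The summary does not describe the fusion design itself. Purely as an illustrative sketch of how camera-derived optical flow can aid an inertial solution, the example below propagates a horizontal velocity estimate from accelerometer input and corrects it with a flow-derived velocity measurement in a small Kalman filter. The class name, noise values, and the height/focal-length conversion from image flow to ground velocity are assumptions for illustration, not the project's published algorithm.

```python
import numpy as np


class FlowAidedVelocityFilter:
    """Minimal Kalman filter: propagate horizontal velocity with IMU
    acceleration, correct it with a velocity measurement derived from
    camera optical flow. Illustrative sketch only."""

    def __init__(self, accel_noise=0.5, flow_noise=0.2):
        self.v = np.zeros(2)                   # estimated velocity [m/s]
        self.P = np.eye(2)                     # estimate covariance
        self.Q = np.eye(2) * accel_noise**2    # process noise (accelerometer)
        self.R = np.eye(2) * flow_noise**2     # measurement noise (flow)

    def predict(self, accel_xy, dt):
        """Dead-reckon velocity from horizontal acceleration over dt seconds."""
        self.v = self.v + accel_xy * dt
        self.P = self.P + self.Q * dt

    def update_with_flow(self, flow_px_per_s, height_m, focal_px):
        """Convert mean image flow (pixels/s) to ground velocity (m/s),
        assuming a downward-looking camera at known height, then fuse."""
        z = flow_px_per_s * height_m / focal_px   # flow-derived velocity
        S = self.P + self.R                       # innovation covariance
        K = self.P @ np.linalg.inv(S)             # Kalman gain
        self.v = self.v + K @ (z - self.v)
        self.P = (np.eye(2) - K) @ self.P
        return self.v


# Example: one IMU prediction step followed by one camera correction.
kf = FlowAidedVelocityFilter()
kf.predict(accel_xy=np.array([0.1, 0.0]), dt=0.01)
v_est = kf.update_with_flow(flow_px_per_s=np.array([12.0, -3.0]),
                            height_m=100.0, focal_px=800.0)
print("fused velocity estimate:", v_est)
```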
Expected long term effects
The project resulted in algorithms and methods for sensor fusion aimed at localizing flying platforms. Results have also been published as conference papers at international conferences. The evaluation of the developed methods on real flight test data was postponed due to lack of time within the project; it can be carried out within Saab's own activities in the future.
Approach and implementation
The project was carried out through research performed by a postdoctoral researcher, Jeongmin Kang, under the supervision of staff from Linköping University and Saab. The research involved developing algorithms and methods to solve the localization task, comparing machine learning methods with traditional ones, and conducting a state-of-the-art survey of the field. Evaluation of the developed methods was carried out on public data.
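The comparison between learned and traditional optical-flow methods is not detailed in the summary. As a hedged illustration of what a "traditional" baseline typically looks like, the sketch below computes dense optical flow between two consecutive frames with OpenCV's Farnebäck method; a DNN-based estimator would replace this single call in the comparison. The file names and parameter values are assumptions for illustration only.

```python
import cv2
import numpy as np

# Hypothetical consecutive grayscale frames from a downward-looking camera.
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Classical dense optical flow (Farnebäck) as the "traditional" baseline.
flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# Mean image motion in pixels per frame; combined with frame rate, camera
# height, and focal length this can serve as a velocity aiding measurement.
mean_flow = flow.reshape(-1, 2).mean(axis=0)
print("mean flow (px/frame):", mean_flow)
```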