
Live 3D: Real-time Digital Twins of Dynamic Events

Reference number: 2024-02282
Coordinator: I-CONIC Vision AB
Funding from Vinnova: SEK 996 533
Project duration: November 2024 - June 2025
Status: Ongoing
Venture: Acceleration of deep tech companies
Call: Acceleration of deep tech companies 2024

Purpose and goal

The overall aim of the project is to create Live 3D, a complete system for generating 3D models in real time. Unlike current solutions, which require complex setups in dynamic environments, Live 3D delivers a cost-effective, flexible and fast solution that produces measurable 3D models of both static and dynamic objects. In this first phase, we aim to validate the technology of using multiple drones and create the conditions for the next step in development.

Expected effects and result

With the help of Vinnova's funding, we strive to overcome the most critical challenges in the development of the technology, thereby speeding up our path towards product maturity. The funding will enable us to move from our current product, Static 3D, which is a cornerstone in the development of Live 3D, to initial demonstrations and thus attract customers and investors. With Vinnova's support, we pave the way for the next generation of 3D modeling.

Planned approach and implementation

The project builds on existing GPU-implemented 3D generation modules that efficiently create geocorrected, measurable and navigable 3D models. In this first phase, we want to validate the technology by using multiple drones to produce Static 3D: in a controlled and relevant environment, we collect data, then synchronize and integrate the video streams from the multiple cameras into a static 3D model. The result will provide the basis for further development towards Live 3D.
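The synchronization step described above can be illustrated with a minimal sketch. The project description does not specify how I-CONIC Vision synchronizes its streams, so the approach below is an assumption for illustration only: it groups frames from multiple camera streams by timestamp, matching each frame of a reference stream with the nearest frame from every other stream that falls within a tolerance window. All names (`Frame`, `synchronize`, the tolerance value) are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Frame:
    stream_id: str    # which drone/camera the frame came from
    timestamp: float  # capture time in seconds (assumes a shared clock)
    data: object      # image payload (placeholder here)

def synchronize(streams: Dict[str, List[Frame]],
                tolerance: float = 0.02) -> List[Dict[str, Frame]]:
    """Group one frame per stream whose timestamps lie within `tolerance`
    seconds of a reference stream's frames. Frames are assumed time-sorted."""
    ref_id = min(streams)                  # arbitrary choice of reference stream
    cursors = {sid: 0 for sid in streams}  # next candidate frame per stream
    groups = []
    for ref_frame in streams[ref_id]:
        group = {ref_id: ref_frame}
        for sid, frames in streams.items():
            if sid == ref_id:
                continue
            i = cursors[sid]
            # skip frames that arrived too early to match this reference frame
            while i < len(frames) and frames[i].timestamp < ref_frame.timestamp - tolerance:
                i += 1
            cursors[sid] = i
            if i < len(frames) and abs(frames[i].timestamp - ref_frame.timestamp) <= tolerance:
                group[sid] = frames[i]
        if len(group) == len(streams):     # keep only fully matched frame sets
            groups.append(group)
    return groups
```

In a real multi-drone setup, the timestamps would come from hardware triggering or GNSS-disciplined clocks rather than a shared software clock, and unmatched frames would need a more careful policy than being dropped.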

The project description has been provided by the project members themselves and the text has not been reviewed by our editors.

Last updated 13 November 2024

Reference number 2024-02282