
6G Master Thesis: Resource Efficient Large Language Models at the Edge

Reference number 2024-04254
Coordinator RISE Research Institutes of Sweden AB - RISE AB - Digitala System
Funding from Vinnova SEK 100 000
Project duration January 2025 - June 2025
Status Ongoing
Venture 6G - Competence supply
Call 6G - Supervision of degree work

Purpose and goal

The overall aim of the project is to develop computationally efficient large language models (LLMs) for use in resource-constrained 6G edge environments. The goal of the thesis is therefore to explore energy-efficient transformer-based language models by applying techniques such as knowledge distillation and model quantization.
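As a rough illustration of the kind of techniques referred to, the sketch below shows a minimal knowledge-distillation training step in PyTorch. The toy teacher and student networks, batch shapes, temperature, and loss weighting are hypothetical placeholders for illustration only and are not part of the project plan.

```python
# Minimal knowledge-distillation sketch (illustrative only):
# a small student is trained to match a larger teacher's soft outputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend soft-target KL divergence with ordinary cross-entropy."""
    # Soft targets: teacher distribution softened by the temperature.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Hypothetical toy models standing in for a large teacher and a small student.
teacher = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, 10))
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)

x = torch.randn(32, 128)              # dummy batch of features
labels = torch.randint(0, 10, (32,))  # dummy labels

with torch.no_grad():                 # teacher is frozen during distillation
    teacher_logits = teacher(x)
student_logits = student(x)

loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
optimizer.step()
```

In a real setting the teacher would be a pretrained transformer LLM and the student a smaller model intended for deployment at the edge; the blending weight and temperature are the usual tuning knobs of this approach.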

Expected effects and result

Expected results from the project include a comparison, based on numerical experiments, between a new alternative architecture and conventional transformer models. In a broader perspective, the project has the potential to contribute to improved computational efficiency and more sustainable AI services in 6G edge environments.

Planned approach and implementation

The project will be implemented in several steps. First, the student conducts a literature study to survey existing techniques and models. Then, the student carries out practical experiments to train and optimize new computationally efficient language models, which are compared with conventional transformer models through numerical experiments (see the sketch below for what such a comparison could look like). Finally, the results are analyzed and documented, and the methods and results are released as open source code.
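As an illustration of one kind of numerical comparison, the sketch below applies PyTorch post-training dynamic quantization to a toy model and measures per-inference latency and serialized model size against the fp32 baseline. The model, input shape, and run count are hypothetical; the actual experiments in the project would target real transformer models and more thorough benchmarks.

```python
# Illustrative comparison sketch: fp32 baseline vs. int8 dynamic quantization.
import io
import time
import torch
import torch.nn as nn

# Toy stand-in for a baseline model (not a transformer).
model_fp32 = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))
model_fp32.eval()

# Post-training dynamic quantization: Linear weights stored as int8.
model_int8 = torch.ao.quantization.quantize_dynamic(
    model_fp32, {nn.Linear}, dtype=torch.qint8
)

def benchmark(model, runs=100):
    """Average wall-clock time per forward pass on a dummy input."""
    x = torch.randn(1, 512)
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(runs):
            model(x)
    return (time.perf_counter() - start) / runs

def size_mb(model):
    """Approximate serialized size of the model's state dict in MB."""
    buf = io.BytesIO()
    torch.save(model.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32: {benchmark(model_fp32) * 1e3:.2f} ms, {size_mb(model_fp32):.2f} MB")
print(f"int8: {benchmark(model_int8) * 1e3:.2f} ms, {size_mb(model_int8):.2f} MB")
```

A comparison in the project itself would additionally report task quality (for example perplexity or downstream accuracy) alongside latency, memory, and energy, since compression techniques trade these quantities against each other.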

The project description has been provided by the project members themselves and the text has not been reviewed by our editors.

Last updated 20 March 2025
