Overview of Multiverse Computing’s Breakthrough
Multiverse Computing has raised €189 million (around $215 million) in a Series B funding round. The funding will support its compression technology, CompactifAI, which the company claims can shrink large language models (LLMs) by up to 95% while maintaining their performance. This advancement positions Multiverse as a key player in the quantum-computing-inspired tech landscape.
Key Details About the Funding and Technology
- CompactifAI allows Multiverse to offer compressed versions of popular open-source LLMs, such as Llama 4 Scout and Mistral Small 3.1.
- The company’s models, termed “slim,” are available through Amazon Web Services or can be licensed for local use.
- These models are 4 to 12 times faster than their non-compressed counterparts, resulting in a substantial reduction in inference costs.
- Multiverse aims to make its models small enough to run on various devices, including PCs, phones, and even Raspberry Pi units.
Significance of the Funding and Future Prospects
The funding round was led by Bullhound Capital, with participation from several notable investors. The capital will help Multiverse expand its offerings and customer base, which already includes major companies such as Iberdrola and Bosch. With 160 patents and 100 global customers, Multiverse is well positioned to lead advancements in model compression, making AI technology more accessible and efficient. This technology could change how AI is integrated into everyday devices, paving the way for more intelligent and responsive systems.