Multiverse Debuts CompactifAI App for Compressed AI Models
In Focus
- Multiverse has embedded its Gilda AI model in CompactifAI
- The AI tool runs directly on user devices
- CompactifAI works like chatbots such as Le Chat and ChatGPT
Spanish AI startup Multiverse Computing is introducing the CompactifAI app. According to TechCrunch, the new tool works both as an app showcasing compressed versions of mainstream AI models and as an API portal where developers can build with those models. Multiverse's AI model compression highlights an emerging trend: agentic tools that run directly on user devices instead of relying on external compute infrastructure.
CompactifAI Runs on User Devices
As with the Mistral-owned Le Chat and OpenAI's ChatGPT, users can ask CompactifAI questions and get AI-generated responses. The main difference is that Multiverse has embedded its Gilda model in the chat tool. Gilda is a small AI model that runs locally on user devices, without requiring an internet connection.
Multiverse's AI model compression technology is attractive to many end users because it keeps data on their devices. A major downside, however, is that users need sufficient storage space and RAM on their devices.
When either runs short, CompactifAI switches to cloud-based models. The shift from local to cloud-based processing is handled automatically by a system called Ash Nazg. With this change, users lose the privacy that comes with on-device processing. These limitations have hindered mass adoption: fewer than 5,000 users downloaded the app last month.
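Multiverse has not published how this fallback decision is made, but the pattern described above can be sketched as a simple resource check. The thresholds and function name below are illustrative assumptions, not Ash Nazg's actual logic:

```python
# Hypothetical sketch of a local-vs-cloud fallback check.
# Thresholds are assumed values, not figures published by Multiverse.
MIN_DISK_GB = 2.0  # assumed space needed for local model weights
MIN_RAM_GB = 4.0   # assumed working memory for local inference

def choose_backend(free_disk_gb: float, free_ram_gb: float) -> str:
    """Return 'local' when the device has enough resources, else 'cloud'."""
    if free_disk_gb >= MIN_DISK_GB and free_ram_gb >= MIN_RAM_GB:
        return "local"  # private, on-device inference
    return "cloud"      # fallback: requests leave the device
```

The trade-off the article describes falls out directly: a device reporting, say, 10 GB of free disk but only 1 GB of free RAM would be routed to the cloud, giving up the privacy of on-device processing.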
Strong Focus on Enterprises and Developers
Multiverse also launched a self-serve API portal that gives users real-time usage data. Through this portal, enterprises and developers can access AI models directly, without having to go through the AWS Marketplace.
“The CompactifAI API portal now gives developers direct access to compressed models with the transparency and control needed to run them in production,” Multiverse CEO Enrique Lizaso noted.
With the API portal, developers and enterprises can deploy AI products faster and reduce compute costs. These benefits are among the reasons enterprises are starting to consider compressed local models a viable alternative to full-scale large language models.
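The article does not document the portal's request format. As an illustration only, the sketch below assumes an OpenAI-style chat endpoint, a shape many model APIs mimic; the URL, model name, and key are placeholders, so consult the CompactifAI portal's documentation for the real values:

```python
import json
import urllib.request

# Hypothetical values -- purely illustrative, not CompactifAI's real endpoint.
API_URL = "https://api.example.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

def build_request(prompt: str, model: str = "compressed-model") -> urllib.request.Request:
    """Build a chat-completion POST request for an assumed OpenAI-style API."""
    payload = {
        "model": model,  # placeholder compressed-model identifier
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# Usage (requires a live endpoint):
#   with urllib.request.urlopen(build_request("Hello")) as resp:
#       print(resp.read())
```

Self-serve access of this kind is what lets teams skip a marketplace sign-up flow: a key and an HTTP client are enough to start testing a compressed model in production code.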
The Spanish startup introduced its AI models API a day after Manus launched ‘My Computer’, a desktop app that integrates its AI agent into personal laptops.
Meta acquired Manus last year as part of a strategy to accelerate enterprise-grade AI. With the new AI tool, Meta and Manus’ agents edge closer to OpenClaw, the open-source AI assistant that runs directly on users’ local devices.
What’s Possible With Locally Run AI Models
For professionals in critical sectors, locally run models that do not depend on cloud connectivity can offer privacy and keep operations running. They can enable AI deployment in drones, satellites, and other environments where reliable connectivity cannot be assured.
However, shrinking AI models to run effectively on mobile devices without sacrificing performance remains a challenge for tech companies. Apple addressed it with Apple Intelligence, which blends on-device processing with cloud-based support.
With CompactifAI, Multiverse aims to demonstrate that local models like Gilda can offer benefits that extend beyond cost efficiency.
