Tiny AI Models Set to Revolutionize IoT Devices
What is the Viqus Verdict?
We evaluate each news story based on its real impact versus its media hype to offer a clear and objective perspective.
AI Analysis:
While the hype surrounding AI is currently high, this particular development – a genuinely effective, small-model compression technique – represents a more tangible and practical advancement with a significant long-term impact on AI’s accessibility and deployment.
Article Summary
Multiverse Computing, a European AI startup, has unveiled two remarkably small AI models, ‘ChickBrain’ and ‘SuperFly,’ that are poised to transform the landscape of embedded AI. These models, designed for incorporation into Internet of Things (IoT) devices, can operate offline and with minimal processing power. ‘ChickBrain,’ a 3.2-billion-parameter model, is a compressed version of Meta’s Llama 3.1, while ‘SuperFly’ is a 94-million-parameter model based on Hugging Face’s SmolLM2-135M. Crucially, both models outperform their original counterparts on key benchmarks, including MMLU-Pro, Math 500, GSM8K, and GPQA Diamond. Multiverse’s innovation lies in its ‘CompactifAI’ technology, a quantum-inspired compression algorithm that maintains performance while drastically reducing model size. The company is already in discussions with major device manufacturers like Apple, Samsung, and Sony, envisioning use cases ranging from voice-controlled appliances to troubleshooting assistance on smart gadgets. The startup’s approach, coupled with an accessible API hosted on AWS, signals a potential shift towards democratizing AI deployment, particularly in resource-constrained environments.
Key Points
- Multiverse Computing has developed two extremely small AI models, ‘ChickBrain’ and ‘SuperFly’, designed for IoT devices.
- Their ‘CompactifAI’ technology compresses existing AI models without sacrificing performance.
- These models outperform their original versions on key benchmarks, even when deployed on low-powered devices.
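Multiverse has not published CompactifAI’s internals beyond describing it as quantum-inspired (tensor-network methods factor large weight matrices into products of much smaller ones). As a loose, generic illustration of the underlying idea of low-rank weight compression — not Multiverse’s actual algorithm — here is a truncated-SVD sketch in NumPy showing how a single dense layer’s parameter count can shrink several-fold:

```python
import numpy as np

# Stand-in for one dense weight matrix of a larger model (illustrative only).
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512))

# Factor W and keep only the top-r singular components.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 64
U_r, s_r, Vt_r = U[:, :r], s[:r], Vt[:r, :]

# The compressed layer stores the two thin factors instead of full W.
W_approx = U_r @ np.diag(s_r) @ Vt_r

original_params = W.size                         # 512 * 512 = 262144
compressed_params = U_r.size + s_r.size + Vt_r.size  # 65600
compression_ratio = original_params / compressed_params  # ~4x smaller
```

Real compression pipelines combine factorizations like this with quantization and retraining to recover accuracy; the claim that the compressed models actually *outperform* the originals on benchmarks is what makes the CompactifAI result notable.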

