
Startup Targets Data Center Power Waste

Tags: Artificial Intelligence · Data Centers · GPU · Niv-AI · Power Management · Deep Learning · Data Center Optimization
March 17, 2026
Source: TechCrunch AI
Viqus Verdict: 6
Operational Efficiency, Not Revolution
Media Hype 4/10
Real Impact 6/10

Article Summary

Niv-AI is tackling a critical and increasingly urgent challenge: the massive power consumption of AI data centers. As demand for compute soars with advances in generative and large language models, data centers are struggling to manage their draw on the electrical grid. The core problem isn't just the sheer volume of power consumed, but the unpredictable surges that occur as GPUs switch between computation tasks and communicate with one another. These surges force data centers to throttle GPU usage or pay for temporary energy storage, reducing the return on expensive chips.

The startup's approach centers on granular data collection using rack-level sensors, providing millisecond-scale insight into GPU power usage. That data feeds an AI model designed to predict and synchronize power loads across the data center, in effect a 'copilot' for data center engineers.

The $12 million seed round, backed by a strong group of investors, provides the capital to deploy the technology in a handful of US data centers over the next 6-8 months. The timing is apt: hyperscalers face land-use constraints and supply chain bottlenecks in building new data centers, and Niv-AI's solution offers a tangible path to unlocking existing capacity while establishing more responsible power profiles between data centers and the grid.
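To see why millisecond-level timing matters, here is a toy sketch, not Niv-AI's actual system: every number, the square-wave power model, and the phase-staggering policy are illustrative assumptions. It simulates GPUs whose compute bursts are perfectly aligned versus spread out in time, and shows that spreading the bursts lowers the peak aggregate draw a facility must provision for.

```python
def gpu_power(t_ms, phase_ms, period_ms=100, burst_ms=40,
              idle_w=120.0, burst_w=700.0):
    """Toy per-GPU draw: a periodic compute burst (high power) followed
    by a communication/idle phase (low power). All figures are made up."""
    return burst_w if (t_ms + phase_ms) % period_ms < burst_ms else idle_w

def peak_draw(phases, horizon_ms=1000):
    """Worst-case aggregate power (watts) across all GPUs over the horizon."""
    return max(sum(gpu_power(t, p) for p in phases) for t in range(horizon_ms))

n_gpus = 8
synchronized = [0] * n_gpus                               # every burst aligned
staggered = [i * (100 // n_gpus) for i in range(n_gpus)]  # bursts spread across the period

print(f"synchronized peak: {peak_draw(synchronized):.0f} W")  # 8 x 700 W = 5600 W
print(f"staggered peak:    {peak_draw(staggered):.0f} W")     # at most 4 GPUs burst at once
```

The average energy consumed is identical in both cases; only the timing changes, which is why millisecond-scale visibility into burst phases, rather than coarser metering, is the prerequisite for this kind of load shaping.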

Key Points

  • Niv-AI has secured $12 million in seed funding to develop power management solutions for AI data centers.
  • The company’s technology utilizes millisecond-level sensor data to understand and predict GPU power usage.
  • Their AI ‘copilot’ aims to optimize GPU utilization and synchronize power loads, reducing strain on the electrical grid.

Why It Matters

The issue of data center power consumption is rapidly becoming a major constraint on AI development and deployment. As AI models grow ever more complex and demanding, the pressure on the electrical grid will only intensify. Niv-AI's approach isn’t merely a technological tweak; it represents a fundamental shift in how data centers manage their resources, potentially mitigating future bottlenecks and contributing to a more sustainable AI ecosystem. A failure to address this growing disparity between compute demand and grid capacity will significantly hinder the pace of innovation in the field.
