AI Power Management Startup Niv-AI Emerges to Tackle GPU Energy Waste
A new company, Niv-AI, has launched with $12 million in seed funding to address a critical bottleneck in the booming AI industry: inefficient power usage in data centers. The core problem is that advanced GPUs, essential for training and running AI models, create unpredictable surges in electricity demand, forcing operators to either cap performance or pay extra for energy storage. Either way, the result is wasted investment and slower AI development.

The Power Problem with AI

As Nvidia CEO Jensen Huang bluntly stated, every unused watt of power in an AI factory represents lost revenue. The issue stems from the millisecond-scale power fluctuations inherent in modern GPU workloads. When thousands of GPUs rapidly switch between tasks, they create spikes that overwhelm existing power management systems. Data centers react by either throttling GPU usage (reducing performance) or overpaying for backup energy reserves.

The scale of waste is significant. Industry experts estimate that up to 30% of potential GPU capacity is left untapped due to power constraints. This isn’t just an economic issue; it also hinders the progress of AI research and deployment.

Niv-AI’s Approach: Real-Time Monitoring and AI-Driven Prediction

Founded in Tel Aviv, Niv-AI is taking a two-pronged approach:

  1. Precise Power Measurement: The company is deploying rack-level sensors to monitor GPU power consumption at the millisecond level. This detailed data will reveal the specific power profiles of different deep learning tasks.
  2. AI-Powered Optimization: Niv-AI plans to train an AI model on the collected data to predict and synchronize power loads across entire data centers. The goal is to create a “copilot” for data center engineers, allowing them to maximize GPU usage without overloading the grid.
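Niv-AI has not published how its prediction system works, but the idea of forecasting millisecond-scale power draw can be illustrated with a minimal sketch. The snippet below is purely hypothetical: it simulates a trace of rack-level power readings and uses a simple moving-average forecast to flag moments where the projected draw would exceed a grid-safe cap, the kind of signal an operator (or "copilot") could use to stagger workloads. The function names, window size, and cap are illustrative assumptions, not Niv-AI's actual design.

```python
# Illustrative sketch only -- not Niv-AI's published method.
# Simulates millisecond-level rack power samples and a moving-average
# forecast that flags imminent spikes before the draw exceeds a cap.
from collections import deque

def forecast_next(samples, window=5):
    """Predict the next reading as the mean of the last `window` samples."""
    recent = list(samples)[-window:]
    return sum(recent) / len(recent)

def flag_spikes(readings_kw, cap_kw, window=5):
    """Return indices where the forecast power draw exceeds the cap."""
    history = deque(maxlen=window)
    flagged = []
    for i, kw in enumerate(readings_kw):
        history.append(kw)
        if forecast_next(history, window) > cap_kw:
            flagged.append(i)
    return flagged

# Synthetic trace (kW): steady training load, then a sudden surge.
trace = [40, 42, 41, 43, 90, 95, 92, 44, 43]
print(flag_spikes(trace, cap_kw=60))  # → [5, 6, 7, 8]
```

A real system would replace the moving average with a learned model trained on actual sensor data, and would act on the forecast (throttling, rescheduling, or drawing from storage) rather than merely flagging it, but the control loop has the same shape: measure, predict, intervene before the spike hits the grid.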

Why This Matters Now

The timing is crucial. Hyperscalers (large cloud providers) are facing hurdles in building new data centers due to land-use restrictions and supply chain issues, so squeezing more out of existing infrastructure is more attractive than ever. The current setup forces a trade-off between GPU performance and grid stability, and Niv-AI aims to eliminate that trade-off.

“The grid is actually afraid of the data center consuming too much power at a specific time,” explains Niv-AI CEO Tomer Timor. The company’s vision is to create a missing “intelligence layer” between the data center and the electrical grid.

Niv-AI expects to have a working system in US data centers within 6–8 months. The startup is backed by prominent venture firms including Glilot Capital and Grove Ventures.

The long-term implication is clear: as AI continues to grow, the demand for electricity will only intensify. Companies like Niv-AI are essential for ensuring that this growth is sustainable and efficient.