
AI Is Pushing India’s Data Centres to a Breaking Point on Energy

AI workloads are pushing data centre power demands from 15 kW to over 100 kW per rack, forcing operators to redesign infrastructure or face unsustainable costs. Industry leaders say sustainability is no longer optional.


Key Points

  • AI workloads pushing rack densities from 15 kW to 60-120 kW per rack
  • Facilities efficient two years ago now face unsustainable power costs
  • Industry leaders call this the defining inflection point for data centres

Two years ago, a data centre that consumed 15 kilowatts per rack was considered efficient. Today, artificial intelligence workloads are demanding four to eight times that amount, and the facilities that once represented best practice are now haemorrhaging money on electricity bills. For operators across India, the maths has changed faster than their infrastructure.

Jaideep Roy, Director of Business Development at Vertiv, puts it bluntly: the problem is no longer raw computing power, but the energy waste that comes with it. “Facilities that were efficient two years ago are now facing power bills that threaten profitability and grid constraints that block expansion,” he said. “We see this as the defining inflection point for the entire sector.”


This inflection point raises a question that every data centre operator in India must now answer: can the country’s digital infrastructure grow fast enough to meet AI demand without overwhelming the power grid and the planet? The answer will shape not only the data centre industry but also the cost and availability of every AI-powered service that Indian consumers and businesses increasingly depend upon.

Why AI workloads are breaking legacy infrastructure

The core challenge is physics. Traditional data centres were designed around air cooling, a technology that works well when each rack of servers generates modest heat. AI workloads, particularly those training large language models or running real-time inference, generate far more heat per square metre than conventional computing tasks.

Roy explained that AI-driven workloads are rapidly pushing rack densities beyond the limits of legacy air-cooled designs. The numbers are stark: densities are rising from around 15 kW to as high as 60-120 kW per rack. To put this in perspective, a single high-density AI rack can now consume as much power as 40 average Indian households.
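
As a rough sanity check on that comparison, the arithmetic works out if each household is assumed to draw about 2.5 kW, roughly a typical sanctioned load for an urban Indian home. That per-household figure is an assumption for illustration, not a number from the article:

```python
# Back-of-envelope check on the rack-vs-households comparison.
# Assumption: ~2.5 kW sanctioned load per urban Indian household.
rack_kw = 100.0          # draw of one high-density AI rack (kW)
household_kw = 2.5       # assumed per-household load (kW)

households_equivalent = rack_kw / household_kw
print(households_equivalent)  # 40.0 households per rack
```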


Air cooling systems simply cannot remove heat fast enough at these densities. The result is either throttled performance, as servers slow down to avoid overheating, or enormous energy bills for ever more powerful air conditioning. Neither outcome is sustainable.
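
The gap between air and liquid is easy to quantify with the basic heat-transfer relation Q = mass flow × specific heat × temperature rise. A rough sketch, in which the 10 °C coolant temperature rise is an assumed design figure and the fluid properties are standard textbook values:

```python
# How much air vs. water is needed to carry away 100 kW of heat,
# using Q = mass_flow * specific_heat * delta_T.
heat_kw = 100.0          # heat load of one high-density AI rack (kW)
delta_t = 10.0           # assumed coolant temperature rise (K)

cp_air, rho_air = 1.005, 1.2        # kJ/(kg*K), kg/m^3
cp_water, rho_water = 4.18, 1000.0  # kJ/(kg*K), kg/m^3

air_m3_per_s = heat_kw / (cp_air * delta_t) / rho_air             # ~8.3 m^3/s
water_l_per_s = heat_kw / (cp_water * delta_t) / rho_water * 1000  # ~2.4 L/s

print(f"{air_m3_per_s:.1f} m^3/s of air vs {water_l_per_s:.1f} L/s of water")
```

Moving thousands of cubic metres of air per rack per hour is where the fan and chiller energy goes; a couple of litres of water per second does the same job.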

Liquid cooling and integrated design as the new baseline

The industry’s response is a fundamental redesign of how data centres manage heat. Liquid cooling, which pipes coolant directly to processors, can remove heat far more efficiently than air. Vertiv and other infrastructure providers are positioning this technology not as an upgrade but as a prerequisite for AI-scale operations.

“Our integrated liquid cooling architectures and intelligent power distribution platforms are purpose-built for exactly this reality,” Roy said. “They slash energy waste at the rack level, maintain rock-solid uptime and let customers support dramatically higher compute density without proportional increases in power draw or infrastructure footprint.”


The key phrase is “without proportional increases”. If a data centre can quadruple its computing output while only doubling its energy consumption, the economics shift dramatically. For operators facing both rising AI demand and pressure from regulators and investors on carbon emissions, this efficiency gap is existential.
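
The arithmetic behind that scenario is simple. In this hypothetical sketch the figures are illustrative, not Vertiv's:

```python
# If compute output quadruples while energy only doubles,
# energy per unit of compute is halved.
old_compute, old_energy = 1.0, 1.0   # normalised baseline
new_compute, new_energy = 4.0, 2.0   # 4x output, 2x energy

old_intensity = old_energy / old_compute
new_intensity = new_energy / new_compute
print(new_intensity / old_intensity)  # 0.5 -> half the energy per computation
```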

Sustainability as operational necessity, not marketing

The framing of sustainability is changing. For years, data centre operators treated environmental credentials as a communications exercise, useful for annual reports and investor presentations but secondary to uptime and cost. That hierarchy is collapsing.

Narendra Sen, CEO and founder of RackBank Data Centers, argues that sustainability must now be embedded into every rack, every watt and every design decision. “Digital growth and environmental stewardship are not in conflict; they are complementary,” he said. “The industry must collectively reaffirm its commitment to decarbonising digital infrastructure at scale and building an AI-ready future that the planet can afford.”

Pratap Mane, President and Country Head for India at Colt Data Centre Services, frames this as a question of long-term viability. “The industry’s long-term success will be defined not just by its ability to scale, but by how efficiently and sustainably that scale is delivered,” he said.

Sustainability, in his view, is becoming fundamental to the future of digital infrastructure rather than a complementary objective.

The counterargument: can green infrastructure keep pace with demand?

Not everyone is convinced that efficiency gains can outrun demand growth. The challenge is straightforward: if AI adoption continues to accelerate, even highly efficient data centres will consume vastly more power in absolute terms. A facility that uses half the energy per computation still doubles its total consumption if it handles four times as many computations.
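
That tension can be made concrete with a toy projection. The growth and efficiency rates below are illustrative assumptions, not forecasts:

```python
# Even with strong per-computation efficiency gains, total power
# rises if workload grows faster than efficiency improves.
workload = 1.0      # normalised AI computations per year
intensity = 1.0     # normalised energy per computation
totals = []
for year in range(4):
    totals.append(workload * intensity)
    workload *= 4.0      # assumed 4x yearly growth in computations
    intensity *= 0.5     # assumed 2x yearly efficiency improvement
print(totals)  # [1.0, 2.0, 4.0, 8.0] -- absolute energy still doubles yearly
```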

India’s power grid, while expanding, faces its own constraints. Renewable energy capacity is growing but remains insufficient to meet existing demand in many states. Data centres competing for grid access with manufacturing, agriculture and residential users may find that efficiency is necessary but not sufficient.

The industry’s response has been to pursue distributed infrastructure. Rather than concentrating capacity in a few massive facilities, operators are building smaller data centres closer to users in emerging cities. This reduces transmission losses and, in theory, makes it easier to integrate local renewable generation.

Blue Cloud Softech Solutions, for instance, is developing what it calls Edge AI capabilities alongside its Blue Energy platform, designed to enable renewable and distributed energy adoption at scale.

What this means for Indian businesses and consumers

The infrastructure decisions being made today will determine the cost and performance of AI services for years to come. If data centre operators successfully transition to efficient, renewably powered facilities, AI-driven services, from healthcare diagnostics to agricultural forecasting, could become cheaper and more widely available. If they fail, rising energy costs will be passed on to customers, and grid constraints could limit where and how quickly new capacity can be built.

For businesses considering AI adoption, this creates a new due diligence question. The sustainability credentials of cloud and data centre providers are no longer just about corporate responsibility; they are a proxy for operational resilience and long-term cost stability.

Roy’s characterisation of this moment as a defining inflection point is not hyperbole. The choices made now, in cooling technology, power sourcing and infrastructure design, will lock in patterns of energy consumption for decades.

The operators that treat efficiency as foundational rather than optional will likely be the ones still operating profitably when the next generation of AI workloads arrives demanding still more power.

Your Questions, Answered

Why are AI workloads creating problems for data centres?

AI tasks generate far more heat than traditional computing. Rack power densities are rising from 15 kW to 60-120 kW, overwhelming air cooling systems designed for lower loads.

What is liquid cooling and why does it matter?

Liquid cooling pipes coolant directly to processors, removing heat more efficiently than air. It allows data centres to handle AI workloads without proportional increases in energy consumption.

How does data centre energy use affect Indian consumers?

Rising data centre costs are passed on to cloud and AI service users. Efficient, sustainably powered facilities should keep AI services affordable and widely available.

Can renewable energy meet growing data centre demand in India?

Renewable capacity is growing but faces constraints. The industry is pursuing distributed infrastructure and local generation to reduce reliance on the central grid.

Mohd Ujaley
Mohd Ujaley is a journalist specialising in the intersection of technology with government, public sector, defence and large enterprises. As Editorial Director at Tech Observer Magazine, he leads editorial strategy, moderates industry discussions and engages with key stakeholders to shape conversations around technology, policy and digital transformation. With over 15 years of experience, Ujaley has held editorial roles at prestigious publications including The Economic Times, ETGovernment, Indian Express Group, Financial Express, Express Computer and CRN India. He holds a Bachelor’s degree in Business Economics, a Master’s in Mass Communication from Guru Gobind Singh Indraprastha University (GGSIPU), a Parliamentary Fellowship from The Institute of Constitutional and Parliamentary Studies and a Certificate in Public Policy from St. Stephen’s College, Delhi.