
Article by Farokh Ghadially, Vice President IT and Data Centres, Schneider Electric
Artificial intelligence has entered a new phase of maturity. After years of experimentation, AI is now being deployed at scale, moving beyond centralised systems and into enterprise environments and edge infrastructure.
As AI scales, data centres are being pushed beyond their original design assumptions. Supporting higher density and more variable workloads requires an integrated approach to power, cooling and digital systems to move AI from proof of concept into everyday operation.
From experimentation to enterprise impact
Across Australia, this is no longer theoretical. The shift is already driving strong momentum, from hyperscale expansion to growing interest in sovereign AI as organisations seek greater control over their data.
As AI moves from concept to deployment, its requirements become tangible. High-density AI workloads demand significantly more power and more advanced cooling, making infrastructure a strategic consideration rather than a background function.
Power as an enabler of AI growth
AI is fundamentally changing how energy is consumed inside data centres. Global data centre capacity is projected to reach hundreds of gigawatts this decade, with AI workloads accounting for a growing share of that demand. This surge is accelerating the move toward more intelligent, digitally enabled energy systems.
As AI workloads scale and become more variable, power can no longer be treated as a static constraint. The ability to manage power and cooling dynamically is becoming central to supporting resilience, efficiency and large-scale AI deployment.
At the same time, access to power is becoming one of the most significant constraints on AI expansion. Grid connection is increasingly complex, particularly for large-scale AI facilities with volatile, high-density loads. As a result, data centre operators are beginning to rethink where and how they build, including closer alignment with renewable energy zones and alternative energy sources.
Cooling becomes a catalyst for innovation
Cooling is fast becoming one of the defining challenges, and opportunities, of AI infrastructure. As the industry pushes beyond rack densities of 140kW, with future designs reaching 1MW and beyond, traditional cooling methods alone are no longer sufficient, driving a shift toward hybrid environments that combine air and liquid cooling. As AI chips become hotter and more compact, liquid cooling is essential to maintaining uptime and efficiency.
With cooling accounting for up to 40 percent of a data centre’s power budget, efficiency is becoming essential. Direct liquid cooling provides a step change in performance, delivering heat removal up to 3,000 times more effective than air cooling. However, deploying it successfully requires a true end-to-end approach, from design and installation through to long-term operation.
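To make the scale of that power budget concrete, a back-of-the-envelope calculation shows what a 40 percent cooling share implies. This is illustrative arithmetic only; the facility size and figures below are hypothetical, not drawn from any specific data centre.

```python
# Illustrative only: hypothetical figures, not measurements from any facility.

def cooling_power_kw(it_load_kw: float, cooling_fraction: float) -> float:
    """Cooling power implied when cooling takes a given share of total power.

    Simplifying assumption: total facility power = IT load + cooling, so
        total = it_load / (1 - cooling_fraction)
        cooling = total * cooling_fraction
    """
    total_kw = it_load_kw / (1.0 - cooling_fraction)
    return total_kw * cooling_fraction

# A hypothetical 10 MW IT load with cooling at 40% of the power budget:
print(round(cooling_power_kw(10_000, 0.40)), "kW of cooling overhead")
```

Under these simplified assumptions, a 10 MW IT load carries roughly 6.7 MW of cooling overhead, which is why even modest efficiency gains from direct liquid cooling translate into substantial savings.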
Smarter grids through digital intelligence
AI workloads behave very differently from traditional computing. Demand can shift rapidly, creating volatility that requires a more dynamic relationship between data centres and the grid.
Digital twins are emerging as a critical energy technology capability, allowing operators and utilities to model how infrastructure will behave before it is built. By simulating different scenarios, including fluctuating AI loads, digital twins help accelerate approvals, reduce risk and build confidence across the energy ecosystem.
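The kind of scenario modelling described above can be sketched in miniature. The toy simulation below (not Schneider Electric's digital twin tooling, and with entirely illustrative parameters) models a fluctuating AI load as a bounded random walk and estimates how often it would exceed a fixed grid-connection limit.

```python
# Toy sketch only: all parameters (facility size, swing, grid limit) are
# hypothetical, chosen to illustrate the idea of scenario simulation.
import random

def exceedance_fraction(base_mw: float, swing_mw: float, limit_mw: float,
                        steps: int = 10_000, seed: int = 42) -> float:
    """Fraction of time steps where a simulated load exceeds the grid limit."""
    rng = random.Random(seed)
    load = base_mw
    over = 0
    for _ in range(steps):
        # AI training loads can ramp sharply; model this as a bounded random walk.
        load += rng.uniform(-swing_mw, swing_mw)
        load = max(0.0, min(load, base_mw + 5 * swing_mw))
        if load > limit_mw:
            over += 1
    return over / steps

# Hypothetical 60 MW facility with ±5 MW swings against a 70 MW grid limit:
print(f"{exceedance_fraction(60.0, 5.0, 70.0):.1%} of time steps over the limit")
```

Real digital twins model far more than a single load trace, but the principle is the same: quantify risk under fluctuating scenarios before committing to a build.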
Building the workforce that will power AI at scale
The rapid expansion of AI infrastructure is also driving strong demand for skills in power, cooling and digital systems. Yet Australia faces a growing shortage of technicians and specialists capable of designing, building and operating AI-ready data centres. Closing this gap will require sustained investment in workforce training and partner enablement so that capability keeps pace with demand.
As AI moves closer to the edge, partners must also be equipped to design and deploy integrated energy and digital architectures. Collaborative enablement programs with server manufacturers and ecosystem players are critical to turning AI proofs of concept into production at scale, alongside prefabricated, outcome-based architectures that reduce build time and on-site complexity while improving consistency and quality.
A defining year for Australia’s AI future
2026 will be a defining year for AI in Australia, as ambition meets the realities of power, infrastructure and sustainability. Power availability, cooling capability, grid access and workforce capacity will increasingly determine where and how AI infrastructure is built. An integrated approach to energy and digital systems will be essential to supporting sustained growth.
The future of AI will not be shaped by software or hardware alone. It will be shaped by partners who can design, implement and digitalise energy systems end to end, helping organisations scale AI with confidence, efficiency and control.
