Connectivity, climate, political stability and proximity to innovation hubs are key to success.
Generative AI has triggered a boom in data centres and AI-related industries. Nvidia, the first chipmaker to reach a $1 trillion valuation, has revealed that Q2 data centre revenue is up 171% year on year amid surging demand for its AI chips. 2023 is also shaping up to be a record year for data centres, with JLL’s latest report revealing that leasing demand has jumped 65% from H1 2022.
Meanwhile, the high density and compute power needed to train AI models are driving a fundamental change in the very design of data centres. Everything from size and form to geographic location is under scrutiny as developers and data centre operators scramble to keep up with demand.
It's important to note that AI training can be performed remotely, and many data centres are designed to operate efficiently regardless of location. However, choosing locations strategically when developing new AI ‘GigaSites’ can significantly enhance data centre operations and ESG performance.
Here are some critical success factors to consider when determining the best location for AI data centres:
Data centres used for training AI models are less latency-sensitive, so proximity to the end user is not a key consideration. However, these GigaSites can reach in excess of 100kW per rack compared with the average 10kW, so access to available power infrastructure, ideally from renewable, green sources, is critical. As a result, for AI training data centres JLL expects to see a ‘bring the data centre to the power’ rather than ‘bring power to the data centre’ approach.
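To put those rack densities in context, here is a minimal back-of-envelope sketch of the facility power they imply. The rack counts and PUE (power usage effectiveness) values are illustrative assumptions, not figures from JLL; only the 100kW and 10kW per-rack densities come from the text above.

```python
# Illustrative sketch: facility power implied by the rack densities cited
# above (100 kW per AI training rack vs ~10 kW average). Rack counts and
# PUE values are assumptions for illustration only.

def facility_power_mw(racks: int, kw_per_rack: float, pue: float) -> float:
    """Total facility power in MW = IT load x PUE."""
    it_load_kw = racks * kw_per_rack
    return it_load_kw * pue / 1000.0

# Hypothetical 500-rack hall at conventional vs AI training densities.
conventional = facility_power_mw(racks=500, kw_per_rack=10, pue=1.5)
ai_training = facility_power_mw(racks=500, kw_per_rack=100, pue=1.3)

print(f"Conventional hall: ~{conventional:.0f} MW")  # ~8 MW
print(f"AI training hall:  ~{ai_training:.0f} MW")   # ~65 MW
```

Even under these rough assumptions, the same footprint jumps from a single-digit-megawatt grid connection to one measured in tens of megawatts, which is why siting next to available power becomes the deciding factor.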
High-speed, reliable network connectivity is vital for transferring the large volumes of data used in AI model training. Access to robust power grids, dense fibre-optic networks and cloud-region internet exchange points will ensure fast and stable data transmission.
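As a rough illustration of why link capacity matters, the sketch below estimates how long it would take to move a large training dataset into a site at different line rates. The dataset size, link speeds and utilisation factor are assumptions chosen for illustration.

```python
# Back-of-envelope: time to move a training dataset over links of
# different speeds. Dataset size and link rates are assumptions.

def transfer_hours(dataset_tb: float, link_gbps: float, utilisation: float = 0.7) -> float:
    """Hours to transfer dataset_tb terabytes over a link_gbps link."""
    bits = dataset_tb * 1e12 * 8                 # terabytes -> bits
    effective_bps = link_gbps * 1e9 * utilisation
    return bits / effective_bps / 3600

for gbps in (10, 100, 400):
    print(f"{gbps:>4} Gbps: {transfer_hours(500, gbps):6.1f} hours for 500 TB")
```

A hypothetical 500TB dataset takes roughly a week over a 10Gbps link but well under a day at 400Gbps, which is why dense fibre and exchange-point access feature so prominently in site selection.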
In an AI training environment, thousands of cores run complex calculations simultaneously, requiring data centres to be built much larger than before. Anticipating future growth and scalability requirements helps avoid major disruptions, so ensure there is sufficient space and potential for expansion.
AI training servers themselves need to be close together, often linked via InfiniBand rather than Ethernet. This creates a very dense (and more efficient) environment, but in turn makes the servers harder to cool. Rear-door, liquid or immersion cooling technologies are required to manage the heat produced, and selecting locations with cooler climates helps reduce ambient temperatures and the additional pressure placed on cooling systems.
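The sketch below shows, under standard physical approximations, why air cooling alone struggles at these densities: the airflow needed to remove heat scales directly with rack power. The temperature difference and rack powers mirror the figures above; everything else is an illustrative assumption.

```python
# Rough sketch of air-cooling requirements at different rack densities.
# Uses air's volumetric heat capacity of ~1.21 kJ per cubic metre per
# degree C; rack powers mirror the 10 kW vs 100 kW figures cited above.

def airflow_cfm(rack_kw: float, delta_t_c: float = 15.0) -> float:
    """Approximate airflow (CFM) needed to remove rack_kw of heat at a
    given supply/return temperature difference."""
    m3_per_s = rack_kw / (1.21 * delta_t_c)   # required airflow in m^3/s
    return m3_per_s * 2118.88                 # m^3/s -> cubic feet per minute

print(f"10 kW rack:  ~{airflow_cfm(10):,.0f} CFM of air")
print(f"100 kW rack: ~{airflow_cfm(100):,.0f} CFM of air")
```

Moving on the order of ten thousand cubic feet of air per minute through a single rack is impractical, which is why direct liquid and immersion cooling, helped by cooler ambient climates, become the default at GigaSite densities.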
To improve sustainability and reduce environmental impact, these AI training data centres should, where possible, be located where renewable energy is readily available and land and water scarcity are not an issue.
Stability of the political and legal landscape is crucial for long-term operations and investment security. AI training requires access to large datasets, so consider local data sovereignty rules and favour countries where the free movement of data is not an issue.
The AI regulatory environment is still emerging, with the EU, for example, currently developing the world's first comprehensive AI Act. This governance is vital to ensure that AI models are trained, developed and deployed in an ethical and transparent fashion, while also fostering innovation, competition and investment in further AI research and development.
Choosing a location with a stable political climate, transparent legal frameworks, and a supportive business environment should help to mitigate potential risks and uncertainties as this landscape evolves.
Countries with a thriving research and innovation ecosystem, including universities, research institutions, and industry collaborations, will attract AI data centres. These ecosystems foster knowledge exchange, promote cutting-edge research and enable collaboration between academia and industry.
Proximity to potential customers, strategic partners and industry clusters allows for easier collaboration, reduced latency in data transfer and better customer support. It also ensures access to relevant talent, from data centre maintenance engineers to AI researchers.
What’s more, countries with a supportive investment environment and a growing market for AI technologies provide access to potential investor partnerships and funding sources, fuelling the establishment and growth of AI data centres. Take France, where President Macron has announced a €500m investment in AI, or South Korea, which is aiming to become one of the world’s top three AI powerhouses by investing in AI chips and semiconductors.
In conclusion, as generative AI technology adapts and develops, more use cases will emerge, including some we cannot yet envisage. Together with increasing ESG pressure, this means cloud providers, investors, developers and operators will take an even more strategic geographic and environmental approach to data centre site selection.