With all the new AI tech arriving in the new AI data centres – what is happening to the old tech it is presumably replacing?

AI - dirty little secret or clean?

🧠 What’s Happening to the Old Tech?

Shadow in the cloud

🔄 Repurposing and Retrofitting

  • Many traditional CPU-centric server farms are being retrofitted to support GPU-heavy or heterogeneous architectures.
  • Some legacy racks are adapted for edge computing, non-AI workloads, or low-latency services that don’t require massive AI computing power.

🧹 Decommissioning and Disposal

  • Obsolete hardware—especially older CPUs and low-density racks—is being decommissioned.
  • Disposal is a growing concern: e-waste regulations are tightening, and sustainability targets mean companies must recycle or repurpose responsibly.

🏭 Secondary Markets and Resale

  • Some older servers are sold into secondary markets—used by smaller firms, educational institutions, or regions with less AI demand.
  • There’s also a niche for refurbished hardware, especially in countries where AI infrastructure is still nascent.

🧊 Cold Storage and Archival Use

  • Legacy systems are sometimes shifted to cold storage roles—archiving data that doesn’t require real-time access.
  • These setups are less power-intensive and can extend the life of older tech without compromising performance.

⚠️ Obsolescence Risk

  • The pace of AI innovation is so fast that even new data centres risk early obsolescence if they’re not designed with future workloads in mind.
  • Rack densities are climbing – from 36 kW to 80 kW and beyond – and cooling systems are shifting from air to liquid, meaning older infrastructure simply can’t keep up (a rough cooling sketch follows below).
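
To see why air cooling runs out of road at these densities, here is a minimal back-of-the-envelope sketch in Python. The 36 kW and 80 kW figures come from the bullet above; the air properties and the 15 °C temperature rise across the rack are illustrative assumptions, not figures from any operator.

```python
# Back-of-the-envelope: airflow needed to air-cool a rack at a given power draw.
# Assumptions (illustrative, not from the article): air density 1.2 kg/m^3,
# specific heat 1005 J/(kg*K), and a 15 degC inlet-to-outlet temperature rise.

AIR_DENSITY = 1.2         # kg/m^3
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)
DELTA_T = 15.0            # K, assumed temperature rise across the rack

def airflow_m3_per_s(rack_power_kw: float) -> float:
    """Volumetric airflow (m^3/s) needed to carry away rack_power_kw of heat."""
    watts = rack_power_kw * 1000
    return watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * DELTA_T)

for kw in (36, 80):
    flow = airflow_m3_per_s(kw)
    print(f"{kw} kW rack -> ~{flow:.1f} m^3/s of air ({flow * 3600:,.0f} m^3/h)")
```

Pushing four-plus cubic metres of air per second through a single rack is simply not practical, which is why direct-to-chip liquid and immersion cooling take over at these densities.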

🧭 A Symbolic Shift

This isn’t just about servers—it’s about sovereignty, sustainability, and the philosophy of obsolescence. The old tech isn’t just being replaced; it’s being relegated, repurposed, or ritually retired.

There’s a tech history lesson unfolding about digital mortality, and how each new AI cluster buries a generation of silicon ancestors.

Infographic: ‘New’ AI tech replacing ‘Old’ tech in data centres

🌍 The Green Cost of the AI Boom

⚡ Energy Consumption

  • AI data centres are power-hungry beasts. Data centres as a whole consumed around 2% of global electricity in 2023 – a figure expected to rise by roughly 80% by 2026 as AI workloads grow.
  • Nvidia’s H100 GPUs, widely used for AI workloads, draw 700 watts each. With millions deployed, the cumulative demand is staggering.
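
To get a feel for how that 700-watt figure compounds at fleet scale, here is a minimal sketch, assuming a hypothetical fleet of one million GPUs running at 60% average utilisation – both assumptions for illustration, not figures from this article.

```python
# Rough estimate of the continuous power and annual energy of a large H100 fleet.
# The 700 W per GPU comes from the article; the fleet size and utilisation
# below are illustrative assumptions.

GPU_POWER_W = 700        # per H100, from the article
FLEET_SIZE = 1_000_000   # assumed number of deployed GPUs
UTILISATION = 0.6        # assumed average utilisation
HOURS_PER_YEAR = 8760

fleet_power_mw = GPU_POWER_W * FLEET_SIZE * UTILISATION / 1e6
annual_energy_twh = fleet_power_mw * HOURS_PER_YEAR / 1e6

print(f"Continuous draw: ~{fleet_power_mw:,.0f} MW")
print(f"Annual energy:   ~{annual_energy_twh:.1f} TWh (GPUs alone, before cooling)")
```

And that is the GPUs alone – cooling, networking and power-conversion overheads multiply the total further.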

💧 Water Usage

  • Cooling these high-density clusters often requires millions of litres of water annually. In drought-prone regions, this is sparking local backlash.

🧱 Material Extraction

  • AI infrastructure depends on critical minerals—lithium, cobalt, rare earths—often mined in ecologically fragile zones.
  • These supply chains are tied to geopolitical tensions and labour exploitation, especially in the Global South.

🗑️ E-Waste and Obsolescence

  • As new AI chips replace older hardware, legacy servers are decommissioned—but not always responsibly.
  • Without strict recycling protocols, this leads to mountains of e-waste, much of which ends up in landfills or exported to countries with lax regulations.

The Cloud Has a Shadow

This isn’t just about silicon—it’s about digital colonialism, resource extraction, and the invisible costs of intelligence. AI may promise smarter sustainability, but its infrastructure is anything but green unless radically reimagined.

⚡ The Energy Cost of Intelligence

🔋 Surging Power Demand

  • AI is projected to drive a 165% increase in data centre power demand by 2030, compared to 2023 levels.
  • In the U.S. alone, data centres could account for 11–12% of total power demand by 2030—up from 3–4% today.
  • A single hyperscale facility can draw 100 megawatts or more – over a year, roughly the electricity needed to charge 350,000–400,000 electric vehicles.
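
A quick sanity check of that comparison, as a minimal sketch assuming a typical EV uses roughly 2,500 kWh a year – an illustrative figure, not one taken from the sources above.

```python
# Annual energy of a 100 MW facility versus charging a fleet of EVs.
# The 100 MW figure comes from the bullet above; ~2,500 kWh/year per EV is an
# illustrative assumption (roughly 12,000 km at ~0.2 kWh/km).

FACILITY_MW = 100
HOURS_PER_YEAR = 8760
EV_KWH_PER_YEAR = 2500  # assumed annual charging energy per EV

facility_mwh = FACILITY_MW * HOURS_PER_YEAR            # MWh per year, run flat out
evs_equivalent = facility_mwh * 1000 / EV_KWH_PER_YEAR # MWh -> kWh, then per EV

print(f"{FACILITY_MW} MW facility: ~{facility_mwh:,.0f} MWh/year")
print(f"Equivalent to charging ~{evs_equivalent:,.0f} EVs for a year")
```

Run flat out, a 100 MW facility lands right in the 350,000–400,000 EV range quoted above.
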
AI and Energy supply

🧠 Why AI Is So Power-Hungry

  • Training large models like OpenAI’s ChatGPT or DeepSeek requires massive parallel processing, often using thousands of GPUs.
  • Each AI query can consume 10× the energy of a Google search, according to the International Energy Agency.
  • Power density is rising—from 162 kW per square foot today to 176 kW by 2027, meaning more heat, more cooling, and more infrastructure.
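
To put that ‘10×’ figure in context, here is a minimal sketch using commonly cited per-query estimates – roughly 0.3 Wh for a web search and about 3 Wh for an AI query – with an assumed daily query volume purely for illustration.

```python
# What "10x the energy of a Google search" means at scale.
# The per-query figures are commonly cited estimates (assumptions here),
# chosen to match the 10x ratio quoted in the article; the query volume
# is likewise an assumption.

SEARCH_WH = 0.3                  # assumed Wh per web search
AI_QUERY_WH = 3.0                # assumed Wh per AI query (10x)
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume

def daily_gwh(per_query_wh: float, queries: int) -> float:
    """Total daily energy in GWh for a given per-query energy in Wh."""
    return per_query_wh * queries / 1e9

search_gwh = daily_gwh(SEARCH_WH, QUERIES_PER_DAY)
ai_gwh = daily_gwh(AI_QUERY_WH, QUERIES_PER_DAY)
print(f"1bn searches/day:   ~{search_gwh:.1f} GWh/day")
print(f"1bn AI queries/day: ~{ai_gwh:.1f} GWh/day ({ai_gwh / search_gwh:.0f}x)")
```

At a billion queries a day, the difference is measured in gigawatt-hours – every single day.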

🌍 Environmental Fallout

  • Cooling systems often rely on millions of litres of water annually, and the power draw can dwarf local generation. In Wisconsin, for example, two planned AI data centres are expected to consume 3.9 gigawatts of power – more than the state’s nuclear plant produces.
  • Without renewable energy sources, this surge risks locking regions into fossil fuel dependency, raising emissions and household energy costs. We are not ready for this massive increase in AI energy demand.

Just how clean is green?

The Intelligence Tax

This isn’t just about tech—it’s about who pays for progress. AI promises smarter cities, medicine, and governance, but its infrastructure demands a hidden tax: on grids, ecosystems, and communities.

AI is a hungry beast, and it needs feeding. The genie is out of the bottle!

Big tech companies are increasingly adopting nuclear power to meet the high energy demands of their AI data centres

Data centre powered by nuclear reactors

Why?

Elevated Energy Needs

AI systems, particularly generative AI, necessitate substantial computational power, leading to significant energy use. Conventional energy sources might not meet these growing demands.

Environmental Commitments

Numerous tech firms have pledged to lower their carbon emissions. Nuclear power, a low-emission energy source, supports these environmental commitments.

Dependability

Nuclear energy offers a consistent and uninterrupted power supply, essential for data centres that operate around the clock.

Technological Advancements

Progress in nuclear technologies, such as small modular reactors (SMRs), has enhanced the feasibility and appeal of nuclear power for extensive use.

For example, Google has entered into an agreement with Kairos Power for electricity from small modular reactors to bolster its AI operations. In a similar vein, Microsoft has collaborated with Constellation to refurbish an inactive reactor at the Three Mile Island nuclear facility.

These collaborations mark a notable transition in the energy strategies of the tech sector, as they pursue dependable, eco-friendly, and robust power solutions to support their AI initiatives.

UK says data centres are critical infrastructure, ranking them alongside the power grid and the NHS

Critical data centres UK

UK data centres are set to be classified as critical national infrastructure (CNI), aligning them with sectors such as emergency services, finance, healthcare, and utilities.

This classification will ensure they receive additional government support during major incidents like cyber-attacks, IT outages, or severe weather, to reduce disruption.

Data centres, large warehouses filled with extensive computer banks, are the backbone of services like AI applications, data processing, and streaming. Despite facing criticism for their energy and water usage, the new Labour government supports the industry, with Technology Secretary Peter Kyle referring to data centres as ‘the engines of modern life.’

Currently, the UK recognises 13 sectors as critical national infrastructure, a list last revised nine years ago with the addition of space and defence.

The 13 Critical National Infrastructure Sectors

  1. Chemicals
  2. Civil Nuclear
  3. Communications
  4. Defence
  5. Emergency Services
  6. Energy
  7. Finance
  8. Food
  9. Government
  10. Health
  11. Space
  12. Transport
  13. Water

British Technology Minister Peter Kyle announced on Thursday 12th September 2024 that UK data centres will be designated as ‘Critical National Infrastructure’ (CNI). This status, typically reserved for essential national sectors like nuclear power, provides data centre operators with a direct communication channel to the government for threat preparation and response.

Furthermore, the government has expressed support for a proposed ÂŁ3.75 billion data centre by UK company DC01UK in Hertfordshire, England, which is projected to be the largest in Europe upon completion.

Nvidia reports 122% revenue growth

Data centre

Nvidia has announced earnings surpassing Wall Street forecasts and has issued guidance for the current quarter that exceeds expectations.

As the artificial intelligence boom continues, Nvidia remains a major beneficiary. Despite a dip in after-hours trading, the stock has risen approximately 150% this year. The question remains whether Nvidia can sustain this growth trajectory.

Nvidia said it expects about $32.5 billion in current-quarter revenue, versus the $31.7 billion expected by analysts. That would be an increase of about 80% from a year earlier.

Revenue continues to surge, rising 122% on an annual basis during the quarter, following three straight periods of year-on-year growth in excess of 200%.

Nvidia’s data centre business, which encompasses its AI processors, saw a 154% increase in revenue from the previous year, reaching $26.3 billion and representing 88% of the company’s total sales.

However, not all these sales were from AI chips. Nvidia reported that its networking products contributed $3.7 billion in revenue.

The company primarily serves a select group of cloud service providers and consumer internet firms, including Microsoft, Alphabet, Meta, and Tesla. Nvidia’s chips, notably the H100 and H200, are integral to the majority of generative AI applications, like OpenAI’s ChatGPT.

Nvidia also announced a $50 billion stock buyback.

Nvidia shares dropped close to 5% in after-hours trading (29th August 2024).

Company says it can cut data centre energy use by 50% as AI boom places increased strain on power grids

Power hungry data centre

Major technology corporations such as Microsoft, Alphabet, and Meta are channelling billions into data centre infrastructures to bolster generative AI, which is causing a spike in energy demand.

Sustainable Metal Cloud has announced that its immersion cooling technology is 28% less expensive to install compared to other liquid-based cooling methods and can cut energy use by up to 50%.

The surge in artificial intelligence has increased the need for more robust processors and the energy to cool data centres.

This presents an opportunity for Sustainable Metal Cloud, which runs ‘sustainable AI factories’ consisting of HyperCubes located in Singapore and Australia.

These HyperCubes house servers equipped with Nvidia processors immersed in a synthetic oil known as polyalphaolefin, which is more effective at dissipating heat than air. The company claims this technology can reduce energy consumption by as much as 50% when compared to the conventional air-cooling systems found in most data centres.

Additionally, the Singapore-based company states that its immersion cooling technology is 28% cheaper to install than other liquid cooling options. The HyperCubes are modular and can be integrated into any data centre, utilising currently unoccupied space within existing facilities.
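
One way to see where savings of that order can come from is power usage effectiveness (PUE) – total facility power divided by IT power. The sketch below compares an assumed air-cooled PUE with an assumed immersion-cooled PUE; these values are illustrative and are not figures published by Sustainable Metal Cloud.

```python
# Illustrative comparison of annual facility energy for the same IT load
# under air cooling versus immersion cooling, using assumed PUE values.
# PUE = total facility power / IT power; all values here are assumptions.

IT_LOAD_MW = 10        # assumed IT (server) load
PUE_AIR = 1.6          # assumed air-cooled facility
PUE_IMMERSION = 1.1    # assumed immersion-cooled facility
HOURS_PER_YEAR = 8760

def annual_gwh(it_mw: float, pue: float) -> float:
    """Total facility energy per year in GWh for a given IT load and PUE."""
    return it_mw * pue * HOURS_PER_YEAR / 1000

air = annual_gwh(IT_LOAD_MW, PUE_AIR)
immersion = annual_gwh(IT_LOAD_MW, PUE_IMMERSION)
saving = 100 * (air - immersion) / air
print(f"Air cooling:       {air:.1f} GWh/year")
print(f"Immersion cooling: {immersion:.1f} GWh/year ({saving:.0f}% less)")
```

Facility overhead is only part of the story: this PUE-only view yields a saving of around 30%, and the rest of an ‘up to 50%’ claim would have to come from inside the servers themselves, for example because immersed servers no longer need fans.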

What is a Hypercube?

  • Structure: A hypercube topology connects nodes so that the network mirrors the geometry of a hypercube. In a 3-dimensional hypercube (a cube), for example, each node is connected to three other nodes (see the short sketch after this list).
  • Scalability: This structure allows for efficient scaling. As the number of dimensions increases, the number of nodes that can be connected grows exponentially.
  • Fault Tolerance: Hypercube networks are known for their robustness. If one connection fails, there are multiple alternative paths for data to travel, ensuring reliability.
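
As a concrete illustration of the structure and fault tolerance described in the list above, here is a minimal Python sketch of hypercube addressing, in which nodes are numbered in binary and linked whenever their addresses differ in exactly one bit. This is a generic topology sketch, not any particular vendor’s implementation.

```python
# Minimal hypercube topology sketch: in a d-dimensional hypercube, nodes are
# numbered 0 .. 2^d - 1 and two nodes are neighbours when their binary
# addresses differ in exactly one bit (their XOR is a power of two).

def neighbours(node: int, dims: int) -> list[int]:
    """All nodes directly connected to `node` in a dims-dimensional hypercube."""
    return [node ^ (1 << bit) for bit in range(dims)]

def hop_distance(a: int, b: int) -> int:
    """Minimum hops between two nodes = number of differing address bits."""
    return bin(a ^ b).count("1")

DIMS = 3  # a 3-dimensional hypercube: 8 nodes, each with 3 neighbours
for n in range(2 ** DIMS):
    nbrs = ", ".join(f"{m:0{DIMS}b}" for m in neighbours(n, DIMS))
    print(f"node {n:0{DIMS}b} -> neighbours {nbrs}")

# Fault tolerance: nodes whose addresses differ in k bits have k disjoint
# shortest paths between them (the k bit-flips can be applied in any order).
print("hops from 000 to 111:", hop_distance(0b000, 0b111))  # 3
```

Because the network diameter grows only with the number of dimensions (the base-2 logarithm of the node count), and any two nodes are joined by several disjoint paths, the topology delivers the low latency and fault tolerance described below.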

Benefits in data centres

  • High Performance: The multiple pathways in a hypercube network reduce latency and increase data transfer speeds, which is crucial for big tech companies handling vast amounts of data.
  • Efficient Resource Utilisation: The topology allows for better load balancing and resource allocation, optimising the performance of data centres.
  • Flexibility: Hypercube networks can easily adapt to changes in the network, such as adding or removing nodes, without significant reconfiguration.
  • Big Tech Companies: Companies like Google, Amazon, and Microsoft likely use hypercube topologies in their data centres to ensure high performance and reliability.
  • High-Performance Computing (HPC): Hypercube networks are also used in supercomputers and other HPC environments where efficient data transfer is critical.

How frothy is the AI data centre market for investors?

AI market froth?

Nvidia investors have been on a rocket ride to the stars. But recently they have come back down to Earth, and it has become more of a roller coaster ride.

Benefiting significantly from the artificial intelligence surge, Nvidia’s market cap has increased approximately ninefold since late 2022 – a massive market cap gain.

However, after achieving a peak in June 2024 and momentarily claiming the title of the world’s most valuable public company, Nvidia then experienced close to a 30% decline in value over the subsequent seven weeks, resulting in an approximate $800 billion loss in market capitalisation.

Currently, the stock is experiencing a rally, bringing it within approximately 6% of its all-time peak. The chipmaker surpassed the $3 trillion market cap milestone in early June 2024, aligning with Microsoft and Apple. The question remains whether the company can reclaim and sustain that title.

Investors are closely monitoring Nvidia’s forecast for the October quarter, with the company anticipated to report a growth of approximately 75%. Positive guidance would imply that Nvidia’s affluent clients continue to invest heavily in AI development, whereas a lacklustre forecast might suggest that infrastructure investment is becoming excessive.

Should there be any signs of diminishing demand for AI or if a major cloud customer is reducing spending, it could lead to a notable decline in revenue.

Europe wants to place data centres in space and Microsoft wants to place them under the sea

Space data centre

Data centres are expected to account for over 3% of Europe’s electricity demand by 2030

The surge in artificial intelligence (AI) has significantly increased the demand for data centres, essential for the ‘exploding’ tech sector. This has led Europe to consider space-based alternatives for digital storage, aiming to reduce reliance on energy-intensive ground facilities.

The Advanced Space Cloud for European Net zero emission and Data sovereignty (ASCEND) study – a 16-month investigation into the viability of deploying data centres in orbit – has reached a ‘very encouraging’ conclusion, according to the report.

The ASCEND study, coordinated by Thales Alenia Space for the European Commission and valued at 2 million euros ($2.1 million), asserts the technical, economic, and environmental viability of space-based data centres.

“The idea [is] to take off part of the energy demand for data centres and to send them in space in order to benefit from infinite energy, which is solar energy,” according to a spokesperson for ASCEND.

Data centres are crucial for advancing digitalisation; however, they demand substantial electricity and water to operate and cool their servers. Total global electricity consumption by data centres could exceed 1,000 terawatt-hours in 2026 – roughly equivalent to the electricity consumption of Japan, as reported by the International Energy Agency.

The ASCEND study is not alone in exploring the potential of orbital data centres. Microsoft, which has already trialled a subsea data centre positioned 117 feet deep on the seafloor, is collaborating with companies such as Loft Orbital to explore the challenges of running AI and computing in space.

With a 20,000% increase over the past decade – has Nvidia’s stock peaked?

NVIDIA Corporation (NVDA) has experienced remarkable growth over the past decade.

Historical stock price trends

As of 10th May 2024, NVIDIA’s closing stock price stood at: $898.78

NVIDIA’s stock reached an all-time high of $950.02 on 25th March 2024. The 52-week high stands at $974.00, around 8% above the current share price. Conversely, the 52-week low was $280.46, considerably below the current price.

Annual percentage changes

In 2024, the average stock price reached $763.29, marking a year-to-date rise of 79.30%.

In 2023, NVIDIA’s stock price experienced a remarkable surge of 239.02%.

Conversely, in 2022, the stock price witnessed a decline of 50.27%.

Throughout the past decade, the stock has undergone considerable volatility, exhibiting both notable gains and significant losses.

Focus

NVIDIA began as a pioneer in PC graphics and has since expanded its focus to artificial intelligence (AI) solutions. Its GPUs (graphics processing units) are pivotal in AI, high-performance computing (HPC), gaming, and virtual reality (VR) platforms.

The company’s parallel processing capabilities, powered by thousands of computing cores, are vital for executing deep learning algorithms. Additionally, NVIDIA is active in emerging markets such as robotics and autonomous vehicles.

Market position

NVIDIA holds a dominant position in the data centre, professional visualisation, and gaming markets. Its success is bolstered by strategic partnerships with leading cloud service providers and server vendors.

Financial performance

NVIDIA’s revenue and profit have seen substantial growth over time. Its emphasis on AI and new technologies suggests a strong potential for further expansion. In summary, despite NVIDIA’s stock achieving impressive gains, it is still influenced by market trends and technological changes.

Its peak status hinges on multiple elements such as industry movements, competitive landscape, and upcoming innovations. Investors are advised to meticulously assess these factors when determining the stock’s future prospects.

For investors who are in for the long term yet expecting a downturn, it might be prudent to realise some profits now, given the enormous 20,000% surge in stock value.

Take some profit and buy again after a pull-back.