MAD4: Cutting-Edge Technology in the Service of the Data Centre and Sustainability
We visited the latest data centre built by Digital Realty, which stands out for its energy efficiency and for the significant savings it delivers on its clients’ bills.
Digital Realty, formerly known as Interxion, specialises in building data centre facilities, which it typically markets under a colocation model. In essence, customers bring their computing, storage and networking infrastructure into these facilities to gain significantly better availability, cooling and connectivity to the outside world.
Thanks to this specialisation, these facilities are gaining enormous relevance in the hybrid multicloud world: companies can locate their private data centres here and leave the power supply and connectivity in Digital Realty’s hands, freeing them to focus on their business.
In a recent visit to Digital Realty’s MAD4 data centre, Raquel Figueruelo, Director of Marketing, Business Development and Institutional Relations at Digital Realty Spain, explained how the optimised use of energy in these centres not only benefits clients, but also contributes to environmental sustainability.
1 – Introduction
Raquel Figueruelo began by highlighting an impressive fact: “Our PUE index is only 1.2. That customer would consume 2.5 if he were not here.” PUE (Power Usage Effectiveness) is the ratio of a facility’s total energy consumption to the energy that actually reaches the IT equipment, so a PUE of 1.2 indicates an extremely efficient use of energy. Compared with the 2.5 that the same workload would incur elsewhere, every digital process hosted in the data centre saves 1.3 points of PUE.
Cost Reduction and Energy Efficiency
Raquel emphasised that 35% of operating costs are related to electricity consumption, so the company is highly motivated to minimise it without sacrificing operability. “We are the ones most interested in keeping electricity consumption as low as possible,” she said. However, she also made clear that power consumption cannot be cut entirely, since the digital processes must keep running uninterrupted.
One of the key points Raquel raised is the comparison of energy consumption between professional and non-professional data centres. In absolute terms, data centres consume large amounts of energy, with figures as high as 65 to 100 gigawatt hours. In a non-professional environment, however, cooling is far less efficient, resulting in even higher consumption for the same workload. “Those 100 gigawatt hours are preventing another 100 gigawatt hours from being consumed,” she explained.
Digital Realty invests heavily in engineering to keep the PUE as low as possible. Raquel commented: “We spend money to keep this 1.2 ratio as low as possible. I wish it was zero, but that’s impossible.” Eliminating the overhead entirely, which would mean a PUE of exactly 1.0, is a utopia given the cooling and space that servers inherently require.
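To put those figures together, here is a minimal back-of-the-envelope sketch in Python. It uses only the numbers quoted above (a PUE of 1.2, the 2.5 cited for an unoptimised environment, and a total consumption of around 100 GWh); the arithmetic is ours, not Digital Realty’s.

```python
# Back-of-the-envelope comparison based on the figures quoted above.
PUE_MAD4 = 1.2
PUE_UNOPTIMISED = 2.5          # the 2.5 quoted for an equivalent non-professional site
TOTAL_GWH_AT_MAD4 = 100.0      # upper end of the 65-100 GWh range mentioned

it_load_gwh = TOTAL_GWH_AT_MAD4 / PUE_MAD4            # ~83.3 GWh actually reaching the IT equipment
total_if_unoptimised = it_load_gwh * PUE_UNOPTIMISED  # ~208.3 GWh for the same IT load at PUE 2.5
saved_gwh = total_if_unoptimised - TOTAL_GWH_AT_MAD4  # ~108.3 GWh avoided

print(f"IT load served: {it_load_gwh:.1f} GWh")
print(f"Consumption at PUE 2.5: {total_if_unoptimised:.1f} GWh")
print(f"Energy avoided: {saved_gwh:.1f} GWh")  # roughly 'another 100 GWh', as Figueruelo notes
```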
The Balance Between Space and Cooling
Raquel also stressed the importance of striking the right balance between the physical space the servers occupy and the need for cooling. “For that we would put a rack in the middle of a 100 square metre room. And that rack would not need cooling,” she said, explaining that while spreading the equipment out like this would reduce the need for cooling, it is not a practical solution because of space constraints and operational efficiency.
2 – Electromechanics
During the visit to the data centre, Figueruelo introduced us to the electromechanical section, which is essential for powering and cooling the servers. This part of the centre is key to ensuring that the servers operate under optimal conditions and without interruption. Below are the innovations and features that make this data centre unique in its class.
High Voltage Power Supply and Scalability
One of the features that sets this data centre apart is its high-voltage power supply. Unlike enterprise data centres, which operate on low or medium voltage, here power arrives at 45,000 volts via a Union Fenosa substation. It is transformed internally into two electrical feeds, which are in turn divided into four separate electrical systems. Each system feeds a specific room, so a power failure in one room does not affect the others.
Currently, the data centre has one operational floor that includes four rooms. As the centre grows, more floors will be added, and each new room will have its own independent electrical system. The goal is to build 24 Data Processing Centres (DPCs) within the building, each with about 700 square metres. This modular design ensures that each DPC is self-contained and can operate without interference from others.
A high-voltage supply increases upfront costs for generators and transformers, but it brings significant benefits in energy efficiency. By managing its own substation, the data centre minimises power losses, which can reach 8% when power is distributed from external substations. This allows every kilowatt hour consumed to be put to more effective use in digital processes, reducing energy waste.
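As a rough illustration of what that 8% figure means, the sketch below estimates the extra generation a conventionally distributed supply would need for a facility consuming on the order of 100 GWh. The annual-consumption figure and the assumption that the full 8% loss applies to the whole load are ours, not Digital Realty’s.

```python
# Rough estimate of the generation avoided by taking power at high voltage,
# assuming the article's 8% distribution-loss figure applies to the full load.
DISTRIBUTION_LOSS = 0.08
delivered_gwh = 100.0  # hypothetical annual consumption, in line with the 65-100 GWh range

generation_needed_via_grid = delivered_gwh / (1 - DISTRIBUTION_LOSS)
avoided_gwh = generation_needed_via_grid - delivered_gwh
print(f"Extra generation avoided: {avoided_gwh:.1f} GWh per year")  # ~8.7 GWh
```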
Cooling and Energy Efficiency
In addition to power management, energy efficiency extends to the cooling and lighting systems. These are designed to consume as little energy as possible, so that most of the energy goes to the digital workload itself. This strategy is crucial for long-term sustainability and savings.
3 – High Voltage Transformers
Efficiency and stability in the power supply are essential for the operation of modern data centres. In this context, high voltage transformers play a crucial role. On our recent visit to the data centre, we had the opportunity to get a close look at how these transformers are managed and the innovations they are implementing to optimise performance and reduce energy losses.
High-voltage transformers are critical to convert the power received from the grid into a voltage suitable for the data centre’s internal consumption. In this specific case, power is received at 45,000 volts and transformed to 10,000 volts by two transformer centres. This configuration allows the power to be handled efficiently and safely.
One of the most outstanding features of this data centre is its modular construction. As our guide explained, the infrastructure is developed in phases, allowing the best technologies available at any given time to be incorporated. This approach not only improves operational efficiency but also ensures that the facility always runs on the latest equipment.
The data centre has four transformers, two of which act as backup and are located in a separate area. This redundancy is essential to ensure continuity of service in the event of failures or maintenance.
Direct Grid Connection
One of the most significant innovations in this data centre is its direct connection to the high-voltage power grid, avoiding conventional distribution. This approach not only reduces the energy losses associated with distribution but also helps to stabilise the power grid as a whole. Data centres, by having a constant and predictable load, help to balance fluctuations in daily electricity demand, improving the efficiency of the overall power system.
The long-term goal of the data centre is to integrate even more deeply into the grid, becoming a kind of “microgrid” that stabilises load and reduces costs associated with peak demand. This strategy not only benefits the data centre but also has positive implications for the electricity provider and, ultimately, for consumers.
4 – Power Systems and Renewable Energy
Data centres sit at the heart of many companies’ operations today, housing large volumes of information and providing critical services. Efficient energy management is essential to keep them running uninterrupted and sustainably. In this section, we explore how the data centre uses uninterruptible power supplies (UPS) and batteries to ensure a reliable, clean power supply, as well as its commitment to 100% renewable energy.
Power Transformation: UPS Power Filtration
Power enters the data centre at 45,000 volts and is transformed down to a medium voltage of 10,000 volts. It is then split across the electrical systems and finally stepped down to 400 volts, the voltage used by the computing equipment.
Armoured busbars, also known as busbar trunking, play a crucial role in distributing this energy safely. These enclosed structures carry large amounts of power efficiently, without risk of overheating or fire, ensuring the safety and continuity of the supply.
The UPSs act as filters, smoothing out disturbances and ensuring that the power delivered to the equipment is constant and free of fluctuations. This is essential to keep the computer systems stable and avoid interruptions caused by voltage spikes or mains failures.
Battery back-up and medium voltage generators
In the event of a mains failure, the backup batteries take over immediately, supplying power for a short period (between 6 and 10 minutes) until the generators start up. These batteries can sustain the full load of the data centre, ensuring that services are not interrupted.
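To give a sense of scale, the sketch below sizes that 6–10 minute ride-through for a hypothetical 5 MW facility load; the load figure is purely illustrative and not a number provided on the visit.

```python
# Sizing sketch for the 6-10 minute battery ride-through described above.
# The 5 MW facility load is a hypothetical figure chosen only for illustration.
facility_load_mw = 5.0

for minutes in (6, 10):
    energy_mwh = facility_load_mw * minutes / 60
    print(f"{minutes} min at {facility_load_mw} MW -> {energy_mwh:.2f} MWh of usable battery energy")
```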
The generators, designed to operate at 10,000 volts, ensure a smooth and efficient transition when the backup system is activated. These generators send power to the transformer, which then delivers it filtered to customer equipment, maintaining supply stability without users noticing the difference between grid and internally generated power.
One of the key promises of the data centre is the use of 100% renewable energy. They have signed a power purchase agreement (PPA) with ACCIONA, guaranteeing a continuous supply of renewable energy for ten years. In addition, they use a blockchain platform to certify the source and amount of energy used, ensuring transparency and sustainability in their operations.
Monitoring, Drills and Blackouts
The data centre implements advanced monitoring systems, including Apollo software, which enables continuous surveillance of each battery and other critical components. This system facilitates predictive maintenance, identifying potential failures before they occur and enabling proactive interventions to maintain operability.
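Apollo’s internals were not shown in detail, so the following is only an illustrative sketch of the kind of per-battery threshold check a predictive-maintenance system performs; the field names and limits are hypothetical.

```python
# Illustrative sketch of per-battery telemetry checks for predictive maintenance.
# Field names and thresholds are hypothetical, not Apollo's actual parameters.
from dataclasses import dataclass

@dataclass
class BatteryReading:
    battery_id: str
    voltage_v: float
    temperature_c: float
    internal_resistance_mohm: float

def flag_anomalies(readings, v_min=12.2, t_max=35.0, r_max=6.0):
    """Return the batteries whose telemetry falls outside the (hypothetical) limits."""
    return [r for r in readings
            if r.voltage_v < v_min
            or r.temperature_c > t_max
            or r.internal_resistance_mohm > r_max]

readings = [
    BatteryReading("string-A-01", 12.8, 24.5, 4.1),
    BatteryReading("string-A-02", 12.1, 26.0, 4.3),  # low voltage -> candidate for inspection
]
for battery in flag_anomalies(readings):
    print(f"Schedule inspection for {battery.battery_id}")
```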
To ensure the effectiveness of its systems, the data centre conducts annual blackout drills. During these events, the main power supply is cut off and the backup systems are checked to ensure that they are functioning properly. This process ensures that, in the event of a real emergency, the systems will respond effectively and quickly.
5 – Generators and their Importance in Data Centres
In today’s digital world, the infrastructure that supports our data and online services is vital. Data centres play a crucial role in this ecosystem, and among their most important components are the power generators. Raquel Figueruelo, an expert in the field, gave us a detailed insight into how these generators operate, the innovation behind them and their sustainability.
Active Generators and Future Installations
In the data centre visited, there are currently five active generators, each equipped with its own chimney. In addition, a further five generators are being installed, underlining the importance of having robust backup systems. These generators are designed to spring into action in the event of a power outage, ensuring that the data centre can continue to operate without interruption.
The generators use diesel as their primary fuel, but also have the ability to run on biodiesel. This is part of a broader effort to reduce the carbon footprint and move towards greater sustainability. One of the fuel tanks already contains HVO (hydrotreated vegetable oil), a type of biodiesel.
A distinctive feature of these generators is that they generate power at medium voltage. This allows the energy to be transformed and filtered more efficiently before being used by the data centre.
One of the most innovative aspects of these generators is how the heat generated is reused. Excess heat from the data centre rooms is transferred to water through a chiller, producing hot water. This hot water keeps the generators in a state of “hibernation”, warm and idling, making them easier to start when needed. The process not only improves operational efficiency but also helps meet climate-neutrality regulations.
Climate Neutrality Pact regulations
The data centre is committed to the Climate Neutrality Pact, which sets out a number of measures to be met by 2030:
- Green Energy: 100% of the energy used must be from renewable sources.
- Equipment Recycling: All equipment must be properly recycled.
- Energy Efficiency (PUE): The data centre must maintain an optimal energy efficiency ratio. In hot climates such as Madrid, the PUE should not exceed 1.4, although the centre has already designed systems to achieve a PUE of 1.2.
- Reuse of Heat: Commitment to reuse the heat generated, although there is currently no district heating network in Madrid.
In other countries, surplus heat from data centres is used to heat hospitals or swimming pools. In Madrid, however, the lack of a district heating network and the low temperature of the extracted heat (between 28 and 31 °C) limit its external reuse. The data centre therefore reuses this heat internally to keep the generators in their hibernation state, which optimises their performance and reduces long-term operating costs.
While the investment in heat reuse technology and efficient generators is high, the industry’s commitment goes beyond the immediate cost-benefit. The carbon footprint of the centre is remarkably low, with only 252 tonnes of CO2 generated in 2023, which is less than that generated by ten people combined annually.
Operation, Maintenance and the Future of Generators
The generators are not in constant operation. In fact, it is estimated that they operate less than 18 hours per year, including testing and emergency situations. Recently, during maintenance with Union Fenosa, the generators were activated for eight hours to avoid interference at the substation.
The future of data centres includes consolidation of power lines and increased reliance on generators for backup. Centres with capacity of more than 20 megawatts will likely have a single main line and two levels of backup, reflecting an evolution in the design and management of these critical centres.
6 – Data Centre Cooling System
Today, efficient data centre management is crucial to the operation of countless digital services. Figueruelo guided us on a detailed tour in which she broke down the sophisticated cooling system implemented in the data centre. This system not only ensures uninterrupted operation, but also optimises energy and water usage, making the centre a model of sustainability and efficiency.
MAD4 cooling system at Digital Realty
Chillers and the Cooling Process
The cooling system is a fundamental part of the data centre: large machines are responsible for maintaining the right temperature for the equipment to work optimally. These chillers are essential, as they regulate the temperature of the air circulating in the server rooms. The process is meticulous: the returning air passes through machines that generate additional cooling as needed, adjusting their operation to the incoming air temperature.
One of the most remarkable features of this system is its energy efficiency. By cooling the air more effectively, it reduces overall energy consumption and keeps the facility’s Power Usage Effectiveness (PUE) at 1.2, an overhead of just 0.2 above the theoretical minimum of 1.0 and an impressive figure for the industry. This level of efficiency comes from the system’s ability to operate for longer periods at optimal temperatures, using less energy than less advanced systems.
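In practical terms, a PUE of 1.2 splits the energy bill as in the short sketch below; this is straightforward arithmetic from the definition of PUE, not a breakdown supplied by the operator.

```python
# How a PUE of 1.2 splits total consumption between IT load and overheads.
pue = 1.2
it_share = 1 / pue                # fraction of total energy reaching the IT equipment
overhead_share = (pue - 1) / pue  # fraction going to cooling, lighting and other overheads

print(f"IT equipment: {it_share:.1%} of total consumption")   # ~83.3%
print(f"Cooling and other overheads: {overhead_share:.1%}")   # ~16.7%
```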
Capacity and Scalability
The cooling system is designed to scale with the needs of the centre. Currently, each server room has its own cooling system, and space has been left for more units to be added as demand grows. In total, 32 cooling units are planned to be installed to ensure adequate cooling throughout the data centre.
While the centre is not seeking Tier 4 certifications, which require a dual cooling system, its focus is on efficiency and optimal resource management. Tier 4 certification requires duplicate infrastructure, which does not necessarily increase efficiency. Instead, this facility prioritises the responsible use of water and electricity, operating with a closed water loop that minimises consumption and maximises reuse.
Closed Water Circuit and Adaptation to Climatic Conditions
The use of a closed water circuit is a key feature of the cooling system. This circuit allows cold water to flow down into the technical rooms, absorbing the heat generated by the equipment and returning as hot water to be cooled again. This cyclical process is not only energy efficient, but also significantly reduces water consumption.
The system is designed to adapt to varying climatic conditions, including heating the air during winter to maintain adequate humidity and prevent condensation. This ensures that the air supplied to the server rooms is always at the optimum temperature of 18 °C, regardless of the outside temperature.
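The facility’s actual control logic was not disclosed; the sketch below is only a simplified illustration of how an 18 °C supply-air setpoint with winter reheating might be enforced, using a hypothetical dead band.

```python
# Simplified, illustrative control sketch for the 18 °C supply-air setpoint.
# The dead band and the actions are assumptions, not the facility's real algorithm.
SETPOINT_C = 18.0
DEAD_BAND_C = 0.5  # hypothetical tolerance before the system reacts

def supply_air_action(measured_c: float) -> str:
    if measured_c > SETPOINT_C + DEAD_BAND_C:
        return "increase chilled-water flow (more cooling)"
    if measured_c < SETPOINT_C - DEAD_BAND_C:
        return "add heat / reduce cooling (winter humidity control)"
    return "hold current operating point"

for temp in (17.2, 18.1, 19.4):
    print(f"{temp:.1f} C -> {supply_air_action(temp)}")
```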
7 – Energy management in data centre client rooms
Infrastructure management in a data centre is crucial to ensure the continuous and efficient performance of the systems. The guided tour also took us through the client rooms, which are notable for the innovative energy efficiency strategies and cooling technology used.
Power systems and redundancy
The data centre has four power systems: one main and three backup systems. Each server rack has two power feeds, one connected to the main system and one to a backup system. This configuration ensures that even if one of the power systems fails, the servers continue to operate thanks to the redundant connections.
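The value of that dual-corded arrangement can be illustrated with a simple availability calculation; the per-feed availability below is a hypothetical figure, and the calculation assumes the two feeds fail independently, so it is only meant to show the effect of redundancy.

```python
# Illustrative effect of dual power feeds on expected downtime,
# assuming independent failures and a hypothetical per-feed availability.
per_feed_availability = 0.999  # assumed, for illustration only
HOURS_PER_YEAR = 8760

single_feed_downtime_h = (1 - per_feed_availability) * HOURS_PER_YEAR
dual_feed_availability = 1 - (1 - per_feed_availability) ** 2
dual_feed_downtime_h = (1 - dual_feed_availability) * HOURS_PER_YEAR

print(f"Single feed: ~{single_feed_downtime_h:.1f} h of expected downtime per year")
print(f"Dual feed:   ~{dual_feed_downtime_h:.3f} h per year")
```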
The cooling machines are essential to maintain the right temperature in the data centre. The cold air is generated in a closed water circuit and driven under a metre-high false floor, optimising air distribution without requiring a lot of energy. This system ensures greater energy efficiency by reducing the force required to move the air.
The server racks are designed to enclose the cold air in what is known as the “cold aisle”. Here, the servers take in the cool air and expel the hot air into the “hot aisle”. This separation prevents recirculation of hot air and maximises cooling efficiency.
The data centre uses distinctive red air-sampling pipes that continuously analyse the air. If smoke is detected, a gas extinguishing system is triggered automatically to protect the equipment, avoiding the use of water that could damage it.
Customised customer rooms
Customer rooms are designed to be highly customisable. Each customer can install their own equipment according to their needs, and the data centre continuously measures temperature and humidity to adjust cooling accordingly. Trays for data cabling are carefully organised to maintain order and efficiency.
Each rack in the data centre can provide up to 70 kilowatts of power, equivalent to the power consumption of a 25-dwelling building. This high capacity is essential to support advanced applications such as artificial intelligence. If requirements exceed 70 kilowatts per rack, water cooling, which directly cools the chips or servers, is considered.
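The comparison quoted above is easy to make explicit, and the same 70 kW figure drives the cooling decision; the sketch below uses only the numbers given in the text.

```python
# A 70 kW rack compared with an average dwelling, and the cooling threshold
# mentioned above. The per-dwelling figure simply follows from dividing 70 by 25.
RACK_LIMIT_KW = 70
DWELLINGS = 25
print(f"Equivalent draw per dwelling: {RACK_LIMIT_KW / DWELLINGS:.1f} kW")  # 2.8 kW

def cooling_strategy(rack_kw: float) -> str:
    # Above the 70 kW air-cooling threshold, direct liquid cooling of chips
    # or servers is considered, as described in the text.
    return "air cooling (cold aisle)" if rack_kw <= RACK_LIMIT_KW else "direct liquid cooling"

print(cooling_strategy(45))  # air cooling (cold aisle)
print(cooling_strategy(90))  # direct liquid cooling
```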