Relocation: Data center cooling specialist Stuart Kay talks about his move to the US, and how global DC markets do things differently
As an English HVAC data center cooling specialist who has recently relocated to the US to lead the expansion of the AIREDALE by MODINE® data center cooling brand here, I wanted to introduce myself and share some top-line observations I have made since arriving around six months ago. While the Airedale® brand is renowned in the global data center cooling industry, it is perhaps better known in Europe, so I should cover some of what I did with Airedale before moving to the US. I will also say a little about myself, the differences between European and US data center cooling techniques I have noticed since relocating, the global trends, and the future of cooling as it looks right now.
AIREDALE by MODINE
AIREDALE by MODINE is Modine’s data center cooling brand. As an HVAC engineer with years of experience working with Airedale across Europe, I figured my next move had to be to the US, to manage the expansion of the AIREDALE by MODINE brand here. So earlier this year I left the UK and moved to Wisconsin, to expand our existing global offer.
I knew I was coming to something that was already well on its way to success, with impressive production facilities in Mississippi and Virginia, our headquarters in Wisconsin, and state-of-the-art witness test facilities currently in development in Virginia. Modine has a substantial manufacturing footprint that will allow us to grow our presence in data center cooling, right across the world. But of course I knew there would be challenges and perhaps differences, both in how we work on a personal level and in how the industry operates as a whole, so I thought it might be useful to discuss some of these differences and look at what they might mean.
US v UK
I guess it’s fair to say that when I first came to the US some things were a bit of a shock to the system – the weather for one. It was so much colder than I expected it to be! But the lifestyle, especially on weekends, is so much more relaxed. I feel I have been really welcomed here and am enjoying it.
It is an exciting period for AIREDALE by MODINE and I am talking to a lot of different types of customer, including end users, consultant engineers and mechanical engineers, to really understand the market and what they need from us. From these conversations I am seeing some patterns – there are some obvious differences but also some parallels that can be drawn.
I think what I noticed first is that there are some big, dominant players in the US market who take a large share of data center cooling. However, as the industry as a whole is growing, it does have capacity for smaller competitors to come in, so smaller organisations shouldn’t be put off by this.
US Geographical Trends
Looking at different regions within the US and their different ambient profiles, it is clear that there are marked differences between them in terms of the cooling technology deployed. For example, direct and indirect adiabatic cooling are frequently seen down the west coast – pretty much 100% of data center operators there are putting that type of system in, especially in hyperscale settings. Hyperscalers work at higher temperatures than colocation facilities, though, so I can see why they choose to do that. (I will discuss adiabatics as a cooling mechanism in more detail later on!)
There is a lot of opportunity for organisations such as ours on the east coast near Virginia, because there is a lot of investment there in new data centers. The industry is growing, and this expansion has been driven both by the trend in digital consumption and by COVID-19 forcing a lot of more traditional working practices onto the internet. This is happening right across the world; however, the difference between the US and Europe lies in some of the decision drivers. For example, in the US space isn’t such an issue, whereas in Europe space is at a premium and so demands different cooling technologies that allow for that.
Another difference is the attitude toward environmental concerns. It appears that California is more aligned with the EU and pushes hard on sustainability and environmental issues, whereas other areas currently stick with what they have always done, though I expect this will change.
Pace of Production
One factor that is seen across the board is the rate of demand, and the need to deliver quickly. Existing data centers are looking to double their capacity and new data centers are under pressure to be built, quickly. The approach AIREDALE by MODINE is championing to address this is simplifying design to an almost cookie-cutter, modular solution – creating a product or solution that is standardised in advance and can be scaled up or down to suit the project. The key here is that a standardised product reduces the design period, which can sometimes be as long as the build period on some facilities. By having a set inventory and a flexible design with pre-agreed standards, the design period can be substantially reduced, and that is very attractive to clients.
Local Production of a Global Product
Another advantage AIREDALE by MODINE products have is that Modine is one of the largest heat exchanger manufacturers in the world. We have technology in this area that sets us apart from other companies, and we can use it to do things other companies aren’t doing. So by standardising the design, offering up this technology and being able to manufacture and deliver across the globe, we are reducing any barriers to entry. We have three manufacturing bases in Europe and two in the US, and we are looking at other countries too, meaning not only can we build and do witness tests, we can take this standard global product and manufacture it close to where the data center is being built, reducing logistical and lead-time issues. Because the product is standardised, our designs are already proven on a global scale.
Sustainability within the Data Center Environment
The drive towards net zero in the industry, or even net positive in some cases, is becoming somewhat of a prerequisite to even be considered by some of the bigger players. At the moment, this seems more prevalent in Europe than it does in the US, but it is coming, and with the big brands on board, we all need to be ready to deliver not only effective products, but products that meet environmental targets as well as performance objectives.
AIREDALE by MODINE products have for a long time been concerned with efficiency, reducing energy expenditure and maximising what is readily available. As an organisation, we invest a lot of time and money into design and apply Computational Fluid Dynamics (CFD) services to completely understand airflow and temperature distribution, allowing us to design optimised and efficient cooling systems. Our expertise in heat exchange technology allows us to best manage air going through the units and therefore cut down on fan input power, giving us a low airside pressure drop. This, teamed with high-density, high-surface-area coils, works to reduce approach temperatures. Our objective is to bring fluid temperatures as near to the air supply temperature as we can, on both air and fluid sides.
So, for example, if we have a free cooling chiller, we want the ambient operating threshold as high as possible to realise the full free cooling potential of that machine, so we design to achieve quite close approach temperatures. The crucial element is that the air coming into the room has to be at the right flow rate and the right temperature – everything else is pretty much secondary. We know that most data center operators don’t care about anything else: they want plant that is efficient, easy to source and easy to maintain. The entire package, from purchase to implementation and maintenance, needs to be as efficient as possible. So the air flow rate has to be right, with the fluid flow temperature as close to the leaving air temperature as possible. Managing flow rate and temperature within an integrated system is our approach.
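To make the approach-temperature idea concrete, here is a minimal sketch (with entirely hypothetical numbers, not Airedale design data) of why a closer approach widens the free cooling window: free cooling can carry the full load whenever the ambient air sits below the required fluid supply temperature by at least the approach.

```python
def free_cooling_fraction(ambient_temps_c, supply_fluid_c, approach_c):
    """Estimate the fraction of hours in which full free cooling is possible.

    Free cooling can fully meet the load whenever the ambient air is at
    least `approach_c` degrees below the required fluid supply temperature.
    """
    threshold = supply_fluid_c - approach_c
    hours = sum(1 for t in ambient_temps_c if t <= threshold)
    return hours / len(ambient_temps_c)

# Hypothetical hourly ambient temperatures (deg C). Raising the fluid
# setpoint from 15 to 20 C widens the free-cooling window:
ambients = [-5, 0, 5, 10, 12, 15, 18, 22, 25, 30]
print(free_cooling_fraction(ambients, supply_fluid_c=15, approach_c=3))  # 0.5
print(free_cooling_fraction(ambients, supply_fluid_c=20, approach_c=3))  # 0.6
```

In practice an annual weather file for the site would replace the toy list, but the principle is the same: the closer the approach and the higher the allowable fluid temperature, the more hours the compressors stay off.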
Heat rejection and water sustainability
Which mechanism to use for heat rejection is a much-debated topic and, as mentioned before, I have seen how popular adiabatic solutions are in some parts of the US. We can and do offer adiabatic as one solution; however, we prefer not to use water as a heat rejection driver. Water is a valuable commodity and undervalued at the moment. We see a lot being used in the US, possibly because it is plentiful in some areas, but other areas do have issues with water supply – for example, the 20-year drought in California. Lake Mead, which supplies Arizona and Nevada, is significantly down on where it was in 2000, and this isn’t a trend that is likely to reverse on its own. The water shortages we currently face are not sustainable, so we do need to take action; otherwise we could face the possibility of an adiabatically cooled data center leaving a city without enough drinking water, which makes no sense. We often discuss the need to be more energy efficient, and we see the power generation industry cutting down on water use and replacing it with sources such as wind and solar, so water is going to become a major issue for the data center industry too. The good news is that we do have the solutions to address this.
Maintenance and AI:
Maintenance in a critical industry is essential to avoid outages. Artificial Intelligence, or AI, is very much the future of service and maintenance of plant equipment. Optimisation, such as reducing flow rates and opening out temperature differences, can be done using AI without human intervention.
In addition to this, reducing power input also works to increase the lifespan of component parts. If you oversize the heat exchanger, the fan doesn’t have to work as hard, so water and air velocities drop, and this in turn de-stresses component parts. By using a high percentage of free cooling, you are not only reducing energy expenditure but also reducing the demands on the compressors and their run time. With fewer running hours on critical parts of the plant, their lifespan is extended and the need to replace parts is reduced.
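The fan part of this argument follows from the standard fan affinity laws: power draw scales roughly with the cube of flow (fan speed), so a modest reduction in required airflow velocity buys a disproportionate power saving. A quick sketch with illustrative numbers:

```python
def fan_power_ratio(flow_ratio):
    """Fan affinity laws: power scales with the cube of flow (fan speed).

    flow_ratio is the new airflow as a fraction of the original,
    assuming the same fan and system characteristics.
    """
    return flow_ratio ** 3

# An oversized coil lets the same heat be rejected at lower air velocity.
# Running the fan at 80% of the original flow needs only about half the power:
print(round(fan_power_ratio(0.8), 3))  # 0.512
```

That cubic relationship is why a larger, lower-velocity heat exchanger both cuts energy use and de-stresses the fan and other moving parts.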
So with reduced stress on components, the focus of regular manual maintenance becomes keeping the heat exchanger clean. Cleaning and changing filters is a lower-cost activity than having a highly skilled engineer on site trying to either optimise the plant manually or replace expensive components. AI and the implementation of free cooling mean that maintenance can move from a high-tech resource to a low-tech resource, reducing costs and maximising efficiency.
Another key function of AI here is predictive maintenance. For example, sensors can be deployed to monitor vibration levels on plant; if something is out of balance, it suggests components may be becoming stressed, which in turn suggests they might fail. AI can use mathematical modelling and historical failure data to estimate how and when this will occur, sending a warning to the operator to either address the issue or change a part before it fails completely.
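A real system would combine vibration spectra with historical failure models, but the core "warn before it fails" idea can be sketched very simply: fit a trend to recent vibration readings and extrapolate when it will cross an alarm limit. The function and numbers below are purely illustrative assumptions, not any vendor's algorithm.

```python
def hours_until_limit(readings, limit):
    """Fit a straight line to hourly vibration readings (least squares)
    and extrapolate when the trend will cross the alarm limit.

    Returns the estimated hours from the latest reading until the limit
    is reached, or None if vibration is flat or falling.
    """
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings)) \
        / sum((x - x_mean) ** 2 for x in xs)
    if slope <= 0:
        return None  # no upward trend, nothing to predict
    intercept = y_mean - slope * x_mean
    # Time (in hours past the last sample) at which the line hits the limit:
    return (limit - intercept) / slope - (n - 1)

# Hypothetical vibration readings (mm/s RMS) trending upward:
readings = [2.0, 2.1, 2.3, 2.4, 2.6]
print(round(hours_until_limit(readings, limit=4.5), 1))  # 12.8
```

A controller running something like this can raise a work order well before the bearing or fan actually fails, which is exactly the shift from reactive to predictive maintenance described above.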
These sorts of technologies are really user-friendly and require little training. The automated controller basically draws on past experience to predict what is likely to happen next and to suggest an action. This is a big shift towards simplifying tasks and reducing stress on component parts. It considers total ownership costs instead of initial capital outlay and this is something we know is a key issue for end users.
The new trends
Like in any industry, there are new solutions regularly being launched and pitched as “the next new thing”, threatening to make time-served and trusted methods seem old-fashioned perhaps. We recognise that for some customers, the new methods might serve them well, but not everyone wants everything, and there is always room for development.
Take, for example, immersion cooling. We know densities are increasing and for some data centers there is a real need to save space, more so perhaps in Europe than in the US. Of course we recognise that trying to push enough cool air through a 50kW rack is difficult, but then again, not everyone wants 50kW a rack. For a standard 16 amp, 4kW rack – still the industry norm – air cooling is the most effective method, and we still see high demand for it.
What we are seeing, and expect to see continue, is the colocation white-space being sectioned to serve different client demands. So there will be a high performance computing section for those customers that want it. In our view the heat rejection will be via the same system. We know that those selling immersion liquid will recommend a dry air cooler, but we would have the technology built into our heat rejection system to work with that. Chilled water systems are flexible and can work alongside other systems, as long as distribution is organised and return temperatures are managed.
Hybrid IT is now the norm for many businesses. There is not a one size fits all solution and we have seen a lot of big companies change what they do over the years, for example move from a solely telecoms company, to a storage and edge company, so naturally their needs and approach will change. Going forward we expect that some will build their own data centers and want total control – this is especially common in the UK and Europe. Some will stick with colo, and some will blend the two. What is important is that we listen to these clients, understand where their requirements have changed and understand how we can best integrate our solutions into what they want.
Flexibility is key
We are flexible and agile in our approach. If we don’t make a client’s preferred solution then we will buy it in – we don’t shy away from that. We build our own systems so we can integrate a bought-in piece into our controls to deliver the same, seamless solution to our client.
As you might expect, we have a 5 year plan. Not revealing any secrets, we recognise a certain amount of growth in this massive digital revolution we are in, as demand continues to surge with the availability of 5G, driverless cars and other AI initiatives.
In terms of cooling, we expect internal temperatures to rise, and we are already seeing hyperscalers pushing the limits. ASHRAE has set recommended limits, but we expect the industry to go beyond them as new technology becomes available, which will lead to a reduction in the use of adiabatic cooling to save water.
The F-Gas directives in Europe are already making a difference to the market there, and we expect this will be followed in the US, perhaps with a target of a 10% reduction in greenhouse gases for 2022 and another 30% drop by 2023, if Biden ratifies the Kigali Amendment. Europe has already seen that kind of reduction, so we understand how this works and have solutions available. In line with these reductions there will be a big push to A2L refrigerants, which raises some issues around flammability, so leak testing will become more prevalent.
Drivers for change
The changes are likely to be driven by the big technology companies. I would expect these larger organisations to introduce policies that prevent them from working with companies who don’t comply with sustainable procedures and to audit their supply chain to confirm compliance. Many businesses are going to have to change, and these may be difficult changes but will be for the long-term good. I sincerely believe that those businesses moving forward in the data center arena will be those who are sustainable and taking these issues seriously.
My plan for the next six months is to keep listening to my clients, meeting new people and learning more about what they want, whilst sharing some of what I have experienced in Europe, to develop end-user solutions to data center cooling that are energy efficient, reliable and sensible.