AI Data Centers May Need Fewer Chillers—and More Precision from Mechanical and Controls Teams

News · 2 min read
James Dice
January 14, 2026

Nvidia’s latest roadmap for high-density AI infrastructure is being read as a warning shot for traditional data center cooling—and for the service providers who build and maintain it.

In a recent LinkedIn analysis, industry veteran Tony Grayson points to Nvidia’s CES announcement that its upcoming platforms can operate with liquid cooling supply temperatures around 45 °C. At that temperature, many AI data centers could rely far less on conventional chilled-water plants and compressors, using dry coolers or other non-mechanical heat rejection for much of the year.
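
Why 45 °C matters comes down to simple arithmetic: a dry cooler can only hold its leaving-fluid temperature a few degrees above the outdoor air (the "approach"). The back-of-the-envelope check below assumes a 10 K approach; the numbers are illustrative, not vendor specifications.

```python
# Back-of-the-envelope check of compressor-free cooling headroom.
# All values below are assumptions for illustration, not vendor specs.

supply_setpoint_c = 45.0       # warm-water supply temperature cited in the announcement
dry_cooler_approach_c = 10.0   # assumed approach: fluid leaves ~10 K above ambient dry-bulb

# Warmest outdoor dry-bulb temperature at which a dry cooler alone
# can still hold the supply setpoint without running compressors:
max_ambient_c = supply_setpoint_c - dry_cooler_approach_c
print(f"Dry coolers alone can hold a 45 °C supply up to roughly {max_ambient_c:.0f} °C ambient")
```

In most climates, outdoor dry-bulb temperatures sit below that threshold for the large majority of hours, which is what makes "non-mechanical for much of the year" plausible.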

If adopted at scale, that shift could materially reduce the energy used for cooling. Fewer compressor hours mean lower electrical demand, better power usage effectiveness (PUE), and potentially smaller central plants. For owners building data centers, that opens the door to deferring or downsizing expensive chiller infrastructure.
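
For a rough sense of the PUE math, here is a purely illustrative comparison; every number in it is an assumption chosen for scale, not a measured figure from Nvidia or Grayson.

```python
# Rough, illustrative PUE comparison. PUE = total facility power / IT power.
# All loads below are assumptions for scale only.

it_load_kw = 10_000        # assumed IT load
other_overhead_kw = 800    # assumed distribution losses, lighting, etc.

# Conventional chilled-water plant: chillers, pumps, and towers.
chiller_plant_kw = 2_500   # assumed
pue_chilled_water = (it_load_kw + other_overhead_kw + chiller_plant_kw) / it_load_kw

# Warm-water (~45 °C supply) loop rejecting heat through dry coolers:
# mostly pumps and fans, few or no compressor hours.
dry_cooler_kw = 700        # assumed
pue_warm_water = (it_load_kw + other_overhead_kw + dry_cooler_kw) / it_load_kw

print(f"Chilled-water PUE: {pue_chilled_water:.2f}")  # ~1.33
print(f"Warm-water PUE:    {pue_warm_water:.2f}")     # ~1.15
```

The absolute numbers will vary widely by site; the point is that cutting compressor hours attacks the largest non-IT load directly.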

For mechanical and controls service providers, this shifts the scope of work. Warm-water liquid cooling puts far more pressure on pump reliability, flow control, leak detection, and water chemistry management. Commissioning tolerances tighten. Controls sequences become more mission-critical. Failure modes look different and are often harder to troubleshoot.
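
To make that concrete, here is a hypothetical sketch of the kind of supervisory checks a warm-water loop demands. The setpoints, tolerances, and alarm text are invented for illustration and are not drawn from any vendor sequence.

```python
# Hypothetical supervisory checks for a warm-water liquid cooling loop.
# Setpoints and tolerances below are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class LoopReading:
    supply_temp_c: float   # coolant supply temperature to the racks
    flow_lpm: float        # loop flow rate, liters per minute
    leak_detected: bool    # aggregated leak-sensor status

SUPPLY_SETPOINT_C = 45.0   # assumed warm-water supply setpoint
SUPPLY_TOLERANCE_C = 2.0   # assumed commissioning tolerance, tighter than a chilled-water loop
MIN_FLOW_LPM = 900.0       # assumed minimum flow before alarming

def evaluate(reading: LoopReading) -> list[str]:
    """Return the alarms this reading should raise."""
    alarms = []
    if reading.leak_detected:
        alarms.append("LEAK: isolate affected branch and notify the service team")
    if abs(reading.supply_temp_c - SUPPLY_SETPOINT_C) > SUPPLY_TOLERANCE_C:
        alarms.append("TEMP: supply temperature outside commissioning tolerance")
    if reading.flow_lpm < MIN_FLOW_LPM:
        alarms.append("FLOW: low flow, check pumps and valve positions")
    return alarms

print(evaluate(LoopReading(supply_temp_c=47.5, flow_lpm=850.0, leak_detected=False)))
# -> ['TEMP: ...', 'FLOW: ...']
```

In practice this logic would live in the plant controls or a coolant distribution unit and feed the monitoring stack; the point is simply that tolerances are tighter and response expectations faster than in a typical chilled-water scope.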

The “death of the chiller plant” framing is clearly aspirational. Even Grayson’s analysis acknowledges that trim chillers or adiabatic assist will still be required in many climates and operating conditions. But the direction of travel is clear: less work centered on large centrifugal machines, more work in precision liquid handling and control.

Service firms that stay anchored to legacy chiller-centric scopes may find themselves exposed as AI workloads drive the next wave of data center builds.

If you’d like to learn more, here are some ways to stay updated on stories like this:

  • Read the original analysis on LinkedIn.
  • Sign up for the Nexus Labs newsletter to get five similar stories for owners each Wednesday: 
