🥛 The hidden force beneath the servers ⚡️

Your latest Milk Road AI Report.

Artificial intelligence has a problem: it can’t code its way out of electricity.

We’ve reached the point where models are getting smarter faster than the grid can get stronger.

Every new server farm needs more watts than the last, and power demand is starting to push up against the physical limits of what our energy systems can deliver.

As we already know, this boom has become one of the biggest industrial expansions in modern history.

However, behind all the hype about chips and models, there’s a quieter story unfolding: one about infrastructure, about power grids under strain, and about the companies that keep those systems alive.

And at the center of it all sits an unglamorous name that’s turned into a market monster: Vertiv Holdings.

In this report, we’ll explore how Vertiv became the backbone of the AI boom, the company wiring the grid, cooling the racks, and keeping the world’s servers alive.

Here’s what else we are cooking in today’s issue (besides the servers):

  • The electrical problem driving the AI infrastructure race.

  • The scale of energy demand pushing the grid to its limits.

  • How Vertiv’s engineering innovations are redefining data center design.

  • And why its financial momentum could make it one of the biggest long-term winners of the AI era.

THE COMING POWER CRUNCH

Let’s start with the numbers.

Data center energy consumption has reached a record 5% of total US power demand.

By 2030, that number could climb to 6.7–12%.

In absolute terms, that’s the equivalent of powering 24 million American homes just to keep servers humming.

The Department of Energy and Morgan Stanley have both been ringing the alarm: the U.S. could face a 36-gigawatt power shortfall by 2028 if data center growth continues on its current trajectory.

For context, that’s roughly what it takes to power the entire state of California.
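
If you want to sanity-check those home equivalents, here’s a rough back-of-envelope sketch. The ~1.2 kW average household draw is our own assumption (roughly 10,500 kWh per year), not a figure from the report:

    # Back-of-envelope: converting gigawatts into "homes' worth" of power.
    # Assumption (ours): an average U.S. home draws ~1.2 kW continuously.
    AVG_HOME_KW = 1.2

    homes_gw = 24_000_000 * AVG_HOME_KW / 1_000_000   # 24M homes, in GW
    shortfall_homes = 36_000_000 / AVG_HOME_KW        # 36 GW = 36,000,000 kW

    print(f"24 million homes ≈ {homes_gw:.0f} GW of continuous demand")
    print(f"A 36 GW shortfall ≈ {shortfall_homes / 1e6:.0f} million homes")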

Electricity prices have climbed relentlessly since the pandemic, up 23% since 2022 and 40% since 2020.

California, Maryland, and parts of the Midwest are already seeing localized grid stress as AI facilities come online.

The result is simple but unavoidable: AI’s hunger for compute is now colliding with real-world physics.

THE DATA CENTER BUILD-OUT

The U.S. has entered a once-in-a-generation construction wave, but it’s not for office towers or housing.

It’s for data centers.

Spending on data centers has quadrupled since 2022, reaching an annualized $40 billion by mid-2025.

What used to be a niche industrial sector is now competing head-to-head with commercial real estate.

For the first time in history, the square footage being built for machines has caught up to what’s being built for people.

In 2021, office construction was seven times larger than data centers. Four years later, the lines have converged.

Globally, the numbers are even more staggering. Analysts expect $500 billion in data center investment this year and close to $900 billion by 2028.

The hyperscalers (Amazon, Google, Meta, Microsoft) are expected to pour $750 billion into infrastructure over just the next two years.

McKinsey believes total global investment could reach $6–8 trillion by the end of the decade.

We’re witnessing the digital equivalent of the industrial revolution, and it’s only getting started.

WHY AI NEEDS SO MUCH POWER

Training and running AI models isn’t like spinning up a website.

Each new generation of hardware (H100s, B200s, and whatever comes next) draws massive amounts of electricity and produces equally massive heat.

A single rack of AI servers can pull 20 to 30 kilowatts continuously. The new ones being planned for 2027 and beyond are expected to draw close to one megawatt per rack.

Multiply that by thousands of racks, and you’re looking at the power profile of a small city.
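
As a quick illustration, here’s what that looks like at the site level. The 5,000-rack campus size is our own assumption, not a figure from the report:

    # Illustrative site-level math with an assumed 5,000-rack campus.
    racks = 5_000
    kw_per_rack = 25                      # midpoint of today's 20-30 kW racks
    site_mw = racks * kw_per_rack / 1_000
    print(f"{racks:,} racks x {kw_per_rack} kW ≈ {site_mw:,.0f} MW")   # ~125 MW, roughly 100,000 homes' worth at 1.2 kW each

    # At the ~1 MW-per-rack densities planned for 2027 and beyond:
    future_mw = racks * 1_000 / 1_000
    print(f"{racks:,} racks x 1 MW ≈ {future_mw:,.0f} MW ({future_mw / 1_000:.0f} GW)")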

A brief voltage sag or outage can crash training runs that cost tens of millions of dollars.

That’s why data centers use multiple redundant power feeds, uninterruptible power supplies, and backup generators.

These are all designed to guarantee what the industry calls “five nines” of uptime, or 99.999% reliability.
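
To see how strict that target is, here’s the downtime each level of availability actually allows per year:

    # Downtime budget implied by "N nines" of availability.
    MINUTES_PER_YEAR = 365.25 * 24 * 60
    for nines in (3, 4, 5):
        availability = 1 - 10 ** -nines          # e.g. 0.99999 for five nines
        downtime = MINUTES_PER_YEAR * (1 - availability)
        print(f"{availability:.5f} -> {downtime:7.1f} minutes of downtime per year")

Five nines works out to a little over five minutes of downtime per year, which is why redundancy is non-negotiable.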

The International Energy Agency estimates that data centers, led by AI workloads, will account for 20–40% of all new electricity demand worldwide this decade.

In the U.S. alone, their share of total power consumption will likely double.

The scale is unlike anything the grid has ever supported, and it’s forcing the biggest tech companies on Earth to rethink how they interact with the energy system.

ENTER VERTIV

While everyone obsesses over GPUs, one company has become essential to keeping those GPUs alive.

Vertiv Holdings (NYSE: VRT) builds the heavy industrial infrastructure that makes modern AI possible: power distribution, backup systems, and high-capacity cooling.

You won’t find Vertiv demoing products on stage or trending on social media. Their work happens behind walls and under floors, in the electrical guts of data centers.

But that’s exactly what makes them indispensable.

If NVIDIA provides the brainpower, Vertiv provides the life support.

The company traces its roots back to Emerson Network Power, a legacy industrial business that specialized in mission-critical electrical systems.

Emerson eventually separated its Network Power division into an independent company, which became Vertiv.

Vertiv inherited decades of domain knowledge about how to keep massive IT operations alive through outages, surges, and blackouts.

Today, its customer list reads like a who’s who of Big Tech: Microsoft, Amazon Web Services, Google, Meta, and dozens of colocation giants.

Every one of them is building new facilities at a breakneck pace, and Vertiv is supplying the plumbing.

REINVENTING HOW DATA CENTERS RUN

Traditional data centers pull alternating current (AC) from the grid and convert it multiple times before it reaches the chips as direct current (DC).

Each conversion step wastes energy as heat. That was fine when racks used a few kilowatts each. It doesn’t work when they use hundreds.

Vertiv is leading the shift toward 800-volt direct current architectures, built in collaboration with NVIDIA to power the next generation of AI racks.

This system eliminates several conversion stages, reduces copper requirements, and improves efficiency by double digits.

Instead of pushing AC through transformers, then converting it to DC at the rack level, Vertiv’s 800V DC approach delivers clean power straight to the servers.

Less waste. Less heat. More power density per square foot.

For the companies building billion-dollar campuses, that’s not a small advantage; it’s the difference between profitable scaling and thermal failure.
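
Here’s a minimal sketch of why cutting conversion stages matters. The stage counts and the 97%-per-stage efficiency are illustrative assumptions on our part, not Vertiv’s published figures:

    # Conversion losses compound multiplicatively across stages.
    def chain_efficiency(stages, per_stage=0.97):
        return per_stage ** stages

    legacy = chain_efficiency(5)    # e.g. transformer -> UPS -> PDU -> rack PSU -> VRM
    dc_bus = chain_efficiency(3)    # fewer hops with an 800V DC bus

    print(f"Legacy AC chain: {legacy:.1%} of grid power reaches the chips")
    print(f"800V DC chain:   {dc_bus:.1%}")
    print(f"Extra usable power on a 100 MW feed: {(dc_bus - legacy) * 100:.1f} MW")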

DEALING WITH THE HEAT

Electricity is only half of the equation.

Everything that goes in eventually comes out as heat, and AI hardware runs hot enough to overwhelm traditional air cooling.

A normal data center rack might need to dissipate 10 to 15 kilowatts of heat. High-end AI racks can exceed 250 kilowatts.

At that density, you can’t just blow cold air around; it’s like trying to cool a blast furnace with a desk fan.

Vertiv’s answer is industrial-grade liquid cooling. Their CoolChip CDU system can remove 2.3 megawatts of heat from a single module, circulating chilled coolant directly through heat exchangers attached to the chips themselves.

It’s one of the most powerful cooling systems commercially available.
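
Some quick arithmetic on what that capacity means in practice (our math, not a Vertiv spec sheet):

    # How many high-density racks one 2.3 MW cooling module could serve.
    cdu_capacity_kw = 2_300        # CoolChip CDU module, per the report
    rack_heat_kw = 250             # a high-end AI rack
    print(f"~{cdu_capacity_kw // rack_heat_kw} racks at {rack_heat_kw} kW per 2.3 MW module")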

For hybrid environments, Vertiv and Compass Datacenters co-developed CoolPhase Flex, a system that can dynamically switch between air and liquid cooling depending on the workload.

That allows hyperscalers to run mixed facilities (old CPUs next to new GPUs) without rebuilding the entire infrastructure.

Amazon and Microsoft are both experimenting with in-house designs, but few operators have the bandwidth or expertise to engineer these systems from scratch.

Vertiv offers turnkey solutions and global service coverage, which most customers prefer over reinventing the wheel.

WHY THE BUSINESS IS SO STICKY

Vertiv’s moat isn’t about flashy patents or secret algorithms. It’s about reliability and inertia.

Once a data center is built with Vertiv’s systems, switching vendors becomes almost impossible.

The equipment runs 24/7, tied into safety, redundancy, and monitoring layers.

Replacing it means downtime, and downtime costs millions.

That’s why most clients sign long-term maintenance contracts that provide steady, recurring revenue.

Services now make up more than 20% of Vertiv’s total sales, growing at double-digit rates, with over $90 million in deferred revenue already booked.

The company also has a physical advantage that rivals can’t easily replicate: a global network of more than 3,000 field engineers who can service installations anywhere in the world.

Smaller competitors often subcontract Vertiv technicians under white label deals just to meet customer requirements.

In a world where uptime is money, trust is the moat, and Vertiv has built it over decades.

CHALLENGES ON THE HORIZON

Even great industrial stories have risks.

Tariffs and trade policy remain the biggest variable.

Many of Vertiv’s components (fans, breakers, capacitors) are still sourced from Asia.

Tariffs have squeezed margins and forced the company to reconfigure its supply chain through Mexico and other USMCA-compliant hubs.

Vertiv itself acknowledged as much on its Q3 earnings call, describing the situation as “fluid and uncertain.”

Management expects to offset most of the cost impact by 2026, but trade politics could always throw another curveball.

Supply chains are another concern.

The surge in global demand for high-density cooling and power systems has stretched lead times for specialized components.

Vertiv has mitigated this with stockpiling and multiple suppliers, but bottlenecks remain a constant operational risk.

There’s also a strategic challenge brewing: the biggest cloud players (Amazon, Microsoft, and Google) are experimenting with designing their own infrastructure.

Amazon’s new in-house cooling system, for example, could gradually replace some third-party solutions.

So far, though, those efforts are niche.

Hyperscalers are discovering that scaling custom hardware for hundreds of global campuses is far harder than writing code.

FINANCIALS AND VALUATION: HOW MUCH IS VERTIV WORTH?

Vertiv’s financial performance has been nothing short of electric.

Vertiv’s stock is up 960% over the past five years and 50% year-to-date, but the story is just getting started.

For full-year 2025, the company raised guidance to $10.2 billion in revenue (~27% growth) and about $4.10 in adjusted EPS, marking about a 45% increase from 2024.

Free cash flow is accelerating, too: management expects $1.5 billion in FCF this year, reflecting strong execution and operational discipline.

In Q3 alone, Vertiv generated $462 million in free cash flow and achieved an adjusted operating margin of 22.3%, underscoring the company’s improving efficiency and pricing power.

Vertiv’s balance sheet is in great shape, with net leverage down to 0.5× in Q3 and expected to fall to 0.2× by year end.

Its $9.5 billion order backlog, up more than 60% year-on-year, provides substantial visibility into 2026.

This combination of growth, margins, and financial flexibility has made Vertiv one of the standout industrial success stories of the AI era.

With the stock hovering around $180 per share, Vertiv’s market capitalization is roughly $68–70 billion. That valuation puts it squarely in premium territory.

The stock trades at roughly 34.6× forward 2026 earnings, a multiple closer to fast-growing tech firms than traditional industrials.

For comparison, Eaton trades around 27.7× forward earnings and Schneider Electric around 25.2×, meaning Vertiv’s premium is narrower than many think.
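
For a quick gut check on those multiples, using only the figures cited in this report (the ~$180 share price and the $5.19 consensus 2026 EPS discussed below):

    # Forward P/E and relative premium, using the numbers quoted in this report.
    price, eps_2026 = 180, 5.19
    vrt_pe = price / eps_2026
    print(f"Vertiv forward P/E: {vrt_pe:.1f}x")                     # ~34.7x
    for name, peer_pe in [("Eaton", 27.7), ("Schneider", 25.2)]:
        print(f"Premium vs {name}: {vrt_pe / peer_pe - 1:.0%}")     # ~25% / ~38%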

Importantly, Vertiv’s growth narrative is still intact.

In August 2025, the company completed its acquisition of Great Lakes Data Racks & Cabinets, a U.S.-based manufacturer of data center racks and integrated infrastructure solutions.

The deal, valued at approximately $200 million, strengthens Vertiv’s product lineup in modular infrastructure.

The acquisition complements Vertiv’s existing power and cooling systems and expands its footprint with colocation and enterprise customers.

So, what’s Vertiv worth?

Assuming continued AI infrastructure growth, Vertiv could double its revenue from ~$10 billion in 2025 to $20 billion by 2030, implying a mid-teens CAGR.

If operating margins rise to 24–25% (up modestly from the current 22.3%), net margins could reach ~18–20%, generating $3.5–$4 billion in annual net income by the decade’s end.

On a conservative 20× multiple, that implies a $70–80 billion valuation, roughly in line with today’s level. But if Vertiv executes ahead of plan and continues to capture AI-driven demand, the upside potential grows substantially.
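
Spelled out, the base-case math in that paragraph looks roughly like this:

    # 2030 base case: ~$20B revenue, 18-20% net margin, 20x earnings.
    revenue_2030 = 20e9
    for net_margin in (0.18, 0.20):
        net_income = revenue_2030 * net_margin
        implied_cap = net_income * 20
        print(f"{net_margin:.0%} net margin -> ${net_income / 1e9:.1f}B net income "
              f"-> ~${implied_cap / 1e9:.0f}B at 20x")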

For 2026, consensus analyst estimates project $5.19 in EPS, representing 26% growth from 2025.

Applying a 35× multiple, consistent with similar high-growth infrastructure peers, yields a $182 price target, aligning closely with the current market level.

Under a bullish case, if Vertiv delivers closer to $5.50–$6.00 in EPS through stronger margins and backlog conversion, shares could reasonably reach $210–$220 within the next 12 months.

Looking out to 2030, if Vertiv achieves $25 billion in annual revenue and maintains 25% operating margins, net income could reach $4.5–$5 billion.

Even applying a moderate 20× multiple would yield a $240–$260 stock price, while a sustained growth premium at 25× earnings pushes that target toward $300+.

Our five-year base case projects a $300 share price, roughly 65% above today’s level, for an 11% compound annual return, achievable if AI infrastructure spending continues at today’s breakneck pace.
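
Translating those 2030 scenarios into per-share prices requires a share count, which the report doesn’t state; the ~383 million figure below is our own estimate, backed out of the ~$69 billion market cap at ~$180 per share, and the math ignores buybacks, dilution, and net cash:

    # 2030 scenarios expressed per share (share count is our estimate).
    shares = 69e9 / 180                               # ~383M shares
    for net_income, multiple in [(4.5e9, 20), (5.0e9, 20), (4.75e9, 25)]:
        price = net_income * multiple / shares
        print(f"${net_income / 1e9:.2f}B net income at {multiple}x ≈ ${price:.0f} per share "
              f"({price / 180 - 1:+.0%} vs. today)")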

The biggest risk to that outlook is the cycle itself.

A slowdown in AI capital expenditures, tighter financial conditions, or disruptive new entrants in the power and cooling market could limit growth.

But if the AI buildout continues to scale globally, with data centers consuming ever more power (which we think will happen), Vertiv remains a core picks-and-shovels winner: a company literally powering the future of intelligence.