gentic.news — AI News Intelligence Platform

[Image: A technician in a data center installs liquid cooling tubes into a rack of GPU servers.]

ODMs Evolve from Manufacturers to AI Infrastructure Partners

ODMs shift from manufacturing to design/integration partners for AI racks, driven by GPU/ASIC complexity and liquid cooling.

7h ago · 2 min read · AI-Generated
How are ODMs evolving in the AI era?

ODMs are evolving from standardized server manufacturers to design, integration, and mass production partners for AI infrastructure, driven by complex GPU/ASIC racks, liquid cooling, and cableless designs.

TL;DR

  • ODMs shift from manufacturing to design and integration.
  • Complex AI racks require liquid cooling and cableless designs.
  • ODMs now support GPU/ASIC platforms and data center builds.

SemiAnalysis reports that ODMs are shifting from standardized server manufacturing to serving as AI infrastructure partners. The evolution is driven by complex GPU/ASIC racks requiring liquid cooling and cableless designs.

Key facts

  • ODMs previously focused on standardized racks, motherboards, and servers.
  • AI racks now integrate GPU/ASIC, liquid cooling, and high-speed connections.
  • Cableless designs may simplify cabling and maintenance.
  • ODMs are evolving into design, integration, and mass production partners.
  • They will support various GPU/ASIC platforms and data center designs.

In the early stages, ODMs focused mainly on manufacturing: they produced standardized racks, motherboards, and server systems at scale. Their primary advantages were cost efficiency, capacity, and yield [per @SemiAnalysis_].

In the AI era, IT racks have become much more complex. GPU/ASIC, high-power systems, liquid cooling, high-speed connections, and rack management all need to work together within the rack. To simplify cabling and maintenance, cableless designs may also become more common [per @SemiAnalysis_].

As a result, ODMs are no longer just manufacturers. They are evolving into partners in design, integration, and mass production. Going forward, they will support a range of GPU/ASIC platforms and data center designs, and help vendors build out the broader AI infrastructure ecosystem [per @SemiAnalysis_].

The Structural Shift


This transformation marks a departure from the traditional ODM role of low-margin, high-volume box building. The unique take is that ODMs are now absorbing design complexity that was once the domain of hyperscalers and OEMs. This mirrors the shift seen in custom silicon partnerships: as AI hardware becomes more heterogeneous, the supply chain must integrate vertically. The cableless design trend, if adopted broadly, could reduce assembly costs and improve reliability in high-power racks.

Implications for the Ecosystem


ODMs gaining design capability means faster iteration cycles for AI hardware. Vendors can prototype new GPU/ASIC platforms without building internal integration teams. However, it also concentrates intellectual property risk—ODMs with full rack-level blueprints could become single points of failure. The industry should watch for which ODMs secure exclusive contracts with major GPU vendors like NVIDIA or AMD.

What to watch

Watch for announcements from major ODMs (e.g., Wistron, Quanta, Foxconn) about exclusive partnerships with NVIDIA or AMD for next-generation GPU platforms, and whether cableless rack designs appear in production deployments by Q3 2026.

Sources cited in this article

  1. SemiAnalysis

AI-assisted reporting. Generated by gentic.news from 1 verified source, fact-checked against the Living Graph of 4,300+ entities. Edited by Ala AYADI.


AI Analysis

The SemiAnalysis thread captures a quiet but significant pivot in the AI hardware supply chain. ODMs, historically low-margin box builders, are absorbing design and integration work that hyperscalers and OEMs used to own. This is structurally analogous to the rise of custom ASIC partnerships: as AI compute becomes more heterogeneous, the value chain consolidates around those who can integrate multiple technologies (GPUs, liquid cooling, high-speed interconnects) into a single rack.

The cableless design trend is a sensible engineering response to the cabling burden of high-power racks, but it also signals a deeper commoditization of integration: if ODMs can deliver plug-and-play racks, differentiation moves up to the software stack. The risk is IP concentration: an ODM holding full rack-level blueprints could become a single point of failure if it defects to a competitor or suffers a supply chain disruption. The next 18 months will test whether this model scales or creates new bottlenecks.

Compared with the prior era of standardized server manufacturing, the AI era demands co-design among chip vendors, cooling specialists, and rack integrators. ODMs are well positioned to play this role because of their existing relationships with both component suppliers and hyperscale customers. The margin structure must shift, however: ODMs will need to charge for design services, not just per-unit hardware. This could compress margins for smaller players while benefiting scale leaders.
