gentic.news — AI News Intelligence Platform

NVIDIA Feynman GPU Power Semi Content Hits $191K, 17× Blackwell

NVIDIA Feynman GPUs require $191K in power semiconductors per system, 17× Blackwell, driven by 800V DC architecture shift.

Source: news.google.com via gn_gpu_cluster · Corroborated
What is the power semiconductor cost per system for NVIDIA's Feynman GPUs?

NVIDIA's Feynman GPU generation pushes power semiconductor content to $191,000 per system, a 17× increase over the Blackwell architecture, driven by the industry's shift to 800V DC power delivery.

TL;DR

Feynman GPU power semi content reaches $191,000 per system. · 17× increase over Blackwell GPU architecture. · 800V DC architectures drive higher power semiconductor costs.

NVIDIA's Feynman GPU generation pushes power semiconductor content to $191,000 per system, a 17× increase over the Blackwell architecture. The leap reflects the industry's shift to 800V DC power delivery, which demands more expensive silicon carbide (SiC) and gallium nitride (GaN) components.

Key facts

  • Feynman power semiconductor content: $191,000 per system.
  • 17× increase over Blackwell's ~$11,000 power semi content.
  • 800V DC architecture requires SiC and GaN components.
  • Google plans $190B AI infrastructure buildout.
  • NVIDIA's China market share dropped to zero in 2026.

Wccftech reports that NVIDIA's upcoming Feynman GPU architecture will require $191,000 worth of power semiconductors per system, up from roughly $11,000 for Blackwell [According to Wccftech]. The 17× increase is not a GPU cost hike — it reflects the move to 800V DC architectures, which demand silicon carbide (SiC) and gallium nitride (GaN) power devices rather than traditional silicon MOSFETs.
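The 17× multiple can be sanity-checked against the two reported figures. A minimal sketch (both numbers are supply-chain estimates per Wccftech, not confirmed BOM data):

```python
# Sanity-check the reported 17x multiple from the supply-chain estimates
# (figures per Wccftech; NVIDIA has not confirmed per-system BOM details).
feynman_power_semi_usd = 191_000
blackwell_power_semi_usd = 11_000  # approximate

multiple = feynman_power_semi_usd / blackwell_power_semi_usd
print(f"{multiple:.1f}x")  # ~17.4x, consistent with the reported "17x"
```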

Why 800V DC Drives the Cost

The 800V DC architecture shift is the primary driver behind the 17× cost increase. Higher voltage reduces current for the same power level, slashing I²R losses in datacenter distribution. But it requires power semiconductors with higher breakdown voltages — SiC and GaN components that cost 5–10× more per watt than silicon equivalents [Industry estimates]. Each Feynman rack likely contains dozens of 1,200V-class SiC MOSFETs and GaN HEMTs for DC-DC conversion and bus regulation.
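The I²R argument can be made concrete with a back-of-the-envelope sketch. The rack power and busbar resistance below are hypothetical round numbers for illustration, not Feynman specifications; only the scaling law (loss falls with the square of bus voltage at fixed power) is the point:

```python
# Illustrative sketch: why higher distribution voltage cuts resistive losses.
# Conduction loss in the distribution path is P_loss = I^2 * R, with I = P_load / V,
# so at fixed load power the loss scales as 1/V^2.

def i2r_loss(p_load_w: float, v_bus: float, r_ohm: float) -> float:
    """Resistive loss (watts) in the distribution path at a given bus voltage."""
    current = p_load_w / v_bus
    return current ** 2 * r_ohm

P_LOAD = 120_000  # 120 kW rack load (hypothetical)
R_BUS = 0.001     # 1 milliohm distribution resistance (hypothetical)

loss_48v = i2r_loss(P_LOAD, 48, R_BUS)    # legacy 48V rack distribution
loss_800v = i2r_loss(P_LOAD, 800, R_BUS)  # 800V DC distribution
print(f"48 V loss:  {loss_48v / 1000:.2f} kW")
print(f"800 V loss: {loss_800v:.1f} W")
print(f"Reduction:  {loss_48v / loss_800v:.0f}x")  # (800/48)^2 ≈ 278x
```

The roughly two-orders-of-magnitude drop in conduction loss is what justifies paying the SiC/GaN premium for the higher-breakdown-voltage devices.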

The Unique Take

This cost increase is not inflationary — it's substitutional. The $191,000 figure represents power delivery, not compute. As hyperscalers like Google plan $190B AI infrastructure buildouts [According to TradingView], the 800V DC standard is becoming the default for next-generation datacenters. Feynman's power semi content is a leading indicator: the cost of moving electrons is now a first-order design constraint for AI clusters.

Industry Context

NVIDIA's China market share has dropped to zero due to US export controls [Per Nvidia's Jensen Huang, May 2026]. The company invested $2 billion in Marvell to deepen the NVLink Fusion partnership [Previously reported]. Feynman's power architecture suggests NVIDIA expects hyperscaler demand to absorb the higher system cost — each Feynman system at $191K in power semi content alone implies total system prices well above $1 million.

Google's $190B AI buildout, reported by TradingView, signals demand for the 800V DC systems that Feynman targets. The shift also creates opportunities for power semiconductor suppliers: Wolfspeed, Infineon, and ON Semiconductor are the primary SiC/GaN vendors positioned to supply the Feynman generation.

Wccftech notes the figure is based on supply chain estimates, and NVIDIA has not confirmed per-system BOM breakdowns.

What to watch

Watch for NVIDIA's official Feynman launch event, expected late 2026 or early 2027, which should confirm system pricing and power architecture details. Also track Wolfspeed and Infineon earnings for SiC order volumes that would validate the $191K figure.


Sources cited in this article

  1. Wccftech
  2. TradingView
  3. Nvidia's Jensen Huang
  4. Previously reported (gentic.news)

AI-assisted reporting. Generated by gentic.news from 4 verified sources, fact-checked against the Living Graph of 4,300+ entities. Edited by Ala AYADI.

AI Analysis

The $191,000 power semiconductor figure is a structural signal about AI infrastructure economics. The 17× increase over Blackwell is not GPU cost inflation — it's a power architecture transition. As hyperscalers adopt 800V DC to improve datacenter efficiency, the cost of power delivery components becomes a material fraction of total system cost. This shift favors SiC/GaN suppliers (Wolfspeed, Infineon, ON Semiconductor) over traditional silicon power vendors. It also suggests NVIDIA expects hyperscaler demand to absorb significantly higher system prices: each Feynman system at $191K in power semi content implies total system costs well above $1 million.

The 800V DC trend is not NVIDIA-specific. Google's $190B AI buildout, combined with OpenAI's projected $121B hardware costs for 2028, indicates the entire hyperscaler ecosystem is moving toward higher-voltage DC distribution. Feynman's power architecture is an early indicator of a broader industry shift.

Confidence is moderate (0.65): Wccftech is a single source for the $191K figure, and NVIDIA has not confirmed BOM details. The 17× comparison to Blackwell is plausible given the SiC/GaN cost premium, but the exact numbers should be treated as directional.
