Nvidia’s H200 is a next-level AI/HPC monster with a wallet-shredding $30,000 price tag. Linus gets his hands dirty tearing it down, unveiling beefy HBM3e stacks, a custom NVLink-style interconnect, and industrial-grade cooling, then pits it against a dual-EPYC 9965 server and an RTX 5090. Spoiler: it absolutely demolishes them in generative-AI workloads.
If you’re running hyperscale data centers, this thing is pure candy; for the rest of us, the sticker shock is real. It’s an engineering marvel you’d drool over… until you peek at the invoice.
Watch on YouTube