Discussion about this post

Charles Yang:

On a slightly unrelated note: NVIDIA recently published a blog post about using AI to design parts of the H100 (https://twitter.com/rjrshr/status/1545446397759016962). They're suggesting that AI can beat out EDA tools in many cases. Do you think that's plausible? Do you see EDA firms starting to integrate AI capabilities into their software, or is there space for disruption here?

Arunabh Saxena:

Hey Dylan,

Thanks for the comparison.

Curious about some points that occurred to me after reading the article:

(a) Is it the case that Google's TPUs are useful only for a select class of AI models? Is that why Google still buys thousands of Nvidia GPUs?

(b) From the perspective of a GCP customer: Google recently claimed on their blog that TPUs would reduce costs by 35-40% (comparing TPUs on GCP to A100s on Azure). So is Nvidia's superior software stack the only reason that GCP customers are not adopting TPUs en masse?

(c) Surprised that, apart from Google, no one thinks it is worth the effort to create their own silicon for training & inference. I keep hearing about AWS's Graviton, but nothing more, and it seems to be a CPU, not a GPU.

(d) How is Nvidia so confident that GPUs will be the dominant hardware form factor going forward? I understand models are changing very fast at the cutting edge, but wouldn't some kind of ASIC/FPGA hardware offer a lower TCO?

Sorry for so many queries, but I am just so surprised at how dominant Nvidia has stayed over the last 5-6 years, and how no one seems to pose an effective threat.
