NVIDIA's H100 and B200 GPUs power the majority of the world's AI training infrastructure, and its CUDA platform and Blackwell architecture are the de facto standard for generative AI compute.
Data center density, GPU compute clusters, and AI facility buildout are driving automation capability. NVIDIA's automation score of 51 reflects its R&D intensity ($8.7B annually) and its AI patent portfolio of 4200 patents.
| # | Company | AI Patents | R&D ($B) |
|---|---|---|---|
| 1 | Microsoft AI | 5800 | 19.0 |
| 2 | Amazon Web Services AI | 4100 | 16.0 |
| 3 | Google DeepMind | 3800 | 12.0 |
| 4 | Meta AI | 3200 | 14.0 |
| 5 | NVIDIA | 4200 | 8.7 |
| 6 | OpenAI | 870 | 7.0 |
| 7 | Honeywell AI | 1900 | 1.5 |
| 8 | xAI | 120 | 6.0 |
| 9 | Anthropic | 340 | 4.0 |
| 10 | Mistral AI | 45 | 1.1 |
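As a rough way to read the table, the figures can be combined into a derived metric. The sketch below ranks the listed companies by AI patents per billion dollars of annual R&D; the metric itself is illustrative (it is not defined anywhere in the text), and the numbers are copied directly from the table.

```python
# Illustrative sketch: rank the companies above by AI patents per $B of R&D.
# (name, AI patents, annual R&D in $B) -- figures taken from the table.
companies = [
    ("Microsoft AI", 5800, 19.0),
    ("Amazon Web Services AI", 4100, 16.0),
    ("Google DeepMind", 3800, 12.0),
    ("Meta AI", 3200, 14.0),
    ("NVIDIA", 4200, 8.7),
    ("OpenAI", 870, 7.0),
    ("Honeywell AI", 1900, 1.5),
    ("xAI", 120, 6.0),
    ("Anthropic", 340, 4.0),
    ("Mistral AI", 45, 1.1),
]

# Patents per $B of R&D, as a crude proxy for patent output per research dollar.
ranked = sorted(companies, key=lambda c: c[1] / c[2], reverse=True)
for name, patents, rnd in ranked:
    print(f"{name}: {patents / rnd:.0f} patents per $B R&D")
```

On these numbers the ordering differs sharply from raw patent counts: Honeywell AI's small R&D budget puts it at the top of this ratio, while a large absolute patent portfolio like Microsoft AI's lands mid-pack.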