
It looks like Google has better hardware for AI than anyone else, and they are the only company with both AI research and AI hardware in-house. OpenAI+Microsoft and Anthropic+Amazon are much less integrated, working at arm's length.

> Google will have multiple very large clusters across their infrastructure for training and by far the lowest cost per inference, but this won’t automatically grant them the keys to the kingdom. If the battle is just access to compute resources, Google would crush both OpenAI and Anthropic.

> Being “GPU-rich” alone does not mean the battle is over. Google will have multiple different clusters larger than their competitors, so they can afford to make mistakes with pretraining and trying more differing architectures. What OpenAI and Anthropic lack in compute, they have to make up in research efficiency, focus, and execution.

https://www.semianalysis.com/p/amazon-anthropic-poison-pill-...




> Looks like Google has better hardware for AI than anyone else, and they are the only company with both AI research and AI hardware

On top of that, they have lots of training data: YouTube, Gmail, Google Docs, Google Drive, the indexed web from Google Search, lots of data from those fancy Street View cars driving around, and probably still some data from Google+.


AWS has Trainium, which is less far along than TPUs but does exist.



