Simon Smith’s Post


EVP Generative AI at Klick

The cost of training top models is increasing 2-3X per year, and at this rate will reach $1 billion per model by 2027. The findings are from Epoch AI and follow their recent analysis of increases in compute for foundation models, running at 4-5X per year. They've also provided some interesting data on the relative breakdown of cost components like staff, hardware, and energy.

A few thoughts after reading this:

- I would assume staff costs won't increase as much as hardware and energy costs, so the relative contribution of human labor to future AI models will decrease, in yet another sign of capital's increasing dominance over labor in the industry (just look at NVIDIA's ever-rising stock price).
- If compute is increasing 4-5X per year but cost is increasing only 2-3X, is it safe to assume that training runs are getting more efficient?
- Some people, like Anthropic CEO Dario Amodei, have said we'll see $1 billion training runs in the next year. Does this mean the 2-3X pace is about to increase? That Epoch is wrong in its estimate? Or that Amodei is overstating the cost, possibly to put fear into competitors who can't keep pace?
- Only a few companies are going to be able to train future models, so we're likely to see further consolidation. Perhaps we end up in a future with only two large general-purpose models, like the Windows/macOS and Android/iOS duopolies we've seen in the past.

Article here: https://lnkd.in/g5SejXen
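A quick back-of-the-envelope check of the growth rates above, using the midpoints of Epoch's ranges. Note the starting figure of roughly $100M for a 2024 frontier run is an illustrative assumption, not a number from the post or from Epoch:

```python
# Sanity-check the growth rates quoted above.
# Assumption: a frontier training run costs roughly $100M in 2024;
# this starting value is illustrative, not from Epoch AI's report.

start_cost = 100e6          # hypothetical 2024 cost in USD
cost_growth = 2.5           # midpoint of the 2-3X/year cost trend
compute_growth = 4.5        # midpoint of the 4-5X/year compute trend

# Project cost out to 2027 (three years of compounding).
cost_2027 = start_cost * cost_growth ** 3
print(f"Projected 2027 cost: ${cost_2027 / 1e9:.2f}B")  # ~$1.56B

# If compute grows faster than cost, cost per unit of compute falls,
# i.e. training runs are getting cheaper per FLOP:
efficiency_gain_per_year = compute_growth / cost_growth
print(f"Compute per dollar grows ~{efficiency_gain_per_year:.1f}X per year")
```

Under these assumed midpoints, a ~$100M run today does cross the $1B mark within three years, and the gap between the compute and cost trends implies roughly a 1.8X per-year improvement in compute obtained per dollar, consistent with the post's efficiency question.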
