
ChatGPT brain.

Alexandru Voica

Startups, large tech companies and even governments are currently in a race to buy as many GPUs as possible. The hunt for AI hardware began last year when ChatGPT exploded in popularity, leading the entire tech industry to invest in building and deploying large language models and other related technologies.


However, training these models requires a huge number of GPUs, which has led to an industry-wide shortage of compute hardware. But why are GPUs so well suited to AI workloads? And how do they compare to CPUs?
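As a rough sketch of the idea the video explores: neural-network training is dominated by large matrix multiplications, and every element of the output can be computed independently of the others. That uniform, independent arithmetic is exactly what a GPU's thousands of cores are built to run in parallel, while a CPU has far fewer cores optimized for sequential work. A minimal NumPy illustration (CPU-side, purely to show the shape of the computation; the sizes and names here are made up for the example):

```python
import numpy as np

# A single dense layer's forward pass is one big matrix multiply:
# millions of multiply-adds that are all independent of one another.
rng = np.random.default_rng(0)
batch, d_in, d_out = 64, 1024, 1024
x = rng.standard_normal((batch, d_in))   # input activations
w = rng.standard_normal((d_in, d_out))   # layer weights

y = x @ w  # 64 * 1024 * 1024 ≈ 67 million multiply-adds

# Each output element y[i, j] depends only on row i of x and column j
# of w, so a GPU can hand each element (or tile of elements) to a
# different thread and compute them all at once.
print(y.shape)  # (64, 1024)
```

Because no output element waits on any other, the work scales almost perfectly with the number of parallel execution units, which is why GPUs outpace CPUs so dramatically on this kind of workload.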


Watch the video to find out!






