Startups, large tech companies, and even governments are currently in a race to buy as many GPUs as possible. The hunt for AI hardware began last year when ChatGPT exploded in popularity, leading the entire tech industry to invest in building and deploying large language models and related technologies.
Training these models, however, requires enormous numbers of GPUs, which has led to an industry-wide shortage of compute hardware. But why are GPUs so well suited to AI workloads? And how do they compare to CPUs?
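As a rough illustration (not covered in detail here): the dominant operation in training and running large language models is matrix multiplication, and every element of the output matrix is an independent dot product. A GPU with thousands of small cores can compute many of those elements simultaneously, while a CPU with a handful of cores must work through them largely one at a time. The sketch below marks where that parallelism comes from:

```python
# Sketch: why matrix multiplication, the core of LLM workloads,
# parallelizes so well on GPUs.

def matmul(A, B):
    n, k, m = len(A), len(B), len(B[0])
    # Each (i, j) output cell is an independent dot product of a row of A
    # and a column of B -- no cell depends on any other. This independence
    # is exactly what a GPU's thousands of cores exploit in parallel;
    # here a CPU computes the cells sequentially instead.
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

The loop body never reads another output cell, so the work splits cleanly across as many cores as the hardware offers.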
Watch the video to find out!