Just when we thought we were safe, ChatGPT is coming for our graphics cards

Everyone seems to be talking about ChatGPT nowadays thanks to Microsoft Bing, but given the nature of large language models (LLMs), a gamer would be forgiven for feeling a certain déjà vu.

See, even though LLMs run on huge cloud servers, they rely on specialized GPUs for the training that makes them work. Usually, this means feeding a downright obscene amount of data through neural networks running on arrays of GPUs with sophisticated tensor cores, and doing that at scale requires not only a lot of power but also a lot of actual GPUs.
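To get a feel for why training demands so many GPUs, here is a rough back-of-envelope sketch using the common approximation that training compute is about 6 × parameters × tokens. All of the specific numbers (model size, token count, GPU throughput, utilization) are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope estimate: why LLM training needs a whole array of GPUs.
# Common approximation: training compute ~ 6 * parameters * training tokens.
# Every number below is an assumption chosen for illustration.

params = 175e9          # model parameters (GPT-3-scale, assumed)
tokens = 300e9          # training tokens (assumed)
flops_needed = 6 * params * tokens

gpu_flops = 312e12      # peak FP16 throughput of one datacenter GPU (assumed)
utilization = 0.4       # fraction of peak realistically sustained (assumed)

seconds = flops_needed / (gpu_flops * utilization)
gpu_days = seconds / 86400
print(f"~{gpu_days:,.0f} GPU-days on a single GPU")
print(f"~{gpu_days / 1000:,.0f} days of wall-clock time on a 1,000-GPU cluster")
```

Under these assumptions the answer comes out to tens of thousands of GPU-days, which is why a single card, however powerful, is nowhere near enough: the work has to be spread across a large cluster.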

Source: www.techradar.com