GPT inputs outgrow world chip and electricity capacity
Badly editorialized title.
Regarding the substance of the article: a curve fit to only 3 data points (1: 100x 2: 25x 3: 25x) could be extrapolated in lots of different ways besides "growing 30x per generation".
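A minimal sketch of that point, using just the three ratios quoted above (all numbers illustrative, the fits are my own choices, not the article's method):

```python
# Three generation-over-generation compute multipliers from the comment above.
ratios = [100, 25, 25]

# Two of many plausible per-generation growth estimates:
geo_mean = (ratios[0] * ratios[1] * ratios[2]) ** (1 / 3)  # ~39.7x
recent_only = ratios[-1]                                    # 25x

for label, r in [("geometric mean", geo_mean), ("recent trend", recent_only)]:
    # The choice compounds quickly: compare projections 3 generations out.
    print(f"{label}: {r:.1f}x/gen -> {r**3:,.0f}x after 3 more generations")
```

Three generations out, the two fits already disagree by roughly 4x (about 62,500x vs. 15,625x total growth), so the headline conclusion is sensitive to which curve you pick.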
Really wish AMD would hurry up and implement a transparent module for GPU computing in LLVM.
It’s not good that GPUs are this opaque.
These are all the same arguments that wrongly predicted DNA sequencing would overtake hard drive storage capacity.
People aren't going to do truly uneconomic things just to scale language models exponentially.
Interesting pull quote:
GPT-4 needed about 50 gigawatt-hours of energy to train. Using our scaling factor of 30x, we expect GPT-5 to need 1,500, GPT-6 to need 45,000, and GPT-7 to need 1.3 million.
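The quoted figures are just repeated multiplication by the article's 30x factor starting from the 50 GWh estimate; a quick sketch:

```python
# Reproduce the article's projections: multiply by 30x per generation.
energy_gwh = 50   # GPT-4 training energy per the article, in gigawatt-hours
scaling = 30      # article's assumed per-generation scaling factor

for model in ["GPT-5", "GPT-6", "GPT-7"]:
    energy_gwh *= scaling
    print(f"{model}: ~{energy_gwh:,.0f} GWh")
# GPT-5: ~1,500 GWh, GPT-6: ~45,000 GWh, GPT-7: ~1,350,000 GWh (~1.3 million)
```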
New technologies experience very rapid exponential growth for a while.
This should not be a surprise.