On the "AI uses electricity" discourse:
I've seen instance after instance of people using ChatGPT and similar tools to optimize things like SQL queries against absolutely enormous datasets, eliminating 60-90% of the runtime and therefore the electricity use.
Does it make up for training and inference for *everyone*? Probably not, but these things scale hard. Imagine if a Windows developer were able to save a few thousand cycles on a commonly-called function in Windows.
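To make the SQL case concrete, this is the flavor of rewrite I keep seeing suggested (table and column names here are made up, and whether it actually wins depends on your planner and indexes): replacing a per-row correlated subquery with a single aggregation plus a join.

-- Before: the subquery re-scans orders once per customer row
SELECT c.id,
       (SELECT SUM(o.total) FROM orders o WHERE o.customer_id = c.id) AS lifetime_total
FROM customers c;

-- After: one pass over orders, aggregated once, then joined back
SELECT c.id,
       COALESCE(t.lifetime_total, 0) AS lifetime_total
FROM customers c
LEFT JOIN (
  SELECT customer_id, SUM(total) AS lifetime_total
  FROM orders
  GROUP BY customer_id
) t ON t.customer_id = c.id;

On a big enough orders table, a change like that alone can plausibly account for the kind of runtime reductions people report.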
The counterpoint is something I haven't been able to find again, but if anybody remembers it I'd love a link to it.
It was someone emailing Torvalds about saving a few cycles on something and getting a reply that basically said: "If this ran thousands of times per second on every computer on earth, it would take thousands of years to make up for the cycles spent composing and sending this email. Don't sweat the small stuff."
@r000t
I don't mean you're wrong, though. The company could hire less qualified developers to do the optimisations, and to a degree they would still succeed (I doubt it would be in the 60-90% ballpark at that scale, across hundreds of thousands of queries). Still, the gains might not be as linear, and the final result could be less obvious.