On the "AI uses electricity" discourse:

I've seen instance after instance of people using ChatGPT and similar tools to optimize things like SQL queries, against absolutely enormous datasets, eliminating 60-90% of the runtime and therefore electricity use.
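For a sense of what those rewrites typically look like, here's a minimal, self-contained sketch using Python's built-in sqlite3: a correlated subquery replaced by a grouped join plus an index. The schema, row counts, and whatever speedup it shows on your machine are illustrative assumptions, not numbers from any case mentioned here.

```python
# Illustrative only: toy schema and data, not from any real workload.
import sqlite3, time

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 200, float(i % 97)) for i in range(20_000)],
)
conn.commit()

def timed(label, sql):
    start = time.perf_counter()
    cur.execute(sql).fetchall()
    print(f"{label}: {time.perf_counter() - start:.3f}s")

# Before: correlated subquery that re-evaluates per row.
timed("correlated subquery", """
    SELECT o.id, o.total
    FROM orders o
    WHERE o.total = (SELECT MAX(o2.total)
                     FROM orders o2
                     WHERE o2.customer_id = o.customer_id)
""")

# After: one grouped pass joined back, plus an index on the grouping key.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
timed("join on grouped max", """
    SELECT o.id, o.total
    FROM orders o
    JOIN (SELECT customer_id, MAX(total) AS max_total
          FROM orders GROUP BY customer_id) m
      ON o.customer_id = m.customer_id AND o.total = m.max_total
""")
```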

Does it make up for training and inference for *everyone*? Probably not, but these things scale hard. Imagine if a Windows developer is able to save a few thousand cycles on a commonly-called function in Windows.
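A rough back-of-envelope version of that scaling argument, where every number (install base, call rate, clock speed, cycles saved) is an assumption picked purely for illustration:

```python
# All figures below are assumptions for illustration, not measurements.
cycles_saved_per_call = 2_000          # "a few thousand cycles"
calls_per_second_per_machine = 100     # assumed rate for a commonly-called function
machines = 1_500_000_000               # rough order of the Windows install base
clock_hz = 3_000_000_000               # ~3 GHz

cycles_saved_per_second = cycles_saved_per_call * calls_per_second_per_machine * machines
cpu_seconds_saved_per_second = cycles_saved_per_second / clock_hz
print(f"~{cpu_seconds_saved_per_second:,.0f} CPU-seconds saved globally per wall-clock second")
# With these assumptions: ~100,000 CPU-seconds per second, i.e. roughly
# 100k cores' worth of work avoided continuously.
```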

The counterpoint is something I haven't been able to find again, but if anybody remembers it I'd love a link.

It was someone who emailed Torvalds about saving a few cycles on something and got a reply basically saying: "If this ran thousands of times per second on every computer on earth, it would take thousands of years to make up for the cycles spent composing and sending this email. Don't sweat the small stuff."


@r000t
True, but saving 60-90% of the runtime on queries that run hundreds of times per day? They should and would have done that long before ChatGPT; it would have saved their company serious money on hardware, especially since many business processes still aren't properly scalable and can't be parallelised and distributed across more nodes, so they have probably had to keep buying bigger and more expensive hardware every time the load increased noticeably.


@r000t
I don't mean you're wrong, though: the company could hire less qualified developers to do the optimisations, and to a degree they would still succeed (I doubt the gains would be in the 60-90% ballpark at that scale, hundreds of thousands of queries), but the payoff might not be as linear, and the final result could be less obvious.
