On the "AI uses electricity" discourse:

I've seen instance after instance of people using ChatGPT and similar tools to optimize things like SQL queries against absolutely enormous datasets, eliminating 60-90% of the runtime and therefore the electricity use.

Does it make up for training and inference for *everyone*? Probably not, but these things scale hard. Imagine a Windows developer saving a few thousand cycles on a commonly-called function in Windows.

The counterpoint is something I haven't been able to find again, but if anybody remembers it I'd love a link to it.

It was someone who emailed Torvalds about saving a few cycles on something and got a reply basically saying "If this ran thousands of times per second on every computer on earth, it would take thousands of years to make up for the cycles spent composing and sending this email. Don't sweat the small stuff."
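
For a sense of the arithmetic behind both sides of that, here's a rough sketch; every number in it is a placeholder I'm making up on the spot, not a measurement:

```python
# Rough sketch of the "is the micro-optimization worth it?" arithmetic.
# Every constant here is a made-up placeholder; plug in your own numbers.

cycles_saved_per_call = 5        # a few cycles shaved off
calls_per_day = 1_000            # how often the code path actually runs
machines = 1                     # one box, or "every computer on earth"

# Cost side: a 3 GHz core burning away while the developer reads,
# edits, tests, and emails the patch (say, half an hour).
cpu_hz = 3_000_000_000
cycles_spent = 30 * 60 * cpu_hz

daily_savings = cycles_saved_per_call * calls_per_day * machines
days_to_break_even = cycles_spent / daily_savings
print(f"~{days_to_break_even:,.0f} days to pay back the cycles spent")

# With these placeholders the payback takes millions of years; crank up
# calls_per_day and machines and it collapses to almost nothing.
# The whole argument hinges on how hot the code path really is.
```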

@r000t
True, but saving 60-90% of the time on queries that run hundreds of times per day? They should and would have done that long before ChatGPT; it would have saved their company generous sums on hardware, especially considering that lots of business processes still aren't properly scalable and can't be parallelised and distributed over as many nodes as you'd like. They have probably had to resort to buying more and more expensive hardware every time the load increased noticeably.

@r000t
I don't mean you're wrong though: the company could have less qualified developers do the optimisations and, to a degree, they would still be successful. I doubt it would be in the 60-90% ballpark at that scale, hundreds of thousands of queries, but still, it might not be as linear, and the final result could be less obvious.

@m0xee One of the examples is an MS SQL developer who doesn't necessarily *administer* databases so doesn't understand how the actual database engine does shit

Meanwhile, the language model they're sending "hey, can you fix this one error" to has read every StackOverflow post, so it will naturally go "just btw, while we're here" and/or "oh god, why are you using this well-known antipattern?", and that's where these amazing savings come from.
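
To make that concrete, here's a sketch of the kind of fix I mean: the classic non-SARGable-predicate antipattern. SQLite stands in for MS SQL purely so the snippet runs on its own, and the table, column, and index names are all made up:

```python
# Minimal illustration of a well-known antipattern: wrapping an indexed
# column in a function in the WHERE clause, which prevents the engine
# from using the index. SQLite stands in for MS SQL here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, created TEXT)")
conn.execute("CREATE INDEX idx_orders_created ON orders(created)")
conn.executemany(
    "INSERT INTO orders (created) VALUES (?)",
    [(f"2024-01-{d:02d} 12:00:00",) for d in range(1, 29) for _ in range(1000)],
)

# Antipattern: the function call on the column defeats the index.
slow = "SELECT COUNT(*) FROM orders WHERE date(created) = '2024-01-15'"
# Rewrite: compare the raw column against a range so the index is usable.
fast = ("SELECT COUNT(*) FROM orders "
        "WHERE created >= '2024-01-15' AND created < '2024-01-16'")

for label, sql in (("antipattern", slow), ("rewrite", fast)):
    plan = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    print(label, plan)
# The rewrite should show a SEARCH on idx_orders_created instead of a
# full SCAN; on a table with hundreds of millions of rows, that is the
# kind of difference the "60-90% faster" stories are made of.
```

Nothing exotic, just the sort of thing a DBA would spot instantly and a pure application dev might not.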

@r000t
I agree with you on the general principle; what I doubt is how it works at that scale. I have to admit I did have to deal with MS SQL, no company names of course, but it was critical. And what we did over a weekend of introducing performance enhancements was this: we poured the last n days of data from production into it and reproduced the most likely production scenarios against the three most recent releases of MS SQL with different versions of our .DLL shims…

@r000t
Which were supposed to introduce further performance enhancements, though not under all conditions. We did all that shit, and at times we still had to resort to buying more expensive hardware, WAY more expensive! It was as far as possible from a "this just won't work!" coming from some seemingly smart kid who's knowledgeable about databases; we did real testing!

@r000t
No developer, however godlike, knows enough about how it would work under real-life conditions, but neither does an LLM.
True, I might be wrong, and a lot of DB devs never go through such thorough trials. On the other hand, I doubt those devs ever deal with such critical infrastructure. Then again, I might be wrong in general and it's all amateur bullshit at this point 🤷
