"The brain represents information probabilistically, by coding and computing with
probability density functions, or approximations to probability density functions"

~ Knill and Pouget, The Bayesian Brain, 2004, Trends in Neurosciences

For all those saying #ChatGPT is a mere parlor trick: at least some neuroscientists seem to think that we ourselves are performing a very similar parlor trick, just on a massively more complicated scale.

#AI #artificialintelligence #philosophy #neuroscience

@rachelwilliams The main problem I have with these kinds of models is that they scale terribly. My understanding is that they scale at power 11, meaning that doubling performance would require roughly 2000 times the compute (2^11 = 2048). If that is even remotely true, it would take about two decades for every doubling of performance. Considering that what the models are capable of is improving at only a moderate pace, even with ever-greater resources, it doesn't seem unreasonable that it might be true.
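A quick back-of-the-envelope sketch in Python of that arithmetic. The power-11 exponent is the claim above; the assumption that available compute itself doubles roughly every two years (a Moore's-law-style rate) is mine, added only to show where a "two decades" figure could come from:

```python
import math

POWER = 11                      # claimed scaling exponent: compute ~ performance**POWER

# Compute multiplier needed to double performance under that power law.
compute_factor = 2 ** POWER     # 2048, i.e. "roughly 2000 times the compute"

# Hypothetical assumption: available compute doubles every ~2 years.
years_per_compute_doubling = 2
doublings_needed = math.log2(compute_factor)   # = POWER = 11

years = doublings_needed * years_per_compute_doubling
print(f"{compute_factor}x compute -> ~{years:.0f} years per performance doubling")
# 2048x compute -> ~22 years per performance doubling, i.e. about two decades
```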

@ekg do you think that, theoretically, quantum computers could solve the compute scaling problem?

@rachelwilliams My current understanding is that quantum computers remove at best one power, which would bring it to power ten, or roughly a doubling every decade. Quantum computers have a huge effect at power two, where they can turn the scaling linear.

Regardless, it's a huge engineering effort.
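A toy Python comparison of the exponents being discussed here. These exponents are this thread's estimates, not established results; the point is only how much (or how little) shaving one power off changes the compute bill:

```python
def compute_needed(perf_multiplier, power):
    """Compute factor needed to scale performance by perf_multiplier,
    if compute ~ performance**power."""
    return perf_multiplier ** power

# Power 11 classically; "one power removed" (power 10) in the quantum case.
print(compute_needed(2, 11))  # 2048x compute to double performance
print(compute_needed(2, 10))  # 1024x -- still enormous

# A power-2 problem whose scaling is turned linear (the Grover-style case).
print(compute_needed(2, 2))   # 4x classically
print(compute_needed(2, 1))   # 2x once the scaling is linear
```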
