"The brain represents information probabilistically, by coding and computing with
probability density functions, or approximations to probability density functions"
~ Knill and Pouget, The Bayesian Brain, 2004, Trends in Neurosciences
For all those saying #ChatGPT is a mere parlor trick, at least some neuroscientists seem to think that we ourselves are performing a very similar parlor trick, just on a massively more complicated scale.
@rachelwilliams The main problem I have with these kinds of models is that they scale terribly. My understanding is that they scale at power 11, meaning that doubling performance would require roughly 2,000 times the compute (2^11 ≈ 2048). If that is even remotely true, it would take about two decades for every doubling of performance. Considering that what these models are capable of is improving at only a moderate pace, even with ever-greater resources, it doesn't seem unreasonable that the claim might be true.
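A quick back-of-the-envelope sketch of that arithmetic, taking the power-11 exponent and a compute budget that doubles every two years as assumptions (neither is an established scaling-law figure):

```python
# Sanity-check of the "power 11" scaling claim above. The exponent k and
# the compute-doubling cadence are assumptions from the comment, not
# established results.

k = 11  # assumed scaling exponent: compute C grows as performance P**k

# Compute factor needed to double performance: C2/C1 = 2**k
compute_factor = 2 ** k
print(f"Doubling performance needs ~{compute_factor}x compute")  # 2048, i.e. "about 2,000"

# If available compute itself doubles every ~2 years (a loose
# Moore's-law-style assumption), doubling performance takes k doublings:
years_per_compute_doubling = 2
print(f"~{k * years_per_compute_doubling} years per performance doubling")  # ~22 years
```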
@rachelwilliams My current understanding is that quantum computers remove at best one power, which would bring it to power ten, or the equivalent of a doubling every decade or so. Quantum computers have a huge effect at power two, where they can turn the scaling linear.
Regardless, it's a huge engineering effort.
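A minimal sketch of the "power two turns linear" remark, assuming a Grover-style quadratic speedup in which quantum cost scales as the square root of classical cost; the names classical_cost and quantum_cost are purely illustrative:

```python
import math

def classical_cost(perf: float, exponent: int = 2) -> float:
    """Compute needed classically if cost grows as perf**exponent."""
    return perf ** exponent

def quantum_cost(perf: float, exponent: int = 2) -> float:
    """Assumed quadratic (Grover-style) speedup: sqrt of the classical cost."""
    return math.sqrt(classical_cost(perf, exponent))

for p in (10, 100, 1000):
    print(p, classical_cost(p), quantum_cost(p))
# Classical cost grows quadratically with performance, while the quantum
# cost grows linearly (sqrt(p**2) == p), matching the comment's claim.
```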