the thing people don’t seem to get about LLMs is that they aren’t “hallucinating” some percentage of the time, they are “hallucinating” 100% of the time, and if the probabilistically sampled words they produce happen to match up with factual reality, it’s entirely by coincidence.

human brains are not good at grappling with ideas like this.

the random word machine *seems* like it knows stuff! and it does seem like that quite a lot of the time! but only so long as you’re pushing it towards what we could call well-trodden territory, where the random words it produces are probabilistically likely to be the correct answer, because the training data contains a lot of non-contradictory sources all saying the same obvious thing.
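to make that concrete, here’s a minimal python sketch of what “sampling the next word” means. the token distributions are made-up numbers, not output from any real model; the point is just that the exact same sampling procedure runs in both cases, and only the shape of the distribution decides how often the result looks “right”:

```python
import random

def sample_next_token(probs):
    """Sample one token from a {token: probability} dict."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

# "well-trodden" prompt: the training data overwhelmingly agrees, so the
# distribution is sharply peaked and sampling almost always lands on the
# factually correct token (probabilities here are invented for illustration)
well_trodden = {"Paris": 0.97, "Lyon": 0.02, "Berlin": 0.01}

# obscure prompt: sources are sparse or contradictory, so the probability
# mass is spread out and a sample is "right" mostly by luck
obscure = {"1847": 0.30, "1852": 0.28, "1849": 0.22, "1861": 0.20}

for name, dist in [("well-trodden", well_trodden), ("obscure", obscure)]:
    draws = [sample_next_token(dist) for _ in range(10_000)]
    top = max(dist, key=dist.get)
    rate = draws.count(top) / len(draws)
    print(f"{name}: most likely token sampled {rate:.0%} of the time")
```

same machine, same mechanism; the “well-trodden” case prints ~97% and the obscure case ~30%, and neither run involved the sampler knowing anything.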


if you guess a number and then roll a die, your guess is gonna be right 1/6 of the time, and that’s pretty much enough to convince at least some humans that the die knows something.
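(the 1/6 figure is just the chance that an independent guess matches an independent roll; a throwaway simulation, purely illustrative:)

```python
import random

# simulate guessing a number, then rolling a fair six-sided die
trials = 100_000
hits = sum(random.randint(1, 6) == random.randint(1, 6) for _ in range(trials))
print(f"the die 'knew' my number {hits / trials:.1%} of the time")  # ~16.7%
```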


@bri_seven I think it's called the magic eight ball
