Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second.
This piece is garbage.
There was a study in 2019 that analyzed 17 different spoken languages and found that languages with a lower complexity rate (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages, at roughly 39 bits per second.
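As a rough sketch of that trade-off (the per-language numbers below are invented for illustration; only the ~39 bits/s ballpark is the study's finding):

```python
# Information rate = syllable rate x information density.
# Illustrative values only, not the study's measured figures.
languages = {
    "fast, low density": (8.0, 5.0),   # syllables/s, bits/syllable
    "slow, high density": (5.0, 8.0),
}
for name, (syl_per_s, bits_per_syl) in languages.items():
    print(f"{name}: {syl_per_s * bits_per_syl:.0f} bits/s")
# Both land near the ~39 bits/s figure the study reports.
```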
Of course, it could be that the actual ideas and information in that speech are inefficiently encoded, so that the actual bits of entropy are being communicated slower than 39 per second. I'm curious to know what the linked Caltech paper says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik's cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?
EDIT: I read the preprint, available here. It purports to measure only the externally observable output of human behavior. That's an important limitation: it's not trying to measure the internal richness of unobserved thought.
So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
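In code, that conversion is just the following (using the paper's assumption of ~5 bits per English word, which is not a universal constant):

```python
# Words per minute -> bits per second, given the paper's
# ~5 bits of entropy per English word.
BITS_PER_WORD = 5

def wpm_to_bits_per_sec(wpm):
    return wpm * BITS_PER_WORD / 60

print(wpm_to_bits_per_sec(120))  # typing:   10.0 bits/s
print(wpm_to_bits_per_sec(160))  # speaking: ~13.3 bits/s
```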
The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik's cube solving, memory contests).
It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings ~10^9 bits/s of sensory input down about 8 orders of magnitude to ~10 bits/s. If it turns out to be 7.5 orders of magnitude instead, that doesn't really change the result.
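A quick sanity check on the size of that reduction, using the paper's figures:

```python
import math

# ~10^9 bits/s of raw sensory input vs ~10 bits/s of behavioral output.
print(math.log10(1e9 / 10))  # 8.0 -> about 8 orders of magnitude
```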
There's also a whole section addressing criticisms of the 10 bits/s number. It argues that claims of photographic memory tend to break down into longer periods of study (e.g., a 45-minute flyover of Rome to recognize and recreate 1000 buildings in 1000 architectural styles works out to about 4 bits/s of memorization). It also argues that the human brain tends to trick itself into perceiving much higher complexity than it is actually processing (known as "subjective inflation"). The implicit argument is that a lot of perception is actually lossy compression that fills in fake details consistent with the portions actually perceived, and that the observed bitrates from other experiments might not properly count the bits of entropy involved in the less accurate shortcuts the brain takes.
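Running the numbers on the flyover example (assuming each building's style is an independent 1-of-1000 choice, as described):

```python
import math

# ~1000 buildings, each in one of ~1000 possible architectural
# styles, memorized over a 45-minute flight.
buildings = 1000
styles = 1000
seconds = 45 * 60

bits_per_building = math.log2(styles)        # ~9.97 bits to pick 1 of 1000
total_bits = buildings * bits_per_building   # ~9966 bits memorized
print(total_bits / seconds)                  # ~3.7, i.e. roughly 4 bits/s
```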
I still think visual processing seems to be faster than 10 bits/s, but I'm now persuaded that it's within an order of magnitude.
Thanks for the link and breakdown.
It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity or capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than a conditional one, going so far as to say a neural-computer interface would be limited to this rate.
"Thinking speed" is also a poor description for input/output measurement, akin to calling a monitor's bitrate the computer's FLOPS.
Visual processing is multi-faceted. I definitely don't think all of vision can be reduced to 50 bps, but maybe the serial part can, after the parallel stages have done things like detecting lines, arcs, textures, areas of contrast, etc.
You may be misunderstanding the bit measure here. It's not ten bits of data, basically a single byte. It's ten binary yes/no decisions, which is enough to distinguish 1024 distinct possibilities.
The measure comes from information theory, but it's easy to confuse with other uses of "bits."
What? This is the perfectly normal meaning of bits. 2^10 = 1024.
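To spell it out (a minimal illustration):

```python
import math

# A bit, in the information-theory sense, is one yes/no decision;
# n bits single out one of 2**n equally likely possibilities.
print(2 ** 10)           # 1024
print(math.log2(1024))   # 10.0
```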
Only when you are framing it in terms of information entropy. I think many of those misunderstanding the study are thinking of bits as part of a standard byte. It's a subtle distinction, but that's where I think the disconnect is.
Yes, the study is probably fine; it's the article that fails to clarify, before using the term, that they are not talking about bits the way bits are normally understood.
I think we understand a computer can read this text far faster than any of us. That is not the same as conscious thought, though; it's simply following an algorithm of yes/no decisions.
I'm not arguing with anything here, just pointing out the difference between what CPUs do and what human brains do.
Right? They do nothing to explain what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck it feels like packaging them.
You're misunderstanding the terminology, then.
In information theory, "bit" doesn't mean a bit of raw data like you'd count in a network bitrate, but something closer to a bit of the compressed bitstream: a bit of entropy.
For example, let's say I build a computer that only computes small sums, where the input is two numbers from 0-127. However, this computer only understands spoken French, and it will ignore anything that's not a French number in that range. Information theory would say this machine receives 14 bits of information (two 7-bit numbers) and returns 8 bits. The extra processing required to understand French is overhead and is ignored for the purposes of calculating entropy.
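A quick check of that accounting (assuming all input values are equally likely):

```python
import math

# Entropy accounting for the toy French-speaking adder.
input_bits = 2 * math.log2(128)   # two operands, 0-127 each -> 14.0 bits in
output_bits = math.log2(255)      # sums span 0-254 -> ~7.99, i.e. 8 bits out
print(input_bits, output_bits)
# The spoken-French audio carries vastly more raw data, but none of it
# counts here: only the distinguishable outcomes do.
```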
The article also mentions that our brains take in billions of bits of sensory data, but that's ignored for the calculation because we only care about the thought process (the useful computation), not all of the overhead of the system.
Indeed it is. If you want to illustrate the point that silicon and copper are faster than bioelectric lumps of fat, there are lots of ways to do it, and it's no contest; but this is not a well-done study.
Yes, fixed it. Only thinking at 8 bits a second this morning.