How do we perceive numbers?


_ 00:00:00 _ Hey smart people, Joe here. I want to do a little experiment on you. I'm going to flash some images and I want to see if you can tell how many dots are in those images. Image: a single black dot on a white background Okay, and another one. Image: two black dots One more. Image: three black dots Okay, and another. Image: five black dots There's a really interesting surprise hidden in this test. For one, two, or three dots, I bet you were able to count them pretty much instantly without even thinking about it, maybe even for four. But five starts to get a little bit harder, and once we get to eight, there's just not enough time to process the dots. Just kidding, there are actually only seven dots in that last one. Image: seven black dots Alright, try this. Image: A mix of yellow and blue stars You can count the number of each color star almost instantly, but counting the total number of stars takes way more concentration, or you're not as accurate. Psychologists have done these tests on huge numbers of people going back more than a century, and they find it takes people basically the same amount of time to recognize one, two, or three dots or shapes or objects, but beyond three, we answer more slowly and start to make mistakes. Why do scientists care about having people count dots? Well, there's a surprising and profound secret hiding in these little experiments, hints of a strange way that we count and think about numbers deep in some unconscious part of our brains. We have a weird, ancient way of representing the muchness of something that goes way back in our evolution, before we ever wrote numbers down, even before humans existed. And it can explain this. Image: Roman numerals I, II, III, IV, V
_ 00:01:44 _ There's something pretty weird about Roman numerals. The first three numerals make sense, they're just tallying up ones. Ancient tally sticks and bones show that humans were using tally marks to keep track of numbers as far back as 40,000 years ago. But why is four written as this instead of this? At four, instead of just tallying up the marks, suddenly we have to do subtraction, take one away from five. It's not exactly difficult, but it's weird. Well, maybe abandoning tally marks after three was just a Roman thing. If that's true, then how do you explain this? Image: Chinese and Arabic numerals 1, 2, 3, 4 Huh, they stop using straight tally marks after three or four, too. Well, surely our number system doesn't follow this weird pattern, right? Well, one is obviously a tally mark, right? But two and three, well, you might be surprised to learn that those actually derive from two or three horizontal bars that became tied together when written by hand. It can't be a coincidence that all these distinct cultures somehow came to this same conclusion. They all switch from tallies to symbols after three or four in order to be quickly understood by our brains. This is a hint that there's something weird and special about how our brains recognize and represent numbers. We're going to explore a bunch of weird things that you don't realize that you do when you think about numbers and then discover a pretty surprising answer to why your brain does this. It seems like for a few objects, we don't actually have to count them one by one, we just see them all at once without counting. But for four or more objects, we can't quickly and reliably name the number that we see. Things start to get approximate.
_ 00:03:31 _ Okay, try to decide which of these groups has more dots. Image: two groups of dots, one with 5 and one with 10 Now do the same thing for these two. Image: two groups of dots, one with 20 and one with 25 You can more easily distinguish two distant numbers than two closer numbers. Alright, that's not too surprising. Now do the same thing with these two sets of dots, which has more? Image: two groups of dots, one with 10 and one with 20 And what about these two sets of dots? Image: two groups of dots, one with 90 and one with 100 That one was much easier, wasn't it? Yet both of those pairs differ by the exact same amount: 10 dots. Even though the difference between them was objectively the same, we have a harder time distinguishing two larger quantities. We invented calculus, but we can't tell 90 from 100 dots. I mean, a computer could do that easily, but not us.
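This ratio effect is what psychophysicists call Weber's law: how distinguishable two quantities are depends on their ratio, not their absolute difference. Here's a minimal Python sketch of that idea (illustrative only, not the actual experiment):

```python
# Weber's law, sketched: the difficulty of telling two quantities apart
# depends on their ratio, not their absolute difference.

def weber_ratio(a, b):
    """Ratio of the smaller to the larger quantity; closer to 1 = harder."""
    small, large = sorted((a, b))
    return small / large

# Both pairs differ by exactly 10 dots...
print(weber_ratio(10, 20))   # 0.5 -> easy to tell apart
print(weber_ratio(90, 100))  # 0.9 -> much harder
```

Even with an identical difference of 10 dots, the 10-versus-20 pair sits at a ratio of 0.5 while 90-versus-100 sits at 0.9, which is why the second comparison feels so much harder.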
_ 00:04:19 _ Okay, comparing dots is hard, maybe. Surely we're real pros when it comes to comparing actual numbers. I mean, we see digits everywhere every day. I don't look at dots all day. Here's another experiment: give people two digits that are far apart, and they can quickly figure out which one is bigger. But when the digits are close together, people take longer to figure out which is bigger. It gets weirder. It takes us longer to compare larger numbers versus smaller ones, even if they're only one apart. Comparing three and two takes less time than four and five, which takes less time than eight and nine. Researchers have tested tons of people on these things, and essentially everyone takes longer to compare larger numbers or numbers that are closer together. We aren't consciously aware we're slower, but the computers don't lie. It happens with bigger numbers, too. When people are asked whether a two-digit number is larger or smaller than 65, the closer the number is to 65, the longer it takes to answer. This isn't what we'd expect if we were just comparing the number symbols or if we were computing the difference with subtraction or something. It's as if when we see any number greater than three, we automatically translate it into some sort of quantity, and this is what leads to confusion when comparing numbers.
_ 00:05:44 _ Here's a very weird question: what number subjectively feels closer to 10, 9 or 11? Most people answer that 11 feels closer, even though they both differ by one. If you don't believe me on that one, which of these pairs feels closer together: 9 and 10, or 99 and 100? Same objective difference, but most people answer that the larger numbers feel closer. If you ask lots of people to select numbers at random between 1 and 50, they tend to pick smaller numbers more than larger ones, which is definitely not random. It's as if we have a jar of numbers in our head where small numbers are over-represented, and we can demonstrate this small number bias in a different way, too. If I give you two different series of numbers, which one looks like it most evenly covers the range between 1 and 2,000? Image: two series of numbers are shown Take a look at them for a moment. Most people respond that the second series is more evenly spread out and more random, and that the first series has way too many large numbers. Actually, the first series is the one which more evenly samples the range. The second series just feels better to us because smaller numbers feel farther apart, and we kind of compress large numbers together. It's almost like we think logarithmically.
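That logarithmic intuition can be made concrete. If the brain placed numbers on a mental line by their logarithm, then equal ratios, not equal differences, would feel equally far apart. A hypothetical sketch in Python:

```python
import math

# Sketch of a "logarithmic number line": distance between two numbers
# is the difference of their logarithms, i.e. the log of their ratio.

def mental_distance(a, b):
    return abs(math.log(b) - math.log(a))

print(mental_distance(9, 10))    # ~0.105
print(mental_distance(99, 100))  # ~0.010 -> feels about 10x closer
```

On this line, 9 and 10 sit roughly ten times farther apart than 99 and 100, even though both pairs differ by exactly one, which matches the intuition most people report.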
_ 00:07:30 _ It's about to get even weirder. Let's do another experiment. Your task is to determine whether a number is larger or smaller than 55. Hit the button in your right hand if it's larger and the button in your left if it's smaller. Animation: a simulation of the experiment, with numbers appearing on screen Okay, smaller, bigger, that one's bigger, smaller again, bigger again, that one's smaller, that one's smaller, bigger. But what happens if I switch the buttons, so now you hit the right button if it's smaller and the left button if it's bigger? Well, people answer significantly faster if right means larger. It's as if you have a literal number line in your head, because that's how we write the number line, bigger is to the right. The conclusion is obvious: when you compare two numbers, you aren't computing them or subtracting them, you are physically comparing them in space. Okay, well, maybe that's because most people are right-handed. Except left-handers show the same right-equals-larger reflex. And when people cross their hands, they still answer "larger" faster with whichever hand is physically on the right side of their body. Or is it because, basically from birth, everything we see programs us to think of bigger numbers on the right? Well, guess what: people from cultures where they learn to read right to left show the opposite. They tend to associate larger numbers with the left because their mental number line goes the opposite way.
_ 00:09:08 _ This is really weird. It turns out humans have two ways of thinking about numbers. We have specific words and language and symbols we use, like three or seven or 99. But humans have only been using these words and symbols for a few thousand years at the most. And psychologists have gathered a lot of evidence that we have other, more ancient, abstract ways of thinking about numerical quantities, this "number sense" that sort of hides under the surface and pokes through in some weird ways. We know this abstract number sense is pretty ancient because cute little babies have shown that they have an innate sense of numbers long before they recognize words or symbols. And even other animals can count and keep track of the muchness of things. For a long time, people thought babies were born, well, totally dumb, that their brain was like a blank page, and they only figured out that objects exist and how they interact with each other by observing their environment. Now, babies are not smart, and I'm allowed to say that because I've raised two of them. But newer science has shown that even at birth, they have some kind of idea of numbers. Like, if I put these two rows of M&M's in front of a 2-year-old, they're going to pick the row with more every time. I mean, they're not idiots.
_ 00:10:32 _ If you show an infant a bunch of cards with two objects and then suddenly throw in a card with three objects, they just stare at the one with three, like, "whoa, something's clearly different here." Infants can even do addition. Animation: A toy is placed on a stage, then a screen comes down. A hand places a second toy behind the screen. Show a four or five-month-old a toy, then hide it. Now move another toy behind that hidden area. If you reveal two toys behind the screen, baby's like, "this is all good, nothing's amiss here." But do some trickery and reveal just one toy, and they're like, "whoa, hold on a sec, what tomfoolery is afoot here, Mrs. Scientist?" Babies clearly understand that 1 + 1 doesn't equal 1. It's not exactly linear algebra, but we clearly have an innate sense of quantity before we can speak or read numbers or go pee-pee in the potty.
_ 00:11:32 _ We even see hints of number sense in other animals, too. And strangely, these other species show patterns similar to what we see in humans: in tests like these, they quickly quantify one to four objects but become slower and less accurate with more. Image: A monkey choosing between two containers of apple slices Given two choices, wild monkeys will choose the container with more apple slices as long as each container holds fewer than five pieces. Huh. Schooling fish will always join the bigger group with three rather than two fish, or four rather than three. But to choose the larger school beyond that, the difference has to be eight versus four fish, or sixteen versus eight, as if they're using one way of counting small numbers and a different, fuzzier way of quantifying large numbers, just like we do.
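The fish behavior hints at two systems working side by side, and we can caricature them in a toy Python model: an exact system for quantities up to about four, and an approximate system beyond that which only succeeds when the ratio is big enough. (The 2:1 threshold here is taken from the fish example, not a universal constant.)

```python
# Toy model of two innate number systems (not a real cognitive model):
# - an exact "subitizing" system for small quantities (up to 4)
# - an approximate, ratio-based system for everything larger

def can_discriminate(a, b, ratio_threshold=2.0):
    """Can a fish-like observer tell group a from group b?"""
    if max(a, b) <= 4:
        return a != b          # exact system: any difference is noticed
    small, large = sorted((a, b))
    return large / small >= ratio_threshold  # approximate system

print(can_discriminate(3, 4))   # True: small quantities, exact
print(can_discriminate(5, 6))   # False: too large, ratio too close
print(can_discriminate(8, 16))  # True: 2:1 ratio is big enough
```

This reproduces the schooling-fish pattern from the passage: three versus two and four versus three are distinguishable, but larger schools only are when one is about twice the other.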
_ 00:12:26 _ Scientists believe humans and other animals share two kinds of innate number senses. One is used to represent exact quantities from one to four. This might be why crows can count out up to four caws in response to different prompts, or why even simple-brained animals like bees can count up to four landmarks while navigating to food sources. The other number sense is more approximate. It's better at estimating the differences in quantities. This is why a chimpanzee will only attack an intruder if they outnumber him three to one or more, or why most animals can judge which choice has more food, where the most predators are, or where the least rival mates are. In cases like these, fuzzier estimates of magnitude are often good enough. Scientists are still just starting to scratch the surface when it comes to understanding how universal this innate number sense really is among other animals, but they are starting to understand that some parts of animal brains contain "number neurons" that are specially tuned to fire strongly in response to a particular number, and that the parts of the brain we use to compare the size of objects or to move through space are also used when we compare approximate quantities. These regions of the brain are connected to many other senses, which is why counting is something we can do not just for what we see, but for what we hear or what we touch, too.
_ 00:13:58 _ Understanding where our innate number sense comes from is important because 3 to 7% of people suffer from a learning disorder called dyscalculia, which prevents them from understanding numbers and doing math, an obvious hurdle for their future. But it is humans who invented a language and a series of precise symbols to represent numbers. And once we learn how to use them, those symbols let us deal with precise numbers, small or large or even gigantic ones, and combine and manipulate and measure them in complex ways with little to no fuzziness or approximation. That very same symbolic precision has allowed us to do all of the science that lets us understand these hidden, innate number senses we aren't conscious of. Symbolic language has even taught us why we have number symbols in the first place, or to put it another way, what numbers are even for. Stay curious.