Oh, Really? How Our Brain Turns Pitch Into Meaning
Changing the pitch of a word can alter the entire meaning of a sentence: saying “Mark went on vacation” with the stress on one word versus another conveys different ideas. A group of researchers studied how neurons in our brain pick up on these differences in pitch. The results were published this week in the journal Science.
Neurosurgeon Edward Chang, one of the authors of the study, discusses how our brains make sense of pitch, intonation, and background noise to discern meaning from spoken language.
Edward Chang is a neurosurgeon and professor of neurosurgery at the University of California, San Francisco.
IRA FLATOW: This is Science Friday. I’m Ira Flatow. During the 45 years or so that I’ve been on the radio, I have read tons of text, and I have learned that emphasizing certain words can give a sentence an entirely different meaning. Let me give you a couple of examples.
“What are you doing this weekend?” conveys something different from “What are you doing this weekend?” depending on which word you stress. Or think of “Really?” versus “Really.” Same words, different idea. We all do this all the time, but how does our brain make sense of it?
Researchers wondered about this, too, so they looked into this idea, and their results were published this week in the journal Science. Edward Chang is one of the authors on that study. He’s a neurosurgeon and professor of neurosurgery at the University of California, San Francisco. Welcome to Science Friday, Dr. Chang.
EDWARD CHANG: Hi, Ira. It’s great to be here.
IRA FLATOW: You’re welcome. There are so many different components to spoken language. Why is pitch interesting to study in terms of the brain and language?
EDWARD CHANG: Well, we’re making a lot of progress trying to understand how the brain processes this really unique ability to understand speech and language. And, in the last couple of years, I think we’ve made inroads into understanding how things like consonants and vowels that compose words are being processed.
But pitch actually represents a really important extra dimension that happens pretty much simultaneously when the consonants and vowels are uttered, and it’s really about how we say things more than the exact vowels and consonants themselves. As you pointed out, that’s really important because it can change the meaning even without changing the words themselves.
IRA FLATOW: Do we have specialized neurons in our brain that can pick out the different pitches and give the meaning to that?
EDWARD CHANG: Well, that was one of the key questions that we wanted to answer. It could have turned out a variety of different ways. For example, since we raise and modulate our pitch to emphasize certain sounds, you could imagine that when you stress a word in a sentence, the brain’s response to the consonants and vowels in that word would simply be increased.
But what we found was actually something quite different. We found groups of brain cells, called neurons, that were dedicated just to the processing of pitch, and these brain cells didn’t actually care much about the particular words being spoken. They were just tuned to the pitch changes.
IRA FLATOW: That’s amazing. You must have been surprised by that.
EDWARD CHANG: Oh, yeah. We were very surprised, and it was interesting to see that some of them were tuned to pitch changes in the downward direction, some were tuned to pitch changes in the upward direction, and some were even tuned to changes that went up and then down. So it seems like there really is this specialized population of cells in the brain that is dedicated to this particular job of taking pitch out of the speech signal.
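A rough way to picture that kind of tuning, purely as an illustration and not the study’s method, is a short Python sketch that labels a pitch contour as upward, downward, or up-then-down from the direction of its frame-to-frame changes. The function name and the contours below are made up for illustration (F0 values in hertz).

```python
# Illustrative sketch only: classify a pitch (F0) contour by the direction
# of its changes, mirroring the idea of cells tuned to upward, downward,
# or up-then-down pitch movements. Not the analysis used in the study.
import numpy as np

def contour_shape(f0):
    """Label a contour of F0 values (Hz per time frame) by its direction."""
    peak = int(np.argmax(f0))
    # A clear interior peak above both endpoints: an up-then-down contour.
    if 0 < peak < len(f0) - 1 and f0[peak] > f0[0] and f0[peak] > f0[-1]:
        return "up-then-down"
    # Otherwise, use the average frame-to-frame change to pick a direction.
    return "upward" if np.diff(f0).mean() > 0 else "downward"

print(contour_shape(np.array([100, 110, 125, 140])))        # upward
print(contour_shape(np.array([180, 160, 140, 120])))        # downward
print(contour_shape(np.array([110, 140, 160, 130, 105])))   # up-then-down
```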
IRA FLATOW: And it has to process this in real time, right– at the exact moment it’s being said. I mean, that sounds amazing also.
EDWARD CHANG: Yeah. I think that that is amazing, and it’s happening as you and I are talking right now. What’s, I think, equally interesting and amazing about it is that the signal– all of these cues, like the consonants and vowels and the pitch information– they’re all more or less simultaneous in the speech signal. They’re all overlapping.
And what was, I think, pretty interesting to us to see was that we found these clusters of neurons that were tuned to pitch, but just next door were a totally separate set of neurons that were tuned to the consonants and vowels themselves. So they were intermixed. And what was amazing to see was how the information in that one signal– the words coming into the ear– was actually dissected into different components, in essentially different channels in the brain.
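As a loose analogy for that one-signal, separate-channels idea, and without assuming anything about the study’s actual analysis, here is a small NumPy sketch that takes a single synthetic voiced signal and computes two parallel feature streams from the very same frames: a crude pitch track from the autocorrelation peak, and a coarse spectral envelope standing in for the consonant-and-vowel information. The signal, frame sizes, and methods are all stand-ins chosen for illustration.

```python
# Illustrative sketch only (not the study's pipeline): split one speech-like
# signal into two parallel "channels" computed from the same frames:
# a pitch (F0) track and a coarse spectral envelope.
import numpy as np

sr = 16000                                            # sample rate in Hz
t = np.arange(sr) / sr                                # one second of time
f0 = 120 + 40 * np.sin(2 * np.pi * 2 * t)             # a slowly wandering pitch
signal = np.sin(2 * np.pi * np.cumsum(f0) / sr)       # simple voiced tone

frame, hop = 512, 256
pitch_channel, envelope_channel = [], []
for start in range(0, len(signal) - frame, hop):
    x = signal[start:start + frame] * np.hanning(frame)

    # "Pitch channel": the autocorrelation peak gives the period estimate.
    ac = np.correlate(x, x, mode="full")[frame - 1:]
    lag = 40 + np.argmax(ac[40:400])                  # lags covering roughly 40-400 Hz
    pitch_channel.append(sr / lag)

    # "Phonetic channel": coarse spectral envelope (energy in broad bands).
    spectrum = np.abs(np.fft.rfft(x))
    bands = np.add.reduceat(spectrum, np.arange(0, spectrum.size, 32))
    envelope_channel.append(bands)

print(np.round(pitch_channel[:5], 1))                 # first few F0 estimates, in Hz
```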
IRA FLATOW: That’s fascinating. We know that there are languages, like Chinese, that are tonal, and they’re based on pitch. Does that mean a Chinese speaker may have a different brain map than an English speaker?
EDWARD CHANG: Well, I think that’s a really interesting question, and we’d love to look at that question. It’s something that we’re looking at in terms of ongoing and future experiments.
There’s a really high chance that someone who is used to listening to a tonal language, who’s inexperienced in something like Mandarin, is processing tonal pitch, actually, in the same way. What’s interesting about it is that even though they may be processing it in a similar way in the part of the brain that processes sounds, the way they interpret those sounds may be quite different.
So, in Mandarin, for example, those pitch changes are very important cues that directly change the meaning of the word, whereas, in English, it’s a little bit more nuanced. It changes the meaning primarily in the context of a sentence.
IRA FLATOW: Mm-hmm. What about comparing it to animals? When you have a dog, and you say good dog, bad dog, it sort of knows from the tone of your voice.
EDWARD CHANG: Yeah, I think so. I think that there is a chance that, actually, this may not even be specific to humans. There’s a chance that processing these kinds of changes in a particular sound feature like pitch could actually be something that’s conserved across different species. Yeah.
IRA FLATOW: Let me go to the phones because there are lots of people who’d like to talk about it. Let’s go to Brett in Washington D.C. Hi, Brett.
SPEAKER 1: Hi, Ira. Big fan of your show. My question is, with regard to pitch, is there a difference in how men and women react to pitch, and does one sex hear more pitches than the other?
EDWARD CHANG: So I think that that’s a really interesting question. I just want to say that we didn’t look at how men and women responded to pitch itself. But there is something here of direct relevance to how we hear women’s and men’s voices.
So, in general, a female voice has a higher pitch, and a male voice, in general, has a lower pitch. This just happens to be related to the size of the vocal tract. And that presents sort of an interesting problem: if pitch information is used as a cue to, let’s say, whether the speaker is male or female, how is it that that same cue can also be used for things like intonation– changing the pitch when you’re putting stress on a word?
And that turned out to be a really interesting second finding in the paper: the brain cells that were encoding the information about speech intonation– kind of like when we’re stressing words– didn’t really care about the actual pitch itself, which is what we call the absolute pitch. What they really cared about was the change in pitch.
So they didn’t really care whether it was coming from a female or a male voice. What they really cared about was the pitch change.
IRA FLATOW: That’s interesting. Let me see if I can get one more– let me stay in D.C. and get one more call in from Gabe. Hi, Gabe.
SPEAKER 2: Hi, Ira. Thanks for taking my call. A quick question about people with developmental disabilities. I know autistic people kind of sense things differently. Is pitch something that they also sense differently, or have you even studied that?
EDWARD CHANG: We haven’t studied that in our particular studies, but my colleagues who do study autism and kids with autism– there’s no question that pitch processing, intonation processing, how they perceive these kind of cues are different in that context.
And our hope is that, with this new knowledge, we can think about how one might be able to actually improve someone’s ability to hear these kinds of critical changes in speech.
IRA FLATOW: Here’s a tweet from Annie B., who says, “Growing up, my dad always told us it’s not what you say, but how you say it.” Does that ring true to this case?
EDWARD CHANG: Well, I think in the context of this project, yeah. I mean, that was sort of like our working model and our motto as we thought about the significance of it.
The reality is, actually, as listeners, we use all of these cues. We use the consonants, vowels– of course– and this pitch information as well.
But it is an interesting dimension, and, in some ways, it’s intuitive to many people how important it is because, as you alluded to in the beginning, when you have text, the only way we can really convey this kind of information is by italicizing or using punctuation marks, et cetera. And so it’s an important component of how we communicate.
IRA FLATOW: One last quick question. Now that we all communicate with our fingertips– texting– are we losing all that context?
EDWARD CHANG: We are. It helps to have emoticons and other images along with the text to convey excitement or depression, et cetera. But we are losing some of that.
But, on the other hand, we gain some things from that as well. So I think that language is one of these things that continues to evolve, and we’ll probably see something related to texting soon.
IRA FLATOW: We’ll see you– we’ll have you back to talk about it, Ed.
EDWARD CHANG: Great.
IRA FLATOW: Edward Chang, a neurosurgeon and professor of neurosurgery at the University of California, San Francisco. Thank you for taking time to be with us today.
EDWARD CHANG: Thank you.
Copyright © 2017 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
Alexa Lim was a senior producer for Science Friday. Her favorite stories involve space, sound, and strange animal discoveries.