The Math Behind Big Decision Making
23:23 minutes
What does it mean for your health if a cancer screening is 90% accurate? Or when a lawyer says there’s a 99% chance a defendant is guilty? We encounter numbers in our everyday lives that can influence how we make big decisions, but what do these numbers really tell us?
In his book The Math of Life and Death: 7 Mathematical Principles That Shape Our Lives, mathematical biologist Kit Yates says “mathematics, at its most fundamental, is pattern. Every time you look at the world you are building your own model of the patterns you observe… With every new experience, every piece of sensory information, the models you’ve made of your environment are refined, reconfigured, and rendered ever more detailed and complex.”
He joins Ira to talk about the hidden math principles used in medicine, law, and the media, how these numbers can be misused, and how to interpret them correctly.
Read an excerpt of Yates’ new book!
Kit Yates is author of The Math of Life and Death: 7 Mathematical Principles That Shape Our Lives (Scribner, 2020), a senior lecturer in Mathematical Science, and co-director of the Center for Mathematical Biology at the University of Bath in Bath, England.
IRA FLATOW: This is Science Friday. I'm Ira Flatow. You've probably seen it on one of your favorite crime TV dramas. There's a murder case. A single drop of blood from the accused defendant is found at the scene of the crime. As this blood type is only found in 10% of the population, the prosecutor says there is then a 90% chance that the defendant is guilty. But do these numbers add up?
That example isn't just used on TV juries. It can be found in real court cases. There are all sorts of numbers and stats thrown at us all the time: probabilities, accuracies, how well a drug will work, and algorithms that determine everything from the news we read to Wall Street trading.
My next guest is here to talk about mathematical misreads and misdirections, and how to interpret the story behind all of the numbers. Kit Yates is a senior lecturer in mathematical science at the University of Bath in England. His new book is The Math of Life and Death– Seven Mathematical Principles That Shape Our Lives.
You can read an excerpt of his book on our website at sciencefriday.com/everydaymath. And if you have a question about any stats or math in your everyday life that don’t seem to add up, or how to interpret them, give us a call– 1-844-724-8255, 844-SCI-TALK– or tweet us at @scifri. Welcome to the program, Dr. Yates.
KIT YATES: Hi, Ira. Nice to be on. Thanks for having me.
IRA FLATOW: You're a mathematical biologist, where you study topics like egg patterning. What is mathematical biology?
KIT YATES: Right. It’s a crazy one that most people haven’t heard of, I guess. And I think people find it hard to think that maths and biology can be married up together because, I think, at school, we’re taught that math is really pure and abstract and hard, whereas biology is really messy and real-world, and never the twain shall meet. But actually, when I went to university, I did this amazing course in mathematical biology, and I found out that maths can be used to describe the world around us. It can be used to describe engineering and science and physics, but also biology as well.
And so what I do in my day job is to try to take biological systems that we’re interested in– so maybe plagues of locusts or, as you mentioned, egg patterns. Or, for me, I’m particularly interested in developmental biology– so the way the embryo forms, and what can happen if something goes wrong there. And we try to represent that system using a series of equations or a computer code so that we can do some mathematical experiments which may be unethical to do in an animal, or may be just too difficult or too expensive to do. So we can actually go ahead and do those experiments in the computer, and we can learn something about the system.
IRA FLATOW: Your book goes way beyond biology. And in fact, it talks about how, when we are presented with numbers, most people just take them at face value. It's almost as if we've never been taught how to judge their value.
KIT YATES: Right. Exactly. This is one of the main messages I want to come out of the book: people will use numbers against us. They will manipulate us with statistics. Politicians and newspapers will throw numbers at us. And I think we're a bit too scared to question these people.
What I'm not saying in the book is that you have to be a mathematician, or that you have to go and do a degree in maths. That's absolutely not the case. What you should feel free to do is to start questioning the people who are wielding the numbers and manipulating us with statistics, and to say to them: what does this statistic actually mean? How did you calculate it? Is it the real deal? We should start to fact-check people and to call them out on their numbers.
IRA FLATOW: Yeah. Talk to me about what you think is the most misinterpreted fact or number that gets thrown about– something that– health statistics, for example.
KIT YATES: Yeah. I think in newspapers, it's really common for them to want to ramp up the probability of getting a disease, or how likely it is that a particular lifestyle choice might impact our lives. So I read a story a few years ago in The Sun newspaper in the UK which said that eating a bacon sandwich every day increases the risk of colorectal cancer by 20%. I read this headline and wondered: could it possibly be the case that people who don't eat bacon sandwiches every day have maybe a 5% background risk, but people who do have a 25% risk of getting colorectal cancer?
And when I dug into the story, it turned out that the real statistic is this: of 100 people who don't eat a bacon sandwich every day, five will get colorectal cancer over the course of their lifetime. And of 100 people who do eat a bacon sandwich every day, six will get colorectal cancer over the course of their lifetime. So that's an absolute increased risk of 1%. And that would be the honest way to present it: to say, without the bacon sandwich, it's 5%, and with the bacon sandwich, it's 6%.
But what The Sun had done was to say, well, 1% represents a 20% increase on 5%, so we're going to sell this as a 20% increase in risk. This is called the relative risk. So if you're presented in a newspaper article or a study with just a single, big percentage figure, it's likely that they're giving you the relative risk. And what you really need to dig down and find is the absolute risks, which will usually be two much smaller numbers, with and without the treatment for a disease, or with or without a particular lifestyle choice. That's one that comes up all the time.
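To see how the same underlying numbers produce both headlines, here is a minimal sketch of the arithmetic, using the 5%-versus-6% lifetime-risk figures Yates quotes:

```python
# Absolute vs. relative risk, using the bacon-sandwich figures from the interview.

baseline_risk = 5 / 100  # lifetime colorectal cancer risk without a daily bacon sandwich
exposed_risk = 6 / 100   # lifetime risk with a daily bacon sandwich

absolute_increase = exposed_risk - baseline_risk       # 0.01, i.e. 1 percentage point
relative_increase = absolute_increase / baseline_risk  # 0.20, i.e. "a 20% increase"

print(f"Absolute risk increase: {absolute_increase:.0%}")  # 1%
print(f"Relative risk increase: {relative_increase:.0%}")  # 20%
```

The same 1-percentage-point change reads as a modest absolute increase or a dramatic relative one, depending on which denominator the headline writer chooses.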
IRA FLATOW: I’ve seen that apply to so many medical studies about that risk between one or two occurrences, for everything from statins to all kinds of other things, where people are just– they just don’t know how to read it correctly.
KIT YATES: Right. And the mad thing is, it's not just newspapers that are doing this. Apparently, about a third of the top scientific papers surveyed in one study were doing this thing called mismatched framing: presenting the benefits of a drug using a big figure, the relative risk as a percentage, to make it look good, and then presenting the side effects using the much smaller absolute risks, and not even as a percentage but as a decimal, so that it looks smaller still. So it's not just newspapers; this is happening in some scientific papers as well. We've got to be super aware of this.
IRA FLATOW: So the results can be accurate and yet imprecise at the same time.
KIT YATES: Yeah. Exactly, yeah. I think this is something that we're also struggling to deal with when it comes to things like going for screening. There's a toy problem in the book, a question which was set to German doctors, and it's about screening for breast cancer. These figures are for the UK, but they're similar for the US. The probability that a woman who's over 50 has undiagnosed breast cancer is about 0.4%. So, four in 1,000 women who go to these screenings will have undiagnosed breast cancer. And then if a woman has breast cancer, the probability that she tests positive with this test is 90%. So it sounds pretty accurate. And if she doesn't have breast cancer, the probability that she is correctly told she doesn't have breast cancer is also 90%.
And then the question that was asked of these doctors was: if you're a woman who's gone to a screening and you get a positive mammogram result, and you're told you need to go back for further tests, what's the probability that you actually have breast cancer? They were given five options: 90%, 81%, 50%, 3.5%, and 0.4%. These are the people who are supposed to be able to interpret these results for us, the doctors. And actually, most of them got the answer wrong. I think it's a really surprising answer. Indeed, when I tried the question myself, I got it wrong.
If you actually dig down into the math, you find that the probability of actually having the disease if you get a positive mammogram is only 3 and 1/2%, which is crazy. It's crazy small. And this is a problem that we face with screening: the vast majority of people who go for a screening don't have the disease, so we're testing a lot of people who don't have it. Even with a test that sounds quite accurate, with a 90% accuracy rate, if 10% of the people who don't have the disease, which is, again, the vast majority of people, are told that they do have the disease, that's a huge number of false positives in comparison to a relatively small number of true positives. That explains why the false positives can dramatically outweigh the true positives.
I need to be careful here to say, firstly, that I'm not bashing doctors. I think they do an incredibly difficult job, and to expect them to be on top of all the numbers is difficult. And secondly, I'm not advocating not going for screening. Don't stop going for screening. But what I am saying is, take the results of screening with a pinch of salt, and be aware, if you get a positive result, that it's not necessarily the end of the world. I tell a couple of stories in the book about people who made dramatic choices because they got these letters telling them they had to go back for further testing, and they fretted and worried and really stressed about this. And when they went back for the test, it turned out that it was a false positive, as it was always likely to be. So take these results with a pinch of salt. You can get tests which seem accurate, but are actually quite imprecise.
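For readers who want to see where the 3.5% figure comes from, here is a minimal sketch of the Bayes' theorem calculation, using the numbers quoted above (0.4% prevalence, 90% sensitivity, 90% specificity):

```python
# P(cancer | positive mammogram) from the screening figures in the interview.

prevalence = 0.004   # P(undiagnosed cancer) among women over 50 at screening
sensitivity = 0.90   # P(positive test | cancer)
specificity = 0.90   # P(negative test | no cancer)

true_positive = prevalence * sensitivity               # 0.0036
false_positive = (1 - prevalence) * (1 - specificity)  # 0.0996

# Bayes' theorem: P(cancer | positive) = TP / (TP + FP)
ppv = true_positive / (true_positive + false_positive)
print(f"P(cancer | positive result) = {ppv:.1%}")  # about 3.5%
```

The false-positive mass (0.0996) dwarfs the true-positive mass (0.0036), which is exactly why the answer lands at roughly 3.5% rather than anywhere near 90%.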
IRA FLATOW: Our number, 844-724-8255, lots of interest. Let’s go to Naples, Florida. Sunny Naples. Ted. Hi. Welcome to Science Friday.
TED: Hi. How are you?

IRA FLATOW: Hi there. Go ahead.

TED: I love NPR and I love Science Friday. Your name is Neil, is that right?
IRA FLATOW: No, it’s Ira. Neil hasn’t been around for years.
TED: I know. I’m embarrassed.
IRA FLATOW: Do you have a question, now that you can unembarrass yourself?
TED: Yeah. Well, I have a comment. In 1982, I was identified by a urinalysis test in the army overseas, and 10 years later I was ordered reinstated by a federal judge, because there had been a big blue-ribbon panel, a Surgeon General's review of what to do with these thousands of false positive cases. And I was one of them. Lawyers would not help me, because there's no money and there are no damages you can get from a military enlisted person's drug case.
IRA FLATOW: How do you react to that, Kit?
KIT YATES: Right. Obviously, I can't comment on the case itself; I don't know the details. But interestingly, with these tests, the reason why athletes, for example, have their sample split into an A and a B sample when they get drug tested is so that if something goes wrong with the first one, they can test the second one. And it's a really interesting fact that if you just run a second version of the same test on the same person, assuming the results are independent of each other, you can dramatically improve the precision of that test. You can dramatically weed out the false positives, so that you don't get the same sort of problem of labeling people incorrectly.
So another message in the book is to ask for a second opinion: ask your doctor to tell you where the figures come from and to explain them to you, but also be aware that running a second test can dramatically reduce the rate of false positives. That's why athletes have these A and B samples, so that they can be exonerated if something goes wrong with the first test. So yeah, that's a good strategy.
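To see why an independent second test helps so much, here is a sketch that reuses the earlier screening figures and assumes, as Yates notes, that the two results are independent:

```python
# How a second, independent positive result sharpens the probability estimate.

def posterior(prior: float, sensitivity: float = 0.90, specificity: float = 0.90) -> float:
    """P(condition | positive result), by Bayes' theorem."""
    true_positive = prior * sensitivity
    false_positive = (1 - prior) * (1 - specificity)
    return true_positive / (true_positive + false_positive)

after_one = posterior(0.004)      # ~3.5% after one positive result
after_two = posterior(after_one)  # ~24.5% after a second, independent positive

print(f"After one positive test:  {after_one:.1%}")
print(f"After two positive tests: {after_two:.1%}")
```

Each retest feeds the previous answer back in as the new prior, which is why a confirmatory B sample weeds out so many false positives.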
IRA FLATOW: Let's go to Roger in Missouri. Hi, Roger.
ROGER: Hello.
IRA FLATOW: Hi there. Go ahead.
ROGER: One quick question, and then a real question. Can you predict mathematically the odds that a politician is lying? That’s number one. But the real question is, the idea of this DNA test where they tell your ethnicity– like, oh, you’re 14% Finnish. You’re 3% Cherokee. Isn’t that dependent upon the sample size of the global sample? Are they messing with our numbers by just simply declaring your ethnicity through some percentage?
KIT YATES: Right. So the basic assumption, in terms of politicians, is to assume that 100% of the things they say are lies. But seriously, no, there's no accurate way to determine that. We do have really good fact-checkers now, though, and that's something we need to be doing more of. We just had a general election here in the UK, and after the debates between the leaders, various websites were fact-checking what they had said. We need to not just watch the debate, but watch the follow-up, see what percentage of the things they were saying are true or false, and really read up on these fact-checks.
IRA FLATOW: Before you head into the second question, let me just remind everybody who we are. This is Science Friday from WNYC Studios. And I get a little plug in for your book at the same time.
KIT YATES: Shoot. Go for it, mate.
IRA FLATOW: Talking with Kit Yates, author of The Math of Life and Death– Seven Mathematical Principles That Shape Our Lives. And I love this kind of stuff because we deal with it every day on Science Friday, how to understand the math of the stuff that we’re reading. And the second part of our listener’s question was about these genetic testing services. And in your book, you didn’t have a very good experience with them, did you?
KIT YATES: Right. Yeah, I actually decided it would be a fun thing to send off one of these spit kits to 23andMe and get my DNA profiled. And it came back and said that I had a genetic mutation in a particular gene, the APOE gene– which stands for apolipoprotein E, if you want to know– which basically said that I have an increased chance of getting Alzheimer's. And this worried me quite a lot.
And so I decided I wanted to figure out exactly what these figures meant. So I went and looked at the math that they use, and it turns out there's been a study that looked at the accuracy of the way these companies calculate our disease risks. It turns out that different companies, based on exactly the same genetic profile, will classify you into different risk categories, because they're using slightly different mathematical formulas to calculate the risk. So my conclusion was that, actually, I'm not going to get too worried about this genetic mutation I have, because I'm not sure that I necessarily trust the maths.
In terms of the background and your genetic makeup, where you come from in the world, I also wouldn’t read too much into those. I think the bigger databases get, the more accurate they can be. But at the same time, I wouldn’t take them 100%. I wouldn’t believe them 100%.
IRA FLATOW: You don't think that people are intentionally misdirecting us with numbers to fool us? Or maybe they don't understand the mathematics themselves, which would be worse, wouldn't it?
KIT YATES: Right. I think it's a bit of both, actually. I think sometimes it's conspiracy, and sometimes it's a mess-up. But it depends who you're talking about. I think lots of people are genuinely trying to do their best and just get the maths wrong, but there are definitely people out there who are manipulating numbers– newspapers, for example, as in the example I've already given. Newspapers want to sell copies of their paper. They want to drive traffic to their website. And so the more sensational they can make a statistic, the better.
Similarly, politicians have a vested interest in furthering an agenda. So if they can tweak the numbers to make their agenda look better, they'll absolutely do that. And be aware that there isn't going to be much comeuppance for them. There really are very few slaps on the wrist for politicians who deliberately mislead us with numbers.
IRA FLATOW: Do you think that the pollsters are accurate these days?
KIT YATES: Yes and no. I think people like Nate Silver are doing a really good job on FiveThirtyEight, where they’re taking a whole group of polls and they’re giving them different weights and averaging over them. But polling is really, really difficult to do. Predicting the future is a really hard thing.
There's a nice example in the book of the Literary Digest, when they were predicting an election over in the United States: Roosevelt versus Alf Landon, back in 1936, I think it was. They went out and polled 10 million people, a quarter of the electorate in the United States at the time. And they predicted a massive landslide for Landon, and it turned out that Roosevelt won by the biggest majority since 1820, I think.
And the reason they got it wrong is because they had a biased sample. They'd chosen their sample from lists of people who had telephones and people who could read and write really well. And what they found was that they'd got people who were typically more affluent, and therefore more right-leaning, who went for the Republican candidate, Landon, and not for Roosevelt. So they got the result dramatically wrong despite having this huge sample. Predicting the future is super, super hard.
IRA FLATOW: Well, my future here is a station break, so we have to do that, Kit. Stay with us. We're going to come back and talk more with Kit Yates, author of The Math of Life and Death– Seven Mathematical Principles That Shape Our Lives. We'll be right back after this break.
This is Science Friday. I’m Ira Flatow, talking with Kit Yates, author of the book The Math of Life and Death– Seven Mathematical Principles That Shape Our Lives. And we’ve been talking about ways that math has been used for good and for bad, and where people make mistakes with math. And I think what concerns a lot of people when mistakes happen with math– they’re really worried when it happens in the court system, Kit. Tell us about this interesting case named Sally Yates in your book.
KIT YATES: Yeah. Sally Clark. Yeah. She was–
IRA FLATOW: Oh, it’s Kit Yates. I’m sorry. Sally Clark.
KIT YATES: Don't worry. That's fine. Yeah, Sally Clark. She's no relation of mine. But Sally Clark's case is often called one of the worst miscarriages of justice in UK legal history. She was a mother of two children. Unfortunately, the first child she had died within about six weeks of being born. And then she tried for a second child with her husband, and that child also died. Because those two children died so early on, the police got suspicious, and they arrested both Sally and Steve Clark, her husband. But because Steve wasn't there for the second death, he was let go. Sally was prosecuted for these murders.
And when she came to trial, there was an expert witness called by the prosecution, a guy called Sir Roy Meadow. He came up with a statistic, which was probably the most important piece of evidence the jurors took away with them. He basically said that, if Sally Clark was innocent, the probability that her two children died of sudden infant death syndrome– which is cot death, the possible innocent explanation– was as low as 1 in 73 million, and basically left the jury to assume that the probability of her being guilty was therefore extremely high.
What happened, though, was that he'd actually made a few mathematical mistakes. One of them was called an independence mistake. He'd taken the figure for the probability of having one child die of sudden infant death syndrome– the innocent explanation for her children's deaths– which was about one in 8,543 for a family like the Clarks, who were middle class and affluent. And then he'd said, well, if that's the figure for one child dying, then to get the figure for two children dying, I must just multiply that figure by itself, or square it. And he came up with this figure of 1 in 73 million.
But of course, that makes the assumption that two children dying of sudden infant death syndrome are independent events. And actually, they're not, because there are a variety of factors which mean that, once you've had one child die of sudden infant death syndrome, it's dramatically more likely that a second child will die from the same condition– things like whether you smoke, whether you share a bed with your children, and genetic factors that are linked to sudden infant death syndrome. So he'd made this mistake of assuming independence, and he'd come up with a probability for the innocent explanation that was far lower than it should have been. That was one of the most significant mistakes he made.
But one of the other mistakes is actually so common in courtrooms that it has its own name: the prosecutor's fallacy. The idea is that it starts by saying, if the suspect is innocent, seeing a particular piece of evidence is extremely unlikely. So if Sally was innocent of killing her two children, them dying of sudden infant death syndrome is extremely unlikely. That was Meadow's premise. And the prosecutor then deduces, incorrectly, that the alternative explanation, the guilt of the suspect, is therefore extremely likely indeed.
But what the argument neglects to take into account is any possible alternative explanation in which the suspect is innocent– Sally Clark's children dying of natural causes, for example– and also the possibility that the explanation the prosecution is proposing, which is murder, is just as unlikely, if not more so. The frequency of double murders is far lower than the frequency of double sudden infant death syndrome. When you weigh those up, it paints a very different picture of the probability of Sally Clark's guilt. But the jury was just led to believe that the probability of her being innocent was as low as 1 in 73 million, which wasn't the case.
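The correct comparison can be sketched in a few lines. The figures below are purely illustrative, not the real case statistics; the point is only that when both competing explanations are rare, what matters is their relative size:

```python
# The prosecutor's-fallacy correction: compare the competing explanations.
# These probabilities are hypothetical, chosen only to illustrate the logic.

p_double_sids = 1 / 100_000      # hypothetical chance of two SIDS deaths in one family
p_double_murder = 1 / 1_000_000  # hypothetical chance of a double infant murder

# Given that two deaths occurred, and treating these as the only explanations,
# the probability of innocence is the relative weight of the SIDS explanation.
p_innocent = p_double_sids / (p_double_sids + p_double_murder)
print(f"P(innocent | two deaths) = {p_innocent:.0%}")  # about 91% with these numbers
```

Even though "one in 100,000" sounds damning on its own, if the prosecution's explanation is ten times rarer still, the innocent explanation remains overwhelmingly the more probable one.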
IRA FLATOW: And that's the problem with having what you call a binary answer to things: either black or white, yes or no, guilty or not guilty.
KIT YATES: Right. I think so. Binary is obviously the system we use in our computers, and it's great for computers because they work on binary logic. You can run a little current through a transistor, and it can give you a yes-or-no answer. But when it comes to human affairs, binary answers are not that useful. Humans aren't black or white. We like to have goodies and baddies, but some of our favorite literary characters are actually morally ambiguous– people like Severus Snape or Hamlet, people who are both good and bad. And everyone has a little bit of that in them. So trying to characterize people as good or bad, one thing or the other, is not particularly helpful. Binary isn't a particularly good number system for us to use in terms of human affairs.
IRA FLATOW: There are all kinds of great number systems in Kit Yates' book. I didn't even get into base 12, which you talk about. All kinds of really interesting, great stuff in this book, The Math of Life and Death– Seven Mathematical Principles That Shape Our Lives. We have an excerpt on our website at sciencefriday.com/everydaymath. Kit, thank you for taking time to be with us today. Great book.
KIT YATES: Oh, it’s been absolutely a pleasure. Thanks for having me.
IRA FLATOW: You’re welcome.
Copyright © 2020 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/
Alexa Lim was a senior producer for Science Friday. Her favorite stories involve space, sound, and strange animal discoveries.
Ira Flatow is the founder and host of Science Friday. His green thumb has revived many an office plant at death’s door.