In their recent episode of the VALUE: After Hours Podcast, Jake Taylor, Bill Brewster, and Tobias Carlisle discussed Fingerprinting & Bias Blind Spot In Investing. Here’s an excerpt from the episode:
Jake: Yeah. It’s good to rub your nose in it. So, fingerprinting, which may seem a little odd to start out with, but let’s open the scene on a commuter train in Madrid, 2004, and all these explosions go off from a series of bombs that were planted. 192 people killed, 2,000 injured. Some fingerprints are found on a plastic bag at the crime scene and sent to Interpol. The FBI lab jumps into high gear and starts processing it. They find conclusively that the fingerprints belong to this guy named Brandon Mayfield. He’s a former officer in the army. He married an Egyptian woman and converted to Islam, and now he’s a lawyer who represents men who have been charged with attempting to travel to Afghanistan to join the Taliban.
FBI, they sit on– He’s been on the FBI watchlist for a while already. They bug his house, they tap his phones. They eventually arrest him, and they hold him for two weeks trying to figure out if it was him, but he’s never formally charged and they eventually let him go. During the two weeks he was held, it turns out that the Spanish investigators find a matching print from another suspect that they have. So, how did the FBI arrest a guy who was living in Oregon and hadn’t left the country in a decade, for having matching fingerprints on bomb materials found in Madrid? Well–
Tobias: Can I have a guess? It’s his twin. It’s an identical twin.
Jake: [laughs] Yeah.
Tobias: He posted a bag. He posted something in the bag. I don’t know.
Jake: I wish there was that much intrigue to it. This is actually just purely human error.
Tobias: They mix it up in the lab?
Jake: No, not quite. But let’s rewind a little bit and go back to some of the history of fingerprinting, which as an identification technique was really formalized in the late 19th century by the Scottish physician Henry Faulds. So, a little bit of terminology here. When they find fingerprints at the scene, they’re called latent fingerprints. They’re left at the scene, and oftentimes they’re overlapping, they’re smudged, they’re only partial. So, there’s actually a fair amount of subjectivity to this exercise. Although when we think about it, we think, “Oh, God, this is hard science. We’re matching fingerprints. There’s never any mistakes here.”
Then when they actually collect fingerprints in a controlled setting, whether it’s with ink or, let’s say, you clear customs and you put your hand on the little scanner, those are called exemplar prints. So, they match the exemplar prints with latent prints to see: is this a definitive match and an identification? Is it a potential match, or can we rule this out? Is it inconclusive, or is it definitely not that? It seems invaluable, especially when we compare it to eyewitness testimony, which is notoriously terrible.
So, this neuroscientist named [unintelligible 00:25:39], I think is how it’s pronounced. He wondered if there might be some noise in these judgments, because with latent prints, there’s some subjectivity to it. What he did was have these fingerprint experts agree to, at some point in the next five years, be fed prints that they had already at one point identified, to see whether they even match their own earlier conclusions, which is a good way to tell if there’s noise in a system, as you get a second reading using the same person.
But he added a twist, and some of the people were given some information about this second set of prints, like, “Suspect has an alibi,” or, “Detectives believe that this person is guilty,” or, “This person confessed to the crime already.” So, they’re given some confirmation one way or the other at the beginning of the process, and it turns out that it hugely altered their decisions. The first study found that like four in five altered their decision based on this nudge from some initial confirmatory data. A second study found that 4 in 24 decisions were overturned, like basically contradicted. One in six is not hard science, necessarily, as much as I think maybe we all believe that it is.
What’s going on there? This probably deserves its own veggie segment at some point, but there’s this predictive processing that happens in the brain. From the top down, you have a model of what you’re expecting to see, and from the bottom up, your senses are feeding up information in the form of electrical signals, and those two things are being matched in a neural network that is your brain, which is figuring out, “Is there surprise or not? Does the model of what I think it’s going to be match with what I’m being fed?” What ends up happening is that the examiners literally don’t see these little nuances in the latent prints, because the top-down part of the brain that is filtering the model is already looking for the answer that it’s expecting. So, this is how confirmation messes with our actual view of our sensory input.
In the Mayfield case, there were three experts who got this wrong. The first one who looked at it was really impressed by the power of the automated system that found correlations between his print and the latent print that was taken in Spain. So, there’s the first little red flag for us: computer systems, if we believe in them too much, might throw us off and make us close our minds a little bit. The second person who looked at it knew that the first person, who was like a very well-respected supervisor, found it to be a match. They were much less likely to come in and say, “No, this guy is wrong,” because it’s like their boss, practically. Then, a third independent person looked at it and was also told that it had been confirmed, and was just verifying it. So, you had three experts who all had too much information at the beginning of their assessment, basically.
Now, lest you think that they don’t understand this, they do understand this, but we have a bias blind spot. So, 71% of forensic experts agree that bias is a potential concern in their jobs. 26% of them believe their own judgment could potentially be influenced. So, we always think it’s everyone else. It’s not us. We’re the ones who are not biased by this stuff. So, really, what’s important is that the sequencing of information can have a huge impact on the premature intuitions that we draw from that information. In the science, they call it linear sequential unmasking: only showing data to someone when it’s the right time. If you have too much information before you actually go do your own work, that can totally mess up and bias how you interpret the data.
If you really want to have truly the wisdom of the crowds emerge and have a group work on a project and try to come up with the right decision, you have to be very careful about the order of information that those people receive. The second person really should not know what the first person found to be true because it will wildly impact how they will interpret the actual new data that they’re looking at.
Jake: You think about this in the investment context: if you work in groups on investment ideas, and you start out with anything more than just the ticker basically, and someone gives you their pitch, their version of it, you are almost already compromised, similar to those fingerprint examiners. And I would say fingerprinting has much less smudge to it than the investment process potentially, and much less potential for noise to creep in. I think you have to be very, very careful about the sequence of information that is delivered to you, and when you actually do your own work and then compare notes with other people.
Yeah, that’s trying to take a little something from the hard science of fingerprinting and showing that we have to be very careful how we structure our investment processes.
Tobias: This automated system, does it scan the fingerprint–
Jake: Against the database.
Tobias: Does somebody have to enter the fingerprint in by saying, “Here’s a whorl, here’s a whirl,” or whatever it is, whatever the other thing is?
Jake: Well, I guess I don’t really know, but I would imagine that it’s just like, “Here’s a picture of the fingerprint we got off the plastic bag, run it against the FBI database of known potential matches, and then show me the ones with the highest correlations,” or whatever.
Tobias: Because it’s potentially– if it’s a computer looking at it and a computer making its own decisions about what’s important and then comparing it to another, I would have thought that’s reasonably unbiased. But if you have somebody entering it, and they have to go and manually identify all of the identifying features–
Jake: Like match these points kind of–
Tobias: Yeah. Then you’d have two that were just entered in randomly getting the same answer, or–
Jake: I don’t know the exact procedure of how the computer works.
Tobias: Yeah, but you’re certainly biased by hearing this has been confirmed twice. We just need you to [crosstalk]
Jake: Yeah, just check this out, and you’re the independent third-party assessment.
Tobias: Yes, one of the biggest problems in investing. As soon as you hear somebody else’s pitch, it’s very hard. You get told a company is a great company all the time, very hard to go and look at that and say– It’s why I like spending a lot of time looking at what the shorts are doing because I think the shorts, at least they’re skeptical. They’ve got their own biases. There’s no question they have their own problems, but at least it’s a different perspective.
Jake: Yeah, that’s true.
Bill: I don’t know.
Tobias: [laughs] You don’t know? All right. Proceed.
Bill: I don’t know that I buy that shorts are less skeptical. I don’t know, less biased. [crosstalk]
Tobias: They are looking at it from a different perspective. They can be biased [crosstalk]
Bill: I guess, man. I’ve read short reports that I think are garbage.
Bill: [crosstalk] Yeah. I guess it depends on who. [crosstalk]
Tobias: If you had a team of analysts and you’ve identified this idea: you guys go and tell me why it’s a good idea; you guys go tell me why it’s a bad idea.
Jake: Yeah, a red team, they call that.
Tobias: Your confirmation bias is to find the bad stuff. That’s what’s helpful.
Bill: Yeah, I’ll buy that.
Jake: But even then, you have to be very careful about how much information each person has before starting their independent assessment.
Bill: Here’s my beef. Fundamentally, I just don’t really believe that you should do work in a vacuum. At my core, I don’t think anybody that’s great has sat behind a desk, not talked to anybody, and figured stuff out. Whether it’s Buffett going to Geico, and I don’t think Buffett goes to Geico if it’s not for Graham, or it’s Buffett hiring investigators to go out and find things out about people, I just don’t think I believe in doing things alone.
Jake: No, it doesn’t have to be alone.
Bill: I know. The source of the information, how you find it, theoretically, I should start to A and go to Z, and only read 10-Ks, and peel back the onion in a way. I just don’t know that I buy it. That’s fine.
You can find out more about the VALUE: After Hours Podcast here – VALUE: After Hours Podcast. You can also listen to the podcast on your favorite podcast platforms.
For all the latest news and podcasts, join our free newsletter here.
Don’t forget to check out our FREE Large Cap 1000 – Stock Screener, here at The Acquirer’s Multiple.