During their recent episode of the VALUE: After Hours Podcast, Taylor, Brewster, and Carlisle discussed Lessons From Nuclear Meltdowns. Here’s an excerpt from the episode:
Jake: Okay. So, this is lessons from nuclear disasters. And as you guys know, my background before was running the power grid. So, I have a lot of fun doing these kinds of little research projects. I feel like I’m bringing my worlds together. So, a lot of this is based on a really good book called Meltdown. That’s by Chris Clearfield and András Tilcsik, I’m sure I’m saying that wrong, I apologize. Which came as an original book recommendation from Caffeinated Investors. So, shout out to him for this. Thank you. So, 19– [freeze]
Tobias: Into the Jaketrix.
Bill: [crosstalk] Jaketrix.
Tobias: He’s back.
Jake: Oh, sorry.
Tobias: Just from the date.
Bill: [crosstalk] talking about nuclear energy.
Jake: I didn’t say anything after that. So, it’s 1979, Harrisburg, Pennsylvania, and this little thing called Three Mile Island, you guys might have heard of it. So, it started out as just a simple plumbing accident, nothing special about it. And it turned into a huge disaster, but the causes of it were very trivial. There was no earthquake or engineering mistake. It was a combination of small failures. So, you had this little plumbing glitch, you had a failure of pumps to send water to the steam generator, which increased the pressure in the reactor, which then led to the opening of a pressure relief valve that failed to close. And then, there was a misleading indicator that told the operators the valve had closed when it was actually stuck open.
Tobias: So, it’s a sequence of– you need this very precise sequence of things to go wrong. But it is possible for all those things to go wrong. They’re low-probability events, but they all go wrong at the same time?
Jake: That’s right. All of that happened in the amount of time that it took me to read that. It was 13 seconds.
Jake: Right. In less than 10 minutes, all of the damage to the reactor core was already done. The failure was driven by the connections between the pieces as much as the pieces themselves. And a cup full of nonradioactive water led to the release of 1000 liters of radioactive coolant, which ended up– people ended up getting cancer, there’s a whole long tail of problems from this. This researcher named Charles Perrow studied it, and he calls these normal accidents.
What he means by that is that they are bound to happen, because these little small failures are going to happen, and when we increase the tightness of the coupling in our systems, we end up with a cascade of failures because of that. So, the complexity increases the chances that something will go wrong, and then the coupling increases the chances of cascading errors after that.
And so, he draws this axis. On one axis is increasing complexity. And then, the other way is tighter coupling. And up in the upper right-hand quadrant where you have very tight coupling and high complexity is where you have the biggest meltdowns. And Perrow said that the financial system exceeds the complexity of any nuclear plant that he’s ever seen. We have a very complex system that is increasing in complexity along with a tighter and tighter coupled system.
What’s difficult is that you can’t necessarily see– you can’t find all the problems just by thinking about them, because of this complexity. There are interactions that are so weird in complex systems that you’ll never be able to predict them. And then, you can’t predict the chain of errors that’s going to happen because of the tight coupling. With those two parameters in mind, you get to thinking about how much we are courting this with just-in-time supply chains, the increasingly tighter and tighter coupling.
We have the internet of things, where now everything is increasingly interconnected, which is a tighter coupling. Algos for everything, which I think adds a layer of complexity, because one of the things is– transparency is a great antidote to complexity. When you can’t actually see the system that you’re looking at and get a status on it, it becomes much harder to diagnose problems. When everything is a black box in an algorithm and we don’t know why it’s doing anything, we’re just really increasing the complexity of everything in the world.
Bill: Can I just real quick, please, take the devil’s advocate part of that? There’s an argument to be made that through a lot of these social media platforms, you’re actually reducing the black box and pulling back the curtain now. It’s got its other problems, but I think that is what some people would say. So, I figured I’d bring it up. Also, do you even [unintelligible 00:35:21], bro? [laughter]
Jake: I would say in that social media instance, perhaps you are adding transparency, which might reduce complexity, but you’re also greatly increasing coupling. Systems are tied more and more together, people are tied more and more together. Ideas can transmit faster through that network now.
Bill: The groupthink of it all scares me, but that’s a separate issue.
Jake: Yeah, and I think one of the things too they found is that homogeneity will greatly increase coupling.
Bill: Huh. Yeah, that makes sense.
Jake: [crosstalk] –when everyone has a very homogenous idea, it increases the coupling of our markets.
Bill: Dude, if indexing just ends up really bad, can we all just agree that– I don’t know how it went bad, but if everybody is just putting their money into a vehicle that buys stuff at higher and higher prices and no one is paying any attention, if that ends up bad, it’s not that unforeseeable somehow.
Tobias: But that’ll be the best thing that happens to the market. That’ll be the best thing that ever happens to us fundamental value guys.
Bill: Well, dude, people will get destroyed. [crosstalk]
Tobias: Yes, that part of it.
Bill: I’d feel bad for the people that were told this is the right thing to do. And then, once again, they feel screwed.
Tobias: Yeah, well, that’s fair.
Jake: So, that’s another thing, is that trust in systems creates tighter coupling because people stop doing their own work and their own calculations, and they just trust that the system is doing what it’s supposed to be doing.
Tobias: I do have some questions.
Jake: We have another meltdown to go to. Japan’s northeast coast, and there’s this tiny village called Aneyoshi, I believe, and in that village, there’s this stone tablet. And on it, it says, “Dwellings built on high ground will ensure the peace and happiness of our descendants. Remember the calamity of the great tsunami. Do not build homes below this point.” And it’s from the 1930s. All along the coastline in different towns, there are these stone tablets set in the ground that say, “Don’t build below this point.”
This was where the water came up to in 1870, or whatever it was. Of course, with time, people sort of forget about this, they haven’t seen water there, and what do they do? They start building down in the lower areas, and they forget– they have to go relearn the lessons. But, of course, in 2011, the Tohoku earthquake, a 9.0 earthquake off the coast of Japan, creates a tsunami, and that leads to the Fukushima meltdown disaster.
What can we think about like– there are a lot of old lessons and rules of thumb that we could consider the tablets, or the stones put in the ground. So, you think about somebody like Walter Schloss. He would probably have a rule that said, like, “I don’t pay more than 10 times P/E. And it doesn’t matter what other good things are happening with this business, I’m just never going to pay more than 10 times P/E.”
He doesn’t go down into the valley any lower than that point. And there are lots of time periods where you’ve built your house there and you’re getting by with it, no problem. But then, every once in a while, there’s this long tail event that happens that will completely destroy your house if you don’t pay attention to the longer term, the tough lessons that your ancestors learned and tried to warn you about. And I think we ignore a lot of that, especially in times like this where you buy things that feel good and you’re not as conscious about price. I would just say maybe look for your–
Bill: I thought you were going to talk about value managers not buying compounders in 2015. Sowing the seeds of five years of pain.
Jake: I guess the answer is that you need to find your own sort of tablets and put them in the ground for yourself, so that you don’t go building your house in places that have historically been wiped out in previous tsunamis. When it doesn’t go well for a lot of these people, there’s not going to be any excuse– you could have seen it coming. That is the real answer. If you did a little bit of work on history, you could see it coming. If you overpay for things, it typically, eventually, will not work out for you. So, anyway, that’s what I got for today.
Bill: The people I worry about the most are the people that are coming into Twitter, seeing the accounts that have done really well, and thinking that’s the strategy that they should maybe adopt without– Like I said, I actually think David Gardner’s approach is pretty friggin’ smart. I get it. But you’ve got to be him to implement it, and he implemented it early. He doesn’t implement it at the end. That’s the part that I worry about for some people, but everybody’s got to get intuition somehow.
You can find out more about the VALUE: After Hours Podcast here – VALUE: After Hours Podcast. You can also listen to the podcast on your favorite podcast platforms here:
For all the latest news and podcasts, join our free newsletter here.
Don’t forget to check out our FREE Large Cap 1000 – Stock Screener, here at The Acquirer’s Multiple: