That and the book contains a lot of notes. Apparently. That in and of itself isn't necessarily a bad thing. But, as I read it on my Kindle, I have the little % completed bar running along the bottom. I was at 52% when I read "Let us tell you one more story..." and I thought "Shit, there is no way this last story takes up the other 48%." Surprise, it only took up another 5%. That is 43% notes, acknowledgements, and footnotes. So yeah. Made me happy I bought this on sale.
Let's focus on the good things. Because I did like the book and I did just pick up When to Rob a Bank by the duo (on sale, of course).
Unlike the previous two books, which focused on a number of relationships between seemingly unrelated topics (sumo wrestlers and Chicago schoolteachers, for example), this time around the book is more of a how-to guide, explaining how to approach problems "like a freak" to get to the bottom of these complicated questions. As with the other books, the main thesis is that you shouldn't trust conventional wisdom** just because it's conventional, and that everyone responds to incentives, so figuring out those incentives can help solve (or at least identify) the problem. And they do still tie together seemingly unrelated people; one section is titled "What Do King Solomon and David Lee Roth Have in Common?"
One of the parts that stuck with me the most came from their chapter about quitting (and why it can be GREAT!): performing premortems.
Many institutions already conduct a postmortem on failed projects, hoping to learn exactly what killed the patient. A premortem tries to find out what might go wrong before it's too late. The idea of listing out all the ways a person (or a project or an idea) could fail sounds like a bad idea. You could talk yourself out of it and then what? Better to focus on what will go right. And while I agree you could talk yourself out of whatever it is, I still think the benefits could outweigh the costs, because you plan for possible scenarios in which everything could go to shit. If you go through those, then when anything bad does happen, you already know what to do. Or you can actually put systems in place so the bad stuff doesn't have a chance to happen. Yes, this can take time and make people uncomfortable, but given the example they use, it seems like more than a fair trade-off.
This section talks about the Challenger disaster. Essentially, a group of experts had already suggested that the O-rings would fail, which was exactly what caused the explosion. In this case, the decision was made to go forward, which ended up being tragic, but that's what led this group to start planning these "premortems" and, hopefully, to actually listen to the results.
Overall I recommend the book. Or the podcasts. Or both. Just know that you'll be hearing a lot of the same stuff. Good stuff, but the same regardless.
**If you're looking for other "Everything you know is wrong" type stuff, might I suggest Adam Ruins Everything, which is one of my new favorite things.
Title quote from page 51, location 797
Dubner, Stephen and Steven Levitt. Think Like A Freak: The Authors of Freakonomics Offer to Retrain Your Brain. HarperCollins, 2014. Kindle edition.