After you make an error, you should investigate what caused the error, and what changes you should make to prevent other errors due to the same underlying cause. This is called a postmortem (the root words mean “after death”).
Sometimes postmortems are quick and easy. That’s fine. They don’t have to be a big deal. Other times, the cause of an error is hard to understand or fix.
The errors we find out about are evidence about the problems in our heads. We all have important thinking problems that we aren’t aware of. Every time we discover an error, there’s a chance that investigating it will help us understand an important thinking error.
If you just make a straightforward correction – correct the error you made in this one case, but not in any broader category or pattern of cases – then you’re missing out on a potential opportunity for significant improvement. If you rarely or never postmortem your errors, then you shouldn’t expect to make much intellectual progress. Investigating the causes of errors is one of the main ways to become a better thinker.
Many errors come from some kind of thinking process, method, policy, habit, or pattern that you use repeatedly. And we’re unaware of the majority of our errors. We get a limited amount of data – in the form of known errors where we actually learn we were wrong – to help us figure out the underlying errors in how we think that cause many mistakes. These data points are valuable. We should give them attention instead of trying to deal with the error and move on as fast as possible.
What if you find out about few errors? What if you don’t get many chances to do postmortems? Then you should put extra effort into the postmortem opportunities you do get. You should also consider spending more time critically reviewing your ideas and seeking out more criticism. What could you do to expose your ideas to criticism more? There are options that can help, like writing a blog or participating at a discussion forum. You could also reconsider some of the things you spend time on that don’t provide good critical feedback.
What if you find out about many errors and don’t have time to postmortem them all? That is a sign that you have some flawed underlying thought processes that keep producing an ongoing stream of errors. You don’t need to postmortem everything. For example, it’d be reasonable to work on one postmortem at a time, and start a new one as soon as the previous one ends. In this case, you should get involved in fewer discussions and share fewer ideas for feedback – you don’t need more criticism, so don’t ask for it. Sharing ideas from your postmortem investigation for critical feedback could be useful, but you don’t really need other criticism if you already have plenty of errors to investigate.
It helps to avoid a large recency bias. If you find out about five errors and only investigate one, you can go back to one of the other errors afterwards. You don’t have to focus only on new or recent errors. If you set aside some criticism from a week or a month ago, you don’t really need more criticism now. Even criticism from years ago can still be relevant if you haven’t made a bunch of relevant changes since then.
If you make improvements due to your postmortem investigations, that will often address multiple pieces of criticism. When you fix an underlying issue that was causing many errors, then you’re dealing with many past criticisms. Reviewing past criticisms that you hadn’t gotten to can help you determine if you’re making effective changes. If you recognize them as part of a pattern of errors that you were making, but have fixed, then that’s a good sign. When this is working well, you should see errors fitting the pattern in your past but not in your present.
Biases are one example of an underlying cause. Suppose you make a mistake due to a bias, and then you fix that one mistake while making no changes to the bias. That’s not very good, is it? The same bias will keep creating more mistakes. If you keep fixing them one by one, you’re never going to win; the bias can cause an unlimited number of mistakes. People also have other underlying problems besides biases, such as conceptual misconceptions, which can cause many mistakes over time.
A more specific example would be having a political bias, then reading news stories from your side and believing them even when they’re incorrect. Both main US political sides have information sources that put out many incorrect pieces of information. Suppose you keep doing this and keep being corrected on each piece of misinformation individually (or more realistically, on fewer than ten percent of them – you should only expect partial visibility into the pattern of errors). Is that good? Are you correcting your errors if, when someone links you to a persuasive fact check, you change your mind about that specific case? Or is it important to look into the underlying issue?
Most important error patterns are much more subtle than this, so data points about them are more valuable. When you can’t quickly guess what kind of underlying cause is behind an error, it’s more likely to be caused by something important that isn’t so well known.
It’s also important to have high standards. People often do a superficial postmortem investigation, come up with some kind of explanation of the error, and then want to move on. Their explanation is usually either a well known idea or else a denial that there’s anything important going on – a claim that they made a random error that isn’t due to some meaningful pattern. You should aim to actually understand what’s going on in more depth and fix it in a lasting, effective way. This can be hard and take a long time, but it’s efficient overall. Keep doing it and you’ll get better at thinking – it’ll save time and make you more successful in the long run. Not fixing your errors in a serious way means you keep making them, which takes up a lot of time and energy, in addition to making you fail at your goals.
Some people are too eager to update and move on without substantial investigation or discussion of their error. They just “fix” it in one minute and then are done. These fixes are usually errors too.
Mainstream culture is pretty hostile to errors. Making a mistake is seen as weak and bad. Admitting you made a mistake is seen as extra weak and bad. Admissions are fairly uncommon and indicate your mistake was so bad that you couldn’t keep hiding or denying it (as people usually do with most of their errors). These attitudes are irrational.
Some rationality-oriented groups, like Effective Altruism, have a better attitude to errors. Admitting you were wrong is sometimes OK and isn’t punished so much on average. Sometimes it’s praised and people who concede errors actually come out ahead socially. Similarly, Karl Popper fans talk about letting your ideas die in your place. Don’t be too attached to your ideas or make them part of your identity. You can decide they’re wrong without feeling personally threatened. Ideas can be rejected without rejecting your core self. You can focus more on being good at error correction than on believing you’re already right about other issues.
These attitudes help push back against irrational, anti-fallibilist mainstream hostility to mistakes. But they don’t encourage postmorteming. They don’t say that many mistakes are part of an underlying pattern of mistakes, and we should be trying to figure out and fix those underlying patterns, and each data point we get can help us identify a pattern.
If you deal with criticism by wanting to close the issue and move on fast – if you aren’t comfortable knowing about a mistake you made and not yet considering it fixed – then you have some hostility to criticism which is getting in the way of making progress. Mistakes aren’t something we should need to try to get rid of immediately in order to feel better. We shouldn’t find them too upsetting to investigate. We shouldn’t get caught up in unproductive, negative thoughts that keep us from productively investigating our mistakes for longer amounts of time. We shouldn’t mind having open investigations or rush to close our investigations. We shouldn’t be bothered by mistakes in a way that makes us need immediate resolution (and therefore prevents more substantive investigations).
Saying mistakes are fine, without strongly encouraging postmortems, can be harmful. Mistakes are not exactly fine. They shouldn’t be shameful, but they shouldn’t just be accepted without fixing the problem. Something is going wrong when a mistake is made. Being mean about mistakes is bad, but being nice about mistakes without fixing the underlying problems is problematic too. Mistakes should generally be seen as a fairly big deal, usually requiring substantial introspection and changes to fix, rather than as superficial, individual, small issues that aren’t part of patterns. Most mistakes come from underlying causes which cause many other mistakes. And when something causes 100 mistakes, we often only know about 0-5 of those mistakes, and are ignorant of the rest, so it’s important to use the evidence we have instead of being dismissive of it.
Fixing mistakes in the right way usually involves coming up with some kind of new policy, method or habit and then practicing it until it becomes intuitive and replaces the old, faulty mental processes you had before. If you just change your mind about the specific issue you were mistaken about, and nothing else, you and others should be skeptical that you’ve actually dealt with the mistake effectively. Mistakes usually involve your subconscious mind, so correcting mistakes requires taking actions that affect your subconscious mind, so just consciously changing your mind isn’t enough. The most typical strategy for changing your subconscious mind is practice.