Suppose that you’re a high status, popular intellectual. (Most of this will also apply if your status or popularity is low or medium. It has less relevance for non-intellectuals – people who aren’t interested in ideas, rationality or truth-seeking. But you don’t need any credentials to count as an intellectual; it just depends on your interests and actions.)
If you’re wrong about something important, and a smart person knows it and is happy to help for free, what is the most reasonable series of actions he could take which results in you changing your mind? Assume he has no social network, no social media followers, no impressive credentials, and no social status. Assume he lives far away, but he’s fluent in your primary language and uses the internet.
What is the best series of actions for you to be corrected? (Writing out several reasonable ones is a good idea, rather than just the best one. And I suggest people actually write down answers. This is not just a rhetorical question.)
Does your plan rely on him saying something that goes viral? Does it rely on Reddit voters (or Twitter retweeters, or anything similar) seeing value in his idea? Does it rely on him saying something in your blog comments and you recognizing the value in it? (Most people and ideas are ignored in blog comments, so he might not even try.)
What if you initially disagree with his great idea and think it’s dumb, but you’re wrong? What if almost everyone doesn’t already understand it and wouldn’t recognize the value right away? But what if the idea would win a debate, if anyone actually bothered to debate it? Is there a series of debates it could win, starting at the bottom and working its way up to you?
Do you have an organized way to address questions or criticisms? If you’re busy and popular, what about a way for your proxies to do it? Or just people you like or agree with? Is anyone on your side (anyone with similar ideas to you) open to debate, questions or criticisms, so that a low social status person, with ideas that initially sound bad, can win debates and earn attention, and your errors can be corrected? Is that a thing that can realistically happen?
I’ll readily grant that the majority of people who attempt to debate their way up would be wrong. The majority of stuff that gets ignored, filtered out, downvoted, etc., is bad, low quality, unimportant, etc. There are more ways to be wrong than right. The majority of deviations from existing knowledge are worse, not better. But some new or different ideas are good. And there are good ideas that don’t come from the ingroup of high status people with credentials or popularity.
There’s also the major problem of high status people from a different clique with something important to say to you but no good way to get the information to you. Yeah they can get a little attention so you hear a summary of their idea or maybe even read one article. But it’ll often take some study, effort, attention, debate, questions, etc., before you would change your mind. And high status people in other social structures don’t have good ways to get you to participate in that. They’re seen as threats, enemies, etc. You may well be more biased against that person and his ideas than against a person who is low status. A low status, lone thinker is less threatening than a leader from a rival tribe.
Does your plan – does the series of steps by which you get corrected – rely on the smart thinker, with the good idea, first changing a bunch of other people’s minds? And then later, after his idea becomes popular, you’ll consider it?
You won’t listen first and be a very early adopter of his better idea, but you’re willing to be a semi-early adopter of the idea once a hundred people in your social group have already picked it up? But if you won’t go first, who will? Does your social group have earlier adopters who will go first? Who are they? How can the smart person find and speak to them? Are they labelled in a clear, public way which is visible and understandable to outgroup members? And is there any clear path from those earliest adopters changing their minds to other people, like you, changing your mind? And do good ideas actually propagate reliably within your social group? What if the idea is hard to understand, so the early adopters are like “this seems pretty good” but they can’t relay it to you well enough, because they haven’t learned it well enough to teach it to you? Will you listen to the outsider, who says some things that clash with your culture and sound dumb or annoying to you, just because some of your group’s early adopters thought it had merit, even though they didn’t translate all of it into culturally acceptable speech that’s easier for you to listen to?
Does your group have consistent early adopters who are accessible to the outgroup and filter ideas for you? Do you know who they are and pay attention to them? What if they are making some mistakes and filtering out some categories of good ideas? What if they have some systematic biases? Why do you trust them?
Does your group have no specific individuals with this role, and instead it’s a decentralized group effort? For any particular new outsider idea, there’s no predictability about who in your group might find it, understand it, and spread it? For the last several ideas that this happened with, was it different people each time? That approach has upsides but also downsides. Since no one has the role of dealing with new ideas from outsiders, some may be ignored by everyone in your group. (By ignoring, I mean both not being aware of an idea at all, and taking a quick look and forming an initial impression that it’s bad, without going into the kind of detail or debate necessary to actually find out how good it is and be corrected if you’re wrong.)
If no one in your group takes responsibility for addressing criticism and questions about your group’s ideas, I don’t think that can be fixed by having many people sporadically do it when they individually want to. If no one is doing it systematically, that basically means each person will do it when it looks promising to him. So there will be a systematic bias against outgroup ideas which do not appear promising initially to people with your group’s biases.
Could it be that there are some people in other groups (or independent/unassociated) who are open to debate about some matters, interested in debate, and correct – and no one in your group will debate them?
Do you view your group as an organized group? Do you look at its structure and how it operates? Or do you think of yourself as a lone individual who is only loosely associated with several tribes? That’s fine in many ways, but with no clear group roles you need to do more things yourself, since you can’t rely on others to do them. If no one else has the same views as you and no one can function as your proxy in a debate, then you need to be much more open to debate yourself, personally. If you don’t join a group that handles various things for you, and want to handle stuff independently yourself, then you better have a plan to handle it. If no one else is putting out FAQs or addressing critics for you, because you want to be independent and have your own ideas, then you need to figure out how to do that stuff. If that’s too hard or resource intensive, then consider joining a group with resources and other people to help. If you can’t find any decent group with lots of resources and good organization, fine, no problem – I can’t either – but say that openly and visibly, and transparently do your best with the resources you have. And you better be open to debate and new ideas, since you’re going it alone with your own personal, unique ideas that could easily have tons of errors. And you should write some criticisms of some groups you didn’t join. Share some analysis of how they’re closed to ideas. Explain your choice to reject existing groups.
Ambiguous, Partial Group Membership
I think where tons of people go wrong is by trying to have it both ways without clarity about what’s going on. Partly, they are loosely associated with several groups/tribes. And partly they are independent.
So they think good ideas could spread in any of those groups and reach them. But they don’t carefully examine whether any one of their groups does that well. If one group is criticized they will claim that it’s OK that it has weaknesses since they pay attention to ideas in other groups. There are many paths for corrections and insights to reach them. And anyway they are an independent thinker who can’t take full responsibility for any group – they have some disagreements with each group. So if the group is wrong, that doesn’t necessarily mean they are wrong.
They don’t even make any clear, decisive choices about which groups they are members of, and the groups are not clearly defined. And they can claim independence to stay away from anything bad, but also claim to have many group associates to avoid all the burdens of “ok you’re alone; make all the stuff work yourself”. So there are all these very loose, poorly organized groups, which don’t take responsibility for debate and answering questions. And then you have a person who has varying levels of involvement with a dozen of those. And he likes his independence when convenient.
The moment you say “OK so you’re alone and have some claims and there’s no one but you to defend them, so you better be open to debate” he says stuff like: “Nah, if it’s Popper related there are many other Popperians who may or may not debate you, so why should that be my job? And if it’s self-help related, there are books on that and I’m sure there exists some forum somewhere, so it’s not my problem to worry about your innovations in self-help that don’t sound immediately promising to me. And if it’s about drilling for oil, which is my profession, I have plenty of great discussions about that behind closed doors, and my company does have a public mailing address where you can send suggestions. And if it’s about physics, which I published an academic paper on, there are many other physicists so probably someone else has considered your idea, or you could convince someone else, or you could get past the gatekeepers and publish a paper in a journal I read. And if it’s about rationality, try Twitter or Reddit; why should I personally care? There are plenty of other people you can talk with.”
He won’t take responsibility for anything, nor will he send you to speak to anyone who will. None of those venues have reasonable ways to get debates concluded. Often they have no reasonable way to get much debate at all. Rather than examine any one mechanism for ideas to spread, and pay careful attention to what’s wrong with it, people will say that there are, amorphously, many other options. Which is true. But which one works well?
There are plenty of mechanisms to spread ideas that sound great to lots of people right away. If it’s short, simple and immediately appealing, it can spread fine. But many good ideas are counter-intuitive. Many good ideas sound wrong or bad because they clash with people’s existing habits and biases. Many good ideas challenge some authority or established power. They can threaten reputations, careers and egos. They can create huge sunk costs – they can mean that a lot of the training of a whole profession was a waste. If you spent years learning something in expensive schooling, and it’s wrong, you’ll need to learn new things. Most people are very hostile to that kind of thing and will use many defense mechanisms, including irrational ones, rather than face it. How will non-initially-appealing ideas spread?
People say you can break ideas down into parts, each of which is easy, non-threatening and appealing. Each little bit gives the listener a viable way to change, and then many little things can eventually add up to the whole idea. There are flaws with this approach. One of the big flaws is that, often, the components don’t offer enough value. You can learn the idea one bit at a time, but it’s the whole idea which is important; the bits are important as steps towards the big idea, but not all of them are very useful independently, alone, in isolation. You can’t always take an important idea, break it into tiny components so it’s easier to learn incrementally, and have every component be important in isolation. Each component can have some positive value by itself, but not necessarily enough to outcompete everything else people could pursue. So either you have to tell them what’s being built towards – an important conclusion they don’t want to hear – or else why will they care and pay attention to learning each little piece?
Initially appealing ideas and social status hierarchies are related issues. When an idea is favorable to social power, it can easily spread because people gain status by spreading it. But when it challenges power, most group members attack it as the evil outgroup because opposing it is the way to gain or maintain their social status. So it’s hard for it to spread because people are bandwagoning against it.
If the idea is awesome and super valuable and people understand it right away and see the value, it can spread despite challenging power. It can go against the flow. But most great new ideas are harder to understand. The value isn’t so easy to extract. It takes some work. The payoffs can be very worthwhile – much larger than the effort – but they aren’t immediate. It’s very hard for ideas like this to spread in a social group that sees them as the enemy. The members will say it’s bad and not put in the work to understand and use it better. Why? First, they don’t know if it’s actually good – there are many bad ideas claiming to be good ideas that take some effort to learn. They don’t have good mechanisms to figure out which are which. Second, even if they learn it successfully and it’s good, the likely result is they alienate themselves from the group. If tons of people in the group learn it, great, learn it too. But if you’re an early adopter, how do you know the majority in your group will ever adopt the idea, even if it’s good? There are many reasons to doubt that they will.
What can be done about these problems? Individuals can take personal responsibility for having methods of filtering ideas which don’t just block some good new ideas with no reasonable way for that to be fixed. It’s one thing not to think of every great new insight; that’s way too hard; it’s another thing if someone has some insight, and is happy to share, but there’s no way you’ll listen.
Individuals can also stop being so tribalist and spending their lives climbing social status hierarchies in groups, especially in loosely organized, poorly defined groups with no formal structures, clear rules or clear membership.
Individuals can pressure groups, tribes, status hierarchies, individuals, etc., to think about and set up mechanisms for error correction to happen and for new ideas to spread. Effort can be put into making this stuff work.
Stop assuming that the good ideas will float to the top. They systematically do not. The current systems are not designed to make that work. There are widespread problems which suppress good ideas. You can’t just assume that every idea will get some attention, and the good ones will then be shared some and get more attention, and the great ones will be shared even more and get more attention, etc. That doesn’t work because there are systematic biases and no good places to initially share ideas and get a fair hearing.
Non-transparent negative judgments, that aren’t backed by debate that addresses the issues, are a bad system. Disliking stuff doesn’t mean it’s wrong. Reasons and arguments are needed that address questions and criticisms. And then followups are needed. Many ideas don’t work well to express all at once with no back-and-forth communication. It can be a lot easier to argue a point if someone will actually answer questions about their current beliefs and commit themselves to some specific alternatives. Trying to preemptively address every alternative that someone might believe is a lot of extra work that makes your writing way too long, and it can never be done perfectly enough to satisfy people who are strongly biased against what you’re saying.
People need ideas about how to reach conclusions in debate (without using too many resources, like time and effort). Stop thinking that because debates usually aren’t very fruitful (granted), it’s OK to just judge stuff for reasons you never share and just believe what seems good to you. That’s a great recipe for your biases to flourish and never be challenged substantively, because you never stop and put in work to understand things that conflict with your biases.
But, you tell me, you do challenge yourself sometimes! What often happens is that some things seem kind of challenging but aren’t real threats, and those spread so people can feel good and rational. People like to face challenges that they’ll beat. People like to debate people they’ll beat. They like to face rival ideas that they’ll beat. Doing that is not a way for your biases to be overturned and for you to change your mind substantively.