People encounter ideas, think they are bad ideas, and then want to treat those ideas badly because they're (allegedly) bad. They want their treatment of ideas to correspond to the nature or quality of the ideas. They want to be interested in good ideas, and uninterested in bad ideas. They want to spend time and energy on good ideas, not on bad ideas. They want to be curious about good ideas and dismissive of bad ideas. You probably can see how this makes some sense and why it'd be typical. There's a logic to it.
But rationality requires something different. It requires recognizing your fallibility (the fact that you could be mistaken). It requires knowing that you might be wrong. If you think an idea is bad, but you're wrong, and you treat it badly, then you've accidentally treated a good idea badly. That is common. Rationality requires we make major efforts not to do that. Few people, including philosophers and advocates of rationality, actually make those efforts on a regular basis.
Many excuses have been tried for not being very concerned with fallibility. E.g. "Lots of other people think that idea is bad, therefore I feel pretty safe in assuming it's genuinely bad." Or "If that idea were good, someone would have noticed." (Someone did notice it's good, which is why you're hearing about it. You mean someone highly respected, not just anyone. But highly respected people tend to be dismissive of ideas that seem bad to them or which are unpopular.) Or "I don't have time to consider every bad idea." (True, but that should lead to wanting to figure out how to handle your fallibility given your limited time. It shouldn't just be an excuse to be dismissive of ideas.)
The idea of fallibility, and that you might be wrong when dismissing ideas that seem bad, is well known. People have heard of stuff like this but they keep doing it wrong anyway. Why? Because knowing a few sentences worth of reasoning isn't nearly enough knowledge to guide your life rationally.
What does it take to do better? Let's consider several options.
One approach is to study the issues in much more depth. Learn way more ideas and much more advanced nuances and details. Research all sorts of abstract reasoning and logical arguments on the topic. Become persuaded in a much more thorough, comprehensive way.
I think that would work pretty well, but it's very hard. People try to do it, but it's usually not very successful. Why? I think there are two main problems. First, many people just don't actually learn very much. Second, even if they do learn a lot, usually a lot of it is wrong. You have to actually get stuff right for this approach to work, which is hard, especially on a complex, nuanced topic that our society is bad at, like the philosophy of rationality.
Another approach is to practice. Figure out drills, exercises, worksheets, flashcards, and other ways of practicing. I think this would generally be much faster and easier than the advanced, deep study method, and would be more effective and less likely to fail. However, in short, people don't try to do it. People try the advanced study method way more. There are signs that people don't want to practice rather than that they don't know how (e.g. I've offered some practice suggestions and people haven't been very interested). I'm not very clear on why people resist practicing so much, but they do on many topics. (I have in mind primarily adults; many children are more willing to practice, but treating ideas well despite thinking they're bad isn't something that is practiced in school.)
A common issue is people like ideas about rationality and fallibility when they imagine applying the ideas to other people. They want other people to be more rational, and they want other people to be less dismissive of their ideas that they think are good but others consider bad. But when it's applied to themselves – when they are asked to treat ideas better that they think are dumb – they don't like it. Maybe that's a reason people don't want to practice this: the thing they want to happen and change is about the actions of other people, not themselves. Practicing it would involve changing their own actions, like treating ideas better despite thinking those ideas are dumb, and they don't want to do that.
Joining a Group
Besides study and practice, another option is to find a rational social group. Spend a lot more time around people that handle fallibility and rationality better. Spending more time around those people would help. The main difficulty is that most people who say they're rational aren't actually very rational. I don't think any large group of people exists today that is actually good at rationally dealing with their fallibility.
Another option is to find one person, or a few people, who are good at this stuff, and get help or mentoring from them. With a large group where things are being done well, you could just participate and try to fit in. With a tiny group, if you can't contribute much (since you didn't already study or practice), then it'll be a pretty unequal, unbalanced relationship. You can't just blend in with the crowd because there is no crowd. You have to actually ask people to help you, not just join a group. This tends to require them having a reason to help you, offering them something in return for help, or being a very promising person to help in some way.
Since you don't know a lot about rationality, how would you know if the person, small group, or large group you find is actually good? Picking people who aren't actually very rational is one of the main ways this approach fails.
Another way of failing is not getting enough immersion. If you travel to France and join the big group of people in France for a year, being around the French language all the time helps you learn it. If you merely have two friends who speak French, but you're still in an English speaking country and spend most of your time around people who are speaking English, then it's easy to never learn French. Big groups offer better immersion than small groups, but it's still usually easy to not be involved enough to change much. There can be solutions, such as scheduled daily lessons from a specific person and assignments for additional work to do between lessons (which gets into doing some study and practice).
Rationally Dealing with Fallibility
How should you treat ideas you think are bad, given your fallibility? The answer is long and complicated, and I've written a lot on topics like that. As a brief answer: if an idea implies you're wrong in an important way, then it's really risky and generally unwise to just ignore it and make a high-stakes bet (that you aren't wrong) based only on your initial impression that the idea is bad. Instead, you should find a refutation of the idea that has been written down, or, if none exists, write one yourself. (Writing one means first treating the idea as non-refuted and potentially good, investigating it, and maybe deciding it's good; but if you conclude that it's bad, then you can write down your refutation.)
Commonly, instead of refuting an individual idea, refutations will apply to large groups of ideas – e.g. an argument refutes any idea with trait X or fitting pattern Y. The ability of one argument to refute many bad ideas is one of the keys to saving time and effort, so it isn't hopeless to deal with all the bad ideas anyone can come up with. If you have a few dozen high quality arguments which refute common types of bad ideas, then it's actually hard for people to come up with new bad ideas that aren't already refuted by existing arguments.

And if someone does manage to come up with a new idea that avoids all existing refutations, then you should be impressed by that accomplishment and respectful enough to investigate it (or if you know someone else is investigating whom you respect, you could just let them do it). Even if the new idea is bad, investigating it will help you build up your refutations, so it's even harder for the next person to come up with anything new that isn't already refuted in advance in writing.

Dealing with ideas rationally is much harder than this paragraph suggests. There are many details and complications, including the topic of decisive or indecisive arguments, and to do this stuff well you'd have to learn much more than I say here. But this gives a rough idea of how some of it works, so you can hopefully conclude that solutions may exist. Many people are extremely pessimistic about the possibility of refuting errors instead of ignoring them, which I take to be an excuse not to try, so I wanted to give enough of an answer that someone could be hopeful and optimistic that it can be done.
Ideas shouldn't be treated badly just because you, in your fallible judgment, form a bad initial impression of them. In other words, ideas should be refuted instead of dismissed without argument. When you try to refute ideas, if you're not too biased, you'll sometimes fail to refute them and realize they're actually good. And if you do find or create a refutation, other people can learn from it and change their minds too, or potentially provide a counter-argument with information that's new to you.