I’m not sure if I’ve ever admitted this on the blog but… I was really good at physics in college. This was like six years ago, and college courses aren’t very much like the real world, so it’s all water under the bridge now. But I never got so much as an A minus. I was consistently an outlier on tests. And I never really needed to study, because I got enough out of attending lectures.
Lectures are often derided as one of the least effective methods of teaching, since they involve no student participation. In retrospect, the reason I got so much out of lectures is because for me, lectures were participatory. Professors would solve problems on the board, and my practice was to solve the same problems in my notes, one step ahead of the professor whenever possible. Not everyone can keep that pace, so I’m not exactly offering this as advice to students. I was simply thinking about how participation is paramount to learning.
What makes blogging so valuable to me is that it is a kind of participation.
This post has been cross-posted to The Asexual Agenda.
In an earlier post, I touched on a particular problem: not only do I not conform to what is “normal”, it is far from clear what “normal” is, or if “normal” even exists.
I think this is particularly a problem for aces. Aces must often identify what is normal in order to resist it, but at the same time they don’t have direct experience with what is normal. If aces are too confident in their perceptions of the normal, this could lead to offensive views of allosexuals (e.g. the notion that allosexuals are constantly horny). If aces are insufficiently confident in their perceptions of the normal, this can lead to crippling self-doubt. So here I outline my analytic approach to the problem.
Sometimes in social justice discourse, people evaluate arguments based on the person who is making them. For example, if a person argues that racism is not much of a problem today, and that person is white, we might choose to disregard that argument.
Many critics think this leads to an incoherent epistemology. I emphatically disagree. Fallacies be damned, there are many practical reasons to care who is making an argument.
On the other hand, I have certainly observed some… excesses. For example, in some cases, a person is assumed to be white, cis, male, and heterosexual based on the thing they were arguing for. In some cases, this assumption turns out to be incorrect, which creates a whole distraction. I will not comment on whether this pattern is common or uncommon, but instead outline an approach that people should be taking instead.
It’s said that in poker, the correct strategy is to fold at least half the time. This is because in a two-player game, you’re bound to lose half the time, and folding minimizes the cost of losing.
By analogy, we should be folding about half the time in arguments too. In an argument where one person is right and the other person is wrong, about half the time, the person who is wrong is you. And if you’re wrong, then the best course of action is to change your view.
Granted, there are plenty of arguments when both people are right, or both people are wrong, or neither person is making any sense at all. Also granted, you may be the kind of person who is wise and educated, and who mostly chooses YouTube commenters as opponents.
Lastly, on consideration of the game theory, it turns out you shouldn’t necessarily be folding quite half the time.
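The folding intuition can be illustrated with a deliberately crude simulation. This is a toy model, not real poker: hand strengths are uniform random numbers, the opponent never folds, and the ante and bet sizes are fixed values I made up for illustration.

```python
import random

def simulate(num_hands=100_000, fold_threshold=0.5, ante=1, bet=4, seed=0):
    """Toy model of heads-up play: each player draws a uniform hand
    strength; we fold hands below the threshold, paying only the ante,
    and otherwise bet and win or lose the full bet at showdown."""
    rng = random.Random(seed)
    total = 0
    for _ in range(num_hands):
        mine, theirs = rng.random(), rng.random()
        if mine < fold_threshold:
            total -= ante  # folding caps the loss at the ante
        else:
            total += bet if mine > theirs else -bet
    return total / num_hands  # average winnings per hand

# Never folding breaks even on average in this symmetric toy game,
# while folding weak hands does better, because the inevitable losses
# cost only the ante instead of the full bet.
always_play = simulate(fold_threshold=0.0)
fold_weak = simulate(fold_threshold=0.5)
```

Against an opponent who never folds, the fold-weak strategy comes out ahead, which is the sense in which folding "minimizes the cost of losing."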
I want to identify a common pattern that occurs across all sorts of political and quasi-political arguments. The pattern is this:
Alice opposes a certain ideology. She thinks the ideology is a significant factor in a wide variety of problems. She has many opinions and notions which are based entirely on opposing that ideology, or correcting its errors. She has a tendency to see the ideology in many people, including people who don’t see it in themselves.
I would describe Alice as recoiling from an opposing ideology. That ideology takes up a large amount of conceptual space in her mind. Many things are seen in relation to that ideology. This can be particularly frustrating when you interact with Alice, and she sees you in relation to the ideology she opposes. Alice tends to think you’re either with her or against her.
In the past, I’ve described Alice as being “reactionary”, since her views are based on a reaction against the opposing ideology. I switched to a synonym for two reasons. First, “reactionary” is sometimes defined as a particular variety of far right-wing politics. Second, “reactionary” has a negative connotation, and I want a more neutral term. When someone recoils from an idea, I’m not saying that’s good or bad: it could be either. Maybe Alice is recoiling too much and it blinds her. Or maybe she’s recoiling exactly the right amount.
Lots of people dislike arguments, and arguments about religion in particular. It’s often asked, “Why do you even bother telling people that their personal beliefs are wrong?” The common counterargument is that beliefs inform our decisions, and as citizens of a democratic society we are subject to other people’s decisions.
But here, I hope to get at the root issue. I believe people complain about arguments because they’ve had negative experiences with arguments. Indeed, everyone and their mother seems to have an anecdote about an argument about religion, where the other person was totally obnoxious. One theory about these anecdotes is that they were obnoxious because they were arguing about religion, and no one should ever argue about religion. Another theory is that they were obnoxious because the arguments were non-consensual.
The Dunning-Kruger effect states that people with the lowest competence tend to overrate their competence, but people with the highest competence tend to underrate themselves. This was shown in a 1999 paper by Dunning and Kruger, which won an Ig Nobel Prize. Here’s one of the figures from the paper:
This figure shows scores from a humor test (where a “high” score means good agreement with professional comedians). There are similar figures for tests on grammar and logic.
The Dunning-Kruger effect has entered popular wisdom, and is frequently brought up when people feel like they’re dealing with someone too stupid to know how stupid they are.
But I have to admit that the popular wisdom led me astray. I had a misconception: I thought that people’s self-assessment was actually anti-correlated with their competence. As you can see from the above figure, this is plainly not the original finding. People with lower competence tend to overrate themselves, but their self-assessments don’t quite surpass the self-assessments of people with higher competence.
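The distinction between "low performers overrate themselves" and "self-assessment is anti-correlated with competence" can be made concrete with some numbers. The quartile values below are my own rough approximations of the shape of the published figure, not the actual data from the paper.

```python
# Illustrative per-quartile means, loosely mimicking the 1999 figure
# (approximations for illustration, not the published numbers).
actual = [12, 32, 62, 87]      # actual test percentile, by quartile
perceived = [58, 61, 65, 75]   # self-assessed percentile, by quartile

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The bottom quartile overrates itself dramatically (perceived >> actual),
# yet its self-assessment still sits below the top quartile's, so the
# correlation between competence and self-assessment is positive.
overrating = perceived[0] - actual[0]
r = pearson(actual, perceived)
```

The correlation is positive (just flattened), which is exactly what my misconception got wrong.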
In the comments, I had a discussion on the structure of knowledge. There are two general points of view. The first point of view, called foundationalism, is that knowledge starts with a few basic principles, upon which the rest of knowledge is built. The second point of view, called coherentism, is that knowledge is structured like a web, with inferences going in every direction.
This is a long-standing philosophical question, and you can read superior accounts from more authoritative sources.
Both coherentism and foundationalism have features which should raise eyebrows among critical thinkers. Namely, foundationalism involves believing its foundations without evidence or reason. Coherentism involves circular reasoning.
The Stanford Encyclopedia of Philosophy observes that coherentists typically defend their view by attacking foundationalism. Here I will instead mount a positive defense of coherentism by arguing for the virtues of circular reasoning.
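The structural contrast between the two views can be pictured as a toy graph model. This framing is my own sketch, not anything from the philosophical literature: beliefs are nodes, and an edge means "this belief is supported by that one." A foundationalist structure has no cycles and bottoms out in unsupported axioms; a coherentist web permits support to run in circles.

```python
# Toy model: beliefs as nodes, "x is supported by y" as directed edges.
# Node names are arbitrary placeholders.
foundationalist = {            # axioms A and B support everything else
    "A": [], "B": [],
    "C": ["A"], "D": ["A", "B"], "E": ["C", "D"],
}
coherentist = {                # a web: support runs in a cycle
    "C": ["E"], "D": ["C"], "E": ["D"],
}

def has_cycle(graph):
    """Detect circular support via depth-first search with node coloring."""
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / in progress / done
    color = {node: WHITE for node in graph}

    def visit(node):
        color[node] = GRAY
        for dep in graph.get(node, []):
            if color.get(dep, WHITE) == GRAY:
                return True        # back edge: circular reasoning
            if color.get(dep, WHITE) == WHITE and visit(dep):
                return True
        color[node] = BLACK
        return False

    return any(visit(n) for n in graph if color[n] == WHITE)
```

In this sketch, the foundationalist graph is cycle-free (every chain of support terminates in an axiom), while the coherentist graph contains a cycle, which is precisely the circular reasoning a coherentist has to defend.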