Why Rationalists (sometimes) Madden Me

Eliezer Yudkowsky wrote a short Facebook comment this week. It’s not really fair to take him so seriously and comment on it here, but I’m going to anyway, because it highlights why rationalists (sometimes) drive me crazy.

This is the guy who wrote *Politics is the Mind-Killer*, and is highly respected for his insights on human bias and rationality.

https://www.facebook.com/yudkowsky/posts/10154691744344228

My (cheeky) response:

I know you’re probably joking, but there are other emotional and civilizational considerations behind these types of choices. Rationality can’t override every biological subroutine, and these are the kinds of choices that can send very poor signals, signals with the potential to deeply damage existing and future relationships.
“Wow, you have a really nice place! How did you afford it? I’m glad I asked you out on a date!” “Oh, I was a prostitute when I was younger. Only once though. I made lots of money because I’m so rational.” “Wow, you’re such a rational woman! What a wonderful quality to find in a mate.”

The idea behind Eliezer’s rationalism is that humans are biased emotional machines, but if we think in a structured, parameterized fashion, we can evaluate choices in such a way that we arrive at different outcomes than our base instincts would. It is, fundamentally, applying the scientific method to how we live our lives. This is a great idea, as most people make choices on base instinct alone.

The litmus test, though, has to be a clear acknowledgement of our bias when the truth is revealed. The best examples are those that highlight a consistent bias in our brain’s neural-network approximation, one that is put on display by a structured model.

For example, apartments will sometimes give renters a choice between the first month free *or* a cheaper monthly rate. One of these choices is mathematically optimal, and if someone picks the other one, they are simply wrong.
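To make the comparison concrete, here’s a quick sketch in Python. The rents and the 12-month term are numbers I’m inventing purely for illustration:

```python
# Compare two hypothetical lease offers on total cost over a 12-month term.
# The rents here are made-up numbers purely for illustration.

monthly_rent = 1500       # standard rate, with the first month free
discounted_rent = 1400    # reduced rate, no free month

option_free_month = monthly_rent * 11    # pay 11 of 12 months
option_discount = discounted_rent * 12   # pay all 12 months at the lower rate

print(f"First month free: ${option_free_month}")  # $16,500
print(f"Discounted rate:  ${option_discount}")    # $16,800
# Under these numbers the free month wins by $300;
# with a steeper discount, it wouldn't.
```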

Retirement funds can be slightly more confusing. If someone puts 100% of their funds in a bond index, that seems like a really bad idea. We can’t tell them their risk preference is wrong, though; it’s their preference. What we can say is: “Your risk preference would have to match this distribution to justify your choice. Does it?”
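Here’s a sketch of what that check could look like. Every return, volatility, and correlation figure below is an assumption I’m inventing for illustration, not a market estimate:

```python
# A sketch of the "does your risk preference justify this?" question.
# All distribution parameters are illustrative assumptions.

bond_mean, bond_std = 0.03, 0.05    # assumed annual bond return distribution
stock_mean, stock_std = 0.07, 0.16  # assumed annual stock return distribution
correlation = 0.1                   # assumed stock/bond correlation

def portfolio(stock_weight):
    """Mean and standard deviation of a two-asset portfolio."""
    w_s, w_b = stock_weight, 1 - stock_weight
    mean = w_s * stock_mean + w_b * bond_mean
    var = ((w_s * stock_std) ** 2 + (w_b * bond_std) ** 2
           + 2 * w_s * w_b * correlation * stock_std * bond_std)
    return mean, var ** 0.5

for w in (0.0, 0.6):  # 100% bonds vs. a 60/40 mix
    mean, std = portfolio(w)
    print(f"{w:.0%} stocks: mean {mean:.1%}, std {std:.1%}")
# Under these assumptions, 100% bonds gives up ~2.4 points of expected
# return to cut volatility in half; the question is whether the saver's
# risk preference actually prices that trade the way their allocation implies.
```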

See, in these cases someone who isn’t thinking too hard, or isn’t smart enough, or doesn’t care, uses their brain’s untrained base model to spit out an answer. Our brains love to spit out answers; we didn’t evolve to develop epistemic reverence. Then someone else builds a more refined model that points out a clear bias, one that can be easily represented mathematically (or at least in a very structured manner).

I’m being a little lazy here for two reasons: 1.) someone could point out that I’m assuming the person who made the initial biased error is smart enough to observe and learn that they were mistaken, and 2.) I’m assuming there is a discrete distinction between self-evident mathematical biases and other types of bias. I can live with this laziness for now.

In the case of prostitution, let’s start by mapping out the dynamics of the situation. There is around a $120,000 premium for a young (presumably attractive) woman to sell her virginity. Why? Well, it is because she can only do this legally in Nevada, must be willing to suffer through the act, and risks damaging her personal and familial relationships. If these risks didn’t exist, and we all lived in a sexually liberated world where no one cared about prostitution, it’s safe to say she’d get an order of magnitude less for the trade.

In short, the current price is the equilibrium price for a marginal sale conditional on these expected costs. The trade only makes sense if you have a good reason to believe your expected costs are lower than the revenue you would receive.
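Stripped down, that’s just an expected-value comparison. A minimal sketch, where every probability and dollar figure is invented for illustration:

```python
# Expected-value framing of a trade with personal costs.
# All probabilities and dollar figures are invented for illustration.

price = 120_000  # the quoted premium

# Hypothetical expected personal costs: (probability, cost if it happens)
expected_costs = [
    (0.9, 50_000),    # suffering through the act itself
    (0.5, 150_000),   # damaged personal and familial relationships
    (0.3, 100_000),   # future social and professional repercussions
]

expected_cost = sum(p * c for p, c in expected_costs)
print(f"Expected cost: ${expected_cost:,.0f}")  # $150,000 > $120,000
# Under these made-up numbers the trade is negative expected value.
# It only flips positive for someone whose expected costs are unusually
# low, which is exactly the point about equilibrium pricing at the margin.
```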

Eliezer, however, seems to think and imply that this trade isn’t happening enough because, unlike rationalists, most people haven’t thought through the trade-offs and are making choices based on their emotional biases.

For what group would these expected costs be low? Well, the daughters of rationalists would know their parents wouldn’t judge them negatively, and if they are part of the community they could avoid many of the negative social repercussions. It only stands to reason, then, that men like Eliezer should encourage their daughters to sell their virginity.

And to what sane man does this sound acceptable? For now, we are not computers; we are men. I understand that my brain runs biological algorithms that are beyond *ahem* pure reason. That doesn’t mean the feelings aren’t real. I can concede that combining intellect, rationality, and base instinct is challenging, and it’s hard to tell where we draw the line. For me, this is that line.

I’m a science-obsessed guy, so it always feels strange for me to appeal to an appreciation for a moral civilization and intuitive ethical preferences. As I’ve grown older, though, I don’t see the two as contradictory. If humans are nothing more than a special case of a robot, then our collective programming could result in some optimal and suboptimal outcomes at the societal level.

There is no real way to prove any of this, but in my optimal society our women are never encouraged to sell their virginity. I don’t know. It’s what feels right to me. I don’t consider it irrational.

I have, however, now come up with a better definition of Neo-Reactionary: A Rationalist who doesn’t want to pimp out his daughter.