Saturday, February 27, 2021

The rationalist community & me

You may have heard of the "rationalist community." To the (very imperfect) extent I understand what it is, it's a group of people who profess to practice reasoned inquiry. That inquiry requires being aware of one's own biases, owning up to and being honest about the "epistemic status" (i.e., degree of certainty) of what one knows, and using statistical probabilities in an approach known as "Bayesian analysis" to resolve questions. My apologies to any member of that community who might be reading, for I'm sure I've misrepresented them.
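(For the curious: "Bayesian analysis," at its simplest, just means updating your confidence in a claim when new evidence comes in, using Bayes' rule. Here's a toy sketch in Python — the numbers are entirely made up by me for illustration, not anything the rationalists themselves use:)

```python
# Toy Bayesian update: P(hypothesis | evidence) via Bayes' rule.
# All probabilities below are invented for illustration.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Suppose I start out 20% confident in a claim, then see evidence that's
# three times as likely if the claim is true (0.6) as if it's false (0.2).
posterior = bayes_update(prior=0.2, p_evidence_given_h=0.6, p_evidence_given_not_h=0.2)
print(round(posterior, 3))  # prints 0.429 -- the evidence roughly doubles my confidence
```

The point, as I understand it, is that the evidence doesn't make you certain; it just moves your confidence up or down by a lawful amount.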

My experiences with the community are in the e-world. 

I occasionally visit the blog LessWrong, which is known as the home of the rationalist community. I've also occasionally visited the blogs of Robin Hanson and Scott Aaronson: I believe Robin is considered part of the rationalist community and that Scott Aaronson is probably "rationalist community adjacent." I visit Scott Alexander's blog much more often, both the late Slate Star Codex and the current Astral Codex Ten. I've even paid to subscribe to Scott's Substack account. Scott is a member of the rationalist community, and he's perhaps its most accessible writer. I also read, on an almost weekly basis, Zvi Mowshowitz's blog. I find his analyses of the coronavirus pandemic particularly insightful. They're much more informative than the press releases issued by the various departments and bureaus of public health. (Those departments and bureaus might very well be doing some good things, but the announcements they direct to laypersons like me tend to be only minimally helpful.) Finally, I've purchased the ebook by Eliezer Yudkowsky (about $5 on Amazon), who is supposedly the "founder" of the rationalist community. I confess I've probably read only about 20% of that 1,000+ page book, but I think I have some idea of his approach.

There's a lot to like about the rationalist community. I greatly appreciate its members' willingness to approach their own priors skeptically. I love the practice of disclosing the "epistemic status" of a blog post. I appreciate even more Scott Alexander's writing. Not only can he turn a phrase; he also has the knack for "steelmanning" views with which he disagrees. (That, in my opinion, is probably the main reason he is so controversial: he presents his opponents' side so well that the casual reader thinks Scott is endorsing those views.)

One might be forgiven for thinking I identify with the rationalist community, or that I'm at least "rationalist community adjacent." That's pretty much not the case. I can't pinpoint exactly where I disagree with that community, or even if I do. But I am wary.

The main reason is, I just don't *feel* it. Something seems off. 

It seems to me that if we take the rationalist community's rationalism to its logical extreme, we get something like scientism, but with logic. I guess what I mean is that they believe in something called "logic" in the way that the caricatured adherent of scientism believes in the material world.

Reading Yudkowsky's book, especially, drives that point home. He comes off as the type of "new atheist" bully I sometimes met in graduate school. He also seems like an ungracious interlocutor. He relates a story from a party he attended. He apparently baited someone into a conversation about something controversial, and that person (probably in exasperation) suggested they could just agree to disagree. Instead of taking the hint, Yudkowsky launched into some convoluted argument about how people can never really agree to disagree. He's also a bit too quick to point out all the weak points in theistic arguments, and he shrugs off their stronger points with something like an ad hominem or a Christopher Hitchens-style insult.

I should say a few things to be fair to Yudkowsky and to the rationalist community. First, in my occasional visits to LessWrong, I haven't noticed such a confrontational culture; in fact, I'm struck by the civility of that community, of which I understand Yudkowsky is still an integral part. Second, the chapters in Yudkowsky's book are, I understand, maybe ten years old, or older. Third, in the introduction (or prologue?) to his book, he admits that in his earlier years his approach was less empathetic and too dismissive of (and too combative against) those with whom he disagreed. (He credits Scott Alexander for that realization.)

Still and even so: there seems little room for mystery or for deep engagement with affective values in the rationalist community.

"Room for mystery" and "deep engagement with affective values" get me in trouble. I just wrote those words and am not sure what I mean by them. (And I won't try to define them here.) I evidently mean that they're good things, and I'm evidently suggesting that the rationalist community devalues or undervalues them.  At the same time, they would probably object to that characterization. And whether they'd raise the point or not, I have to admit I lean toward wanting "room for mystery" and "deep engagement with affective values" whether or not such things exist or ought to exist.

I'm reminded of my earlier engagement with libertarianism. That began circa 2007, when I started reading the Volokh Conspiracy and later, when I started reading Positive Liberty. Ordinary Times itself used to have a left-libertarian tilt. It still kind of does, if you squint just right, but that tilt is noticeably weaker. Those interactions have led me to be what I'll call "libertarianism adjacent." I'm not on board with what I understand libertarianism to be, but I respect the libertarian critique of government power and of the programs I do support. In fact, I believe that critique to be a sine qua non for pretty much any state-directed initiative: if a government program cannot at least account for that critique and either show why it doesn't apply or why the benefits outweigh the costs, then the program is to that degree less defensible.

I don't, however, foresee ever becoming "rationalist community adjacent" myself. But who knows?
