There’s been a fuss in some circles about the fact that Curtis Yarvin was uninvited from a tech conference after the organizers learned of his political views, which he publishes under the pen name “Mencius Moldbug”.

I don’t particularly want to discuss his political views or whether he should be invited to speak at conferences; rather, I want to point out something I haven’t seen anyone else point out. But before I get to that, I feel like I should provide a little background for those who have been lucky enough not to encounter the “Moldbug” oeuvre.

If you like, you could start with The Baffler’s article Mouthbreathing Machiavellis Dream of a Silicon Reich, or RationalWiki’s article about the neoreactionary movement.

In case you think those seem like pieces of slanted character assassination, let’s go to the source:

If I had to choose one word and stick with it, I’d pick “restorationist.” If I have to concede one pejorative which fair writers can fairly apply, I’ll go with “reactionary.” I’ll even answer to any compound of the latter – “neoreactionary,” “postreactionary,” “ultrareactionary,” etc.

Restorationism is to fascism as a bridge is to a pile of rubble in the riverbed. Bridge collapses can be dangerous and unpleasant, but that doesn’t make bridges a bad idea.

So fascism was a great idea, says the neoreactionary, it’s just that the Nazis did a bad job of it. He then proceeds to explain that democratic government could be declared bankrupt, and the nation handed to a corporate Receiver:

The best target for the Receiver is to concentrate on restoring the Belle Époque. This implies that in two years, (a) all systematic criminal activity will terminate; (b) anyone of any skin color will be able to walk anywhere in any city, at any time of day or night; (c) no graffiti, litter, or other evidence of institutional lawlessness will be visible; and (d) all 20th-century buildings of a socialist, brutalist, or other antidecorative character will be demolished.

No doubt there will be the usual purges of degenerate art too, though apparently this neofascism will refrain from the death camps that gave the old kind a bad name?

Yet reading carefully, there are hints of familiar racial politics:

Obama, Prince Royal of the Blood, beloved by all God’s children but especially the colored ones, from Bolivia to Clichy-les-Bois? What is he, the second coming of Comrade Brezhnev?

And “Moldbug” has a lot more to say about “colored people”, as he calls them. He’s deeply concerned about the “race rights” he feels are given to some college applicants, and the possibility that people are committing “race fraud” to get those special benefits.

Again, I’m not going to discuss why he’s wrong; someone else can take on that miserable task. However, I can’t help pointing out in passing that when he says:

Race, of course, is hereditary by definition.

…he is, of course, completely wrong. The idea that “race” is genetically determined and hereditary is a common misconception; the consensus of geneticists is that race is a social construct. The phenotypical variations we use to sort people into “races” — such as skin color — are genetically determined, but there’s no simple mapping from genotype to “race”. Israeli Jews and Palestinians, for example, seem to be genetically indistinguishable. Meanwhile, two African-Americans may be more genetically distinct from each other than one of them is from a random white person.

The neoreactionaries are no fans of science in general, associating it with ivory towers and Stalinism. But I get the sense that they want to keep alive the outdated racial “science” popular during previous periods of fascist rule. I wonder why that is?

In another article “Moldbug” sets out to defend white nationalism, and explain why he isn’t a white nationalist:

At its best, white nationalism offers a sensible description of a general problem. This problem certainly exists, and it falls under the larger category of bad government. […]

But white nationalism offers no formula at all for how to transition from bad government to good government. Indeed, to the extent that white nationalism succeeds in anything, it motivates its enemies, keeping everyone stuck in the same old destructive patterns.

And the worst thing about white nationalism, in my opinion, is just that it’s nationalism. Nationalism is really another word for democracy – the concept of democracy makes no sense except as an algorithm for determining the General Will of the People, that is, the Nation. And whatever its electoral formula or lack thereof, every nationalist government has seen itself as in some sense a representative of the Volk.

He thinks white nationalism correctly identifies a general problem, though he is coy about spelling out what exactly it is. However, he considers the white nationalists to be no good because they believe in solving the problem through existing political systems. The white power crowd are simply too democratic for him. What a twist!

So, should someone who is that much of a political extremist be invited to — or uninvited from — a tech conference? (Before answering, note that the conference in question is entirely privately organized and funded. They are free to invite and uninvite whoever they want — there is no First Amendment issue here.)

I honestly don’t know. But some have argued that Yarvin’s politics clearly should not be an issue when considering his software projects, that the two should be kept totally distinct in our minds. I disagree with that, because of the point I want to make in this article:

I don’t see Yarvin’s politics as being unconnected with his technological views.

To understand why, let’s move over to the world of technology and look at the software Yarvin gets asked to talk about: Urbit.

He has taken down many of the documents about the project, but he has enough of a fan following that plenty of other people have written about it, and there’s still an intro document on GitHub:

Nock is a stateless virtual machine defined in 200 words. The Nock machine is sealed – all execution is “pure.” Nock’s goal is extreme commoditization of computing semantics.

Hoon is a high-level language which defines itself in Nock. Its self-compiling kernel, 7000 lines of code, specifies Hoon unambiguously; there is no Hoon spec. Hoon can be classified as a pure, strict higher-order static type-inferred functional language, with co/contra/bivariance and genericity. However, Hoon does not use lambda calculus, unification, or other constructs from “PL theory.” Hoon also excels at handling and validating untyped data, a common task on teh Internets. Its syntax is entirely novel and initially quite frightening.

Arvo is a deterministic functional operating system defined in Hoon. While still basically a toy, it can serve web apps and network securely with other Arvo instances. An Arvo instance is designed to be a simple independent computer in the cloud.

Urbit attempts to rebuild the entire Internet stack with a form of functional programming. Yet it doesn’t use lambda calculus, or concern itself with such decadent trivialities as specifications. It dismisses the last 60 years of computer science theory and attempts to start again from ground zero. When I first read about it, I thought it was either genius or madness.
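To give a flavor of what the quoted intro is describing, here is a toy Nock evaluator in Python. This is my own illustrative sketch, not project code: it covers only the tree-addressing rule and opcodes 0, 1, 3, and 4 from the published Nock spec, and models nouns as Python ints (atoms) and two-element lists (cells), ignoring details like autocons.

```python
def slot(axis, noun):
    """Tree addressing: /[1 a] = a, /[2 [a b]] = a, /[3 [a b]] = b,
    and for larger axes /[2n a] = /[2 /[n a]], /[2n+1 a] = /[3 /[n a]]."""
    if axis == 1:
        return noun
    if axis == 2:
        return noun[0]
    if axis == 3:
        return noun[1]
    return slot(2 + axis % 2, slot(axis // 2, noun))

def nock(subject, formula):
    """Evaluate *[subject formula] for a small subset of Nock."""
    op, arg = formula
    if isinstance(op, list):
        # Distribution: *[a [b c] d] = [*[a b c] *[a d]]
        return [nock(subject, op), nock(subject, arg)]
    if op == 0:                  # *[a 0 b] = /[b a]  (fetch from subject)
        return slot(arg, subject)
    if op == 1:                  # *[a 1 b] = b       (constant)
        return arg
    if op == 3:                  # *[a 3 b]           (is it a cell? 0=yes)
        return 0 if isinstance(nock(subject, arg), list) else 1
    if op == 4:                  # *[a 4 b]           (increment)
        return 1 + nock(subject, arg)
    raise NotImplementedError("opcode %r not in this sketch" % op)
```

Even this fragment shows the combinator flavor the quote alludes to: a formula is just a noun applied to a subject by rewrite rules, with no variables or lambdas anywhere. For example, `nock(42, [4, [0, 1]])` increments the subject, yielding 43.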

But having thought about the principles Yarvin bases his political positions on, I’ve realized that there’s a commonality between his politics and technology.

In both the technological and political spheres, Yarvin’s position seems to be that current systems are failing, corrupt, and degenerate. In both cases he advocates that we should tear down everything and start again from the ground up, with a revolutionary new system of total ideological purity.

In the case of both fascism and functional programming, apparently similar attempts have failed in the past, but we will no doubt be told that they only failed because they weren’t carried out properly; that they became corrupted by impure influences. For instance, there’s a section in the Urbit introduction where the necessary evil of device drivers is discussed — to be handled by temporarily permitting I/O and calls into C code, until we can bootstrap our way into the glorious pure Urbit-only future and carry out a grand purge.

I’m not saying that functional programming is all mad reactionary extremism. I was in love with Lisp during my college years, and we still see each other from time to time and remain on good terms. But sadly, there are some people who learn about functional programming and seize upon it as religion. They decide that it’s the only good way to construct programs, the solution to all our current problems (maintainability, parallelism, reliability, scalability, and so on). They become FP crazies.

Or as xkcd put it:

Functional programming isn’t alone in this tendency. I’m old enough to remember the Object-Oriented Programming crazies of the 1980s and early 90s, who treated OOP as religion. There were multiple attempts to build a whole new OS from the ground up using entirely Object-Oriented code. IBM and Apple had Taligent, Apple had another OS project called Copland — both failed. Apple also had a third attempt at an OO OS for the Newton, and that failed too. When Apple finally found a workable desktop OS to replace the decrepit MacOS, they got it from NeXT — and it was a high-level OO framework layered over a conventional BSD Unix written in C. These days, the conventional wisdom (as expressed by Linus Torvalds and others) is that C++ doesn’t belong in core OS design.

The thing is, I don’t do religion. I failed to become a functional programming nut, and I also thought C++ was pretty awesome for a while but eventually came to realize its major shortcomings. My technological philosophy is that there is no single best programming methodology — not functional, not object oriented, not procedural. Sometimes OO is the best fit for the problem, sometimes functional is the best fit for the problem, and sometimes you just need a state machine. And don’t even talk to me about there being a single best programming language.
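To make the “best fit” point concrete, here is a small sketch in Python (all names are my own invention, not from any project mentioned above): a totaling problem falls out naturally as a functional fold, while a line-oriented parser reads most clearly as an explicit state machine.

```python
from functools import reduce

def total(prices):
    """A reduction problem: functional style is the natural fit."""
    return reduce(lambda acc, p: acc + p, prices, 0)

def extract_blocks(lines):
    """A simple line-oriented format: collect the lines between
    BEGIN/END markers. An explicit state machine is the clearer fit.
    States: "idle" (outside a block) and "reading" (inside one)."""
    state, current, blocks = "idle", [], []
    for line in lines:
        if state == "idle" and line == "BEGIN":
            state, current = "reading", []
        elif state == "reading" and line == "END":
            state = "idle"
            blocks.append(current)
        elif state == "reading":
            current.append(line)
    return blocks
```

Neither function is wrong if written the other way; each is simply harder to read in the style that doesn’t fit the problem, which is the whole point.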

Once a mathematician or physicist becomes sufficiently famous, they start to get letters from cranks. They become adept at spotting crackpot letters. One of the hallmarks of crackpottery is that it often claims that current mathematical consensus is entirely wrong, and that the author is a genius who has worked in isolation, overturned everything, and started again from scratch with a whole new paradigm. Throw away Special Relativity, here comes TimeCube! Forget Quantum Mechanics, here’s a new form of Newtonian clockwork physics that works! Let’s throw out thermodynamics and power the world using perpetual motion, this time we’ll do it right!

The thing is, that’s not how progress works. That’s not how scientific progress works, it’s not how mathematical progress works, it’s not how technological progress works, and it’s not how political progress works. Real, lasting progress is a messy business filled with failure, wasted effort, impurity, compromise, and building on progress made to date. Sure, every now and again you throw out a small piece of the structure, but tearing down the whole thing in a grand Year Zero isn’t a recipe for progress at all.

I’ve already mentioned several failed operating system projects, but there’s another project I can’t help thinking of when I read about Urbit. Back in 1960, a group of extremely talented computer programmers hid away in corporate isolation and set about trying to reinvent network computing from the ground up. They planned a system with automatic distributed reliability, no central naming authorities, location transparency, and a giant distributed global storage and computation system. Sounds kinda like Urbit, huh? Development was carried out in utmost secrecy, largely ignoring the rest of the computer industry. Like Urbit, the project developed its own weird language: tumbler lines, zipper lists, enfilades with dsps and wids, poonfilades and granfilades, berts and ernies. And while the project was massively influential and originated many great ideas, 30 years later it still hadn’t shipped, because as I said before, that’s just not how progress happens in the real world.

Instead, we (eventually) got the World Wide Web. It was a quick hack based on some of the grand ideas; it ignored some important problems, put off a lot of issues to be solved in the future some day, and used existing technology. But here’s the thing: it shipped. It was useful. It was flaky, yes, but it worked well enough to utterly transform our lives.

Obviously the “tear it all down and start from something pure” viewpoint is very appealing to a certain kind of mathematically inclined person on the autism spectrum. However, I don’t necessarily think it’s something we should encourage. While the Urbit project may incorporate some interesting ideas that computer science can learn from, my considered opinion is that its broader message and aspirations are delusional.

The best way to prove me wrong, of course, would be to deliver a working useful clean-stack Urbit system that is clearly superior to our current messy system of kludges that keep breaking. But winning everyone over in that way would be democratic, so I suspect Yarvin and his fans don’t consider it a goal which should even interest them. They are content to build their Wewelsburg castle in the air.

Meanwhile my message — that nobody has all the answers, and that we can’t start again and build a clean new perfect world (or even a better Internet) — is hardly likely to set the world on fire. While I believe in democracy, my message is deeply unappealing and will be read by you and six other people wandering the marketplace of ideas. Meanwhile, “Moldbug” expresses contempt for democracy, but his message is seductive and he has hundreds of devoted followers. How’s that for irony?

I’d be remiss if I posted a whole article about neoreactionaries without mentioning one more possibility: maybe “Moldbug” is actually satire, or a piece of Andy Kaufman style performance comedy. It’s possible, I guess, but I can’t help remembering that the Nazis seemed like a joke in the cabarets of 1920s Berlin.

Apparently there are still a few people unclear on whether net neutrality is a good thing for innovation and freedom of speech. Let me clear that up by looking at who’s on each side, excluding political mouthpieces.

Netflix, Google, Microsoft, Kickstarter, Reddit, Vonage, Amazon, Yahoo, eBay, Dish Network, Etsy, Facebook, Tumblr, Dropbox, Automattic (of WordPress fame), BitTorrent, Mozilla, and Level 3 (probably the biggest Internet backbone interconnect company) were all in favor of net neutrality. If you can think of an innovative Internet company, do a search and you’ll probably find they wanted net neutrality.

As well as being supported by practically every actual innovator and by the people who supply all the backbone bandwidth, net neutrality was also supported by the people who literally invented and built the Internet, like Vint Cerf and Tim Berners-Lee, and by tech experts like Bruce Schneier.

Also in favor of net neutrality: Just about every group that campaigns for freedom of speech and consumer rights. Amnesty International, Consumers’ Union, Writers’ Guild of America, the EFF, the Free Software Foundation, and the ACLU.

Gun rights groups were initially in favor of net neutrality, but dropped support for purely political reasons — liberals were in favor of it, so they decided they had to be against it.

On the other side, Comcast, AT&T and Verizon were all against it, along with the tech companies like Cisco who sell equipment used to filter and censor your Internet connection. When Ted Cruz called it “Obamacare for the Internet”, he was surprised when even his supporters turned on him — polls suggest that it’s not a partisan issue at the individual level, and Republicans favor neutrality as much as Democrats.

Of course, massive right-wing communications corporations like News Corporation (Fox) are against net neutrality, but so far they haven’t managed to sway their audience, even though they’ve tried hard. The anti-neutrality forces also spent about 3× as much money as the pro-neutrality side trying to bribe politicians to kill it.

I think that should answer the question of which side to take, yes?

(Image: “Internet cats unite for Net Neutrality”, by Free Press via Compfight)

The New Republic recently carried an interesting article about Apple (the full text may be available via Readability). The piece started out as a review of the Steve Jobs biography (ho hum), but soon diverged into a discussion of the morality of design. It helped me to crystallize some thoughts.

There’s a famous anecdote about how Steve Jobs spent weeks making his family discuss what they wanted from their washing machine.

Jobs’s meticulous unpacking of the values embedded in different washing machines, and his insistence on comparing them to the values he wanted to live by, would be applauded by moralistic philosophers of technology from Heidegger to Ellul, though it may be a rather arduous way of getting on with life. But Jobs understood the central point that philosophers of technology had tried (and failed) to impart: that technology embodies morality.

Emphasis mine. Technology may be morally neutral in the abstract, but when we make technology choices, we are making moral choices, either because of the details of how the technology is made, or because the technology filters moral possibilities.

The problem was that Jobs, while perfectly capable of interrogating technology and asking all the right questions about its impact on our lives, blatantly refused to do so when it came to his own products. He may have been the ultimate philosopher of the washing machine, but he offered little in the way of critical thinking about the values embedded in the Macintosh, the iPod, and the iPad. When he discussed his own products, he switched from philosophical reflection on the effects of consumer choices to his Bauhaus mode of the vatic designer.

I would put it this way: Towards the end of his life, Jobs took his passion for product design in the autocratic and paternalistic mode, and applied it to everything about the products he oversaw.

“Steve believed it was our job to teach people aesthetics, to teach people what they should like,” [one of his ex-girlfriends] said.

This is the real reason why the App Store exists. This is why iOS is locked down, and why the Mac is being moved to an App Store model. Sure, the revenue stream is welcome, but it’s really about paternalistic control.

“It just works”—Jobs’s signature promise at product launches—was soothing to a nation excited and addled and traumatized by technology. Nothing could go wrong: Apple had thought of everything. The technology would work as advertised; it was under total control; it would not get hacked.

This is the new Apple philosophy. Sacrifice control to paternalistic Apple, and you can relax. The benevolent leader will teach you what to like and what not to like, keep you safe from danger and ugliness. The fact that this philosophy is utterly opposed to the values expressed in so much Apple advertising is remarkable, and shows how cunning and slick their advertising and marketing people really are.

People fall for it, too. I know many self-professed libertarians who believe in absolute freedom of speech and say that they trust nobody to be a censor, but who nevertheless line up to buy iPhones and iPads and give Apple control over what software they can run on their phone, what books and magazines they can read on their tablet, even how they are allowed to arrange app icons. (Try removing Newsstand from your iPad.) Business travelers with iPads complain all the time about being forced to submit to the TSA when they take a plane flight, but what is the App Store if not the TSA of software?

Some iOS users engage in doublethink, recasting their lack of “freedom to” as a positive “freedom from”. (“Sure, I’m not free to download a wifi scanner… but I’m free from viruses!”) It’s true, all apps have metaphorically gone through the scanner and had a minimum-wage drone check their boarding pass, and you can be sure they aren’t carrying bottles of water that compete with the drinks sold by the gate, but that’s not how real security works.

Some iOS device owners ease their sense of guilt by rooting the device, ignoring that they’ve already cast a powerful vote for loss of freedom by buying it. Most, however, seem content to live in cognitive dissonance, apologetically pointing out that Apple hasn’t been that bad a dictator, and has mostly not eliminated competing services. I mean, yes, they’ve forced other magazine and book sellers to move their stores to web only to escape Apple control, but so far they haven’t blocked those web sites, so it’s OK, right?

Which brings us to the web. Criticize the lack of freedom represented by the iOS devices, and before long you’ll likely be told that it’s simply not a problem, because there’s a web browser. Sure, Apple says no porn on the iPad, but you can get porn on the web via Safari, so somehow there’s no censorship occurring. But people are pointing out that Apple’s ‘app economy’ is increasingly threatening the web itself. Apple (and other corporate entities like Amazon) are managing to mold the web to be what they want it to be. And that doesn’t appear to be what I want it to be.

[…] Jobs outright rejected the possibility that there may be a multiplicity of irreconcilable views as to what the Web is and what it should be. For him, it is only a “direct-to-customer distribution channel.” In other words, Jobs believed that the Web is nothing more than an efficient shopping mall, and he proceeded to build his business around what he believed to be the Web’s essence.

Some people even claim that the web is dead, and that as we move into a post-PC era of tablets and phones as the primary Internet access devices, the web will be replaced by apps. And freedom will be replaced with complete corporate control.

Our choice is between erecting a virtual Portland or sleepwalking into a virtual Dallas. But Apple under Steve Jobs consistently refused to recognize that there is something valuable to the Web that it may be destroying.

A virtual Dallas, a prospect that will make every Austin web developer shudder.

So I now realize that this is where I parted company with Apple. When the Jobsian paternalism was restricted to matters of hardware design, I mostly appreciated it. I wish my laptop had a replaceable battery and anti-reflective screen, but mostly I’m happy with what I was told I should like—the large trackpad, the solid metal casing, and so on.

But when the paternalism was extended to books and movies and video games and applications, and when it started to threaten the web—well, that was several steps too far.

Everyone says they love freedom, and that freedom is important. But as the cliché says, “freedom isn’t free”. Freedom means ugliness. Freedom means danger. Freedom means complexity. Apple, in a stroke of marketing genius, offers you freedom from those things. And by accompanying that promise with images of freethinkers and a ‘think different’ message, it manages to make you overlook the fact that what you are really doing is giving up your freedom, and financially rewarding the very entity you are giving it up to.

So what’s the alternative? Well, sadly you won’t find a mobile platform with a rich ecosystem that doesn’t require ceding some control to others. Many people have said to me “Well, since that’s the case, what’s the point? I might as well go with the best.” But I’m not an absolutist; I don’t believe in the idea that if you can’t be perfect, you might as well not try. Rather, when it’s time to make a choice, I’ll choose the imperfect option that’s better.

Even Google, with its naïve technocratic ethos, is more committed to questioning the impact that it is having on the Internet and the world at large. They fund a bevy of academic and policy initiatives; they have recently launched a Berlin-based think tank dedicated to exploring the social impact of the Internet; they even started a quarterly magazine. […] Apple, by contrast, holds itself above the fray. It seems to believe that such discussions of meanings and consequences do not matter, because it is in the design business, and so its primary relationship is with the user, not with the society.

And then there are things like the Data Liberation Front, AOSP, and the periodic table of open APIs. You can even run Android devices without Google, pretty much. Try using a new iPad without an Apple ID.

So until something better comes along, I’m going with Android for my phone and tablet needs. Freedom is too important. Google might not be perfect, but in the specific area of mobile platforms, they are a lot better than Apple.

It’s Everybody Draw Mohammed Day, and as the festivities continue, predictably there are lots of well-meaning people saying that we should all put our pens and pencils down and not offend all those awfully nice Muslim folks.

An article at Huffington Post suggests that Muslims are being singled out, and that black religious extremists would never be ridiculed. I’m pretty sure I’ve seen the Black Panthers parodied in numerous movies. Maybe Everybody Draw Mohammed Day makes peaceful Muslims angry, but we wouldn’t have the day if it wasn’t for the non-peaceful Muslims, so maybe it would be more productive to focus anger on the cause of the issue, rather than the reaction to it.

It may superficially seem like a good idea to refrain from drawing pictures of Mohammed in order to avoid offending all the nice Muslims out there. However, it’s equally sensible to refrain from depicting sex, in order to avoid offending all the nice Christians out there. We should definitely stop mocking the Pope’s missteps over sexual abuse by priests, to avoid offending the nice Catholics. It makes just as much sense to avoid any nudity, in order to avoid offending all the nice Mormons out there. Let’s not forget the atheists either: let’s avoid drawing crosses or Jesus fish, let alone wearing them. We’ll need to get rid of beef and depictions of beef dishes, in case we offend all the nice Hindus, and get rid of images of pigs, which nice Muslims also find offensive enough to complain about. Public displays of affection are offensive to people in many countries, so we’d best put an end to wedding photographs of the bride and groom kissing; we wouldn’t want to offend anyone…

Getting the picture? As soon as you start self-censoring because of the passive-aggressive demands of someone who is offended by mere images, there’s no end to it. The right answer, and the only answer which preserves essential freedom of speech, is to tell people that if they find the sight of something offensive, they are welcome to stop looking at it.

This isn’t a rule that only applies when I’m offending other people. I’ve received well-meaning e-mail encouraging me to whine at Discovery Channel to cancel Sarah Palin’s TV show, or to complain to advertisers and ask them to stop supporting Glenn Beck. I find both Glenn Beck and Sarah Palin offensive and insulting, but since I don’t have to watch either of them, I’ve learned to get over it. I’m not being “punished” by the continued existence of Fox News, and you’re not being “punished” if I draw a crude picture of Mohammed.

Now, if you want to argue that I shouldn’t deliberately post pictures of Mohammed on Muslim discussion forums, or print out posters and stick them on the wall of the nearest Mosque, well, that is a more reasonable request. But total self-censorship to avoid the possibility of offending others? Not workable. Too many people get offended by too many things. In fact, if you can go an entire day without being offended by something, I think there may be something wrong with you.

If you don’t want to see a picture of Mohammed, don’t click the appropriate web links. If you think you might be offended by South Park, don’t watch it. If you think everyone should wear magic temple underwear at all times, don’t go to the local swimming pool. And since I don’t want to watch Sarah Palin shooting wolves from a helicopter, I’ll skip her show. OK?

Joel Johnson, Gizmodo, 2010-02-03:

It’s taken me a couple of days for me to understand the wet sickness I felt in response to all the post-iPad whining, until it finally came up in a sputtering lump: disgust.

The iPad isn’t a threat to anything except the success of inferior products. […]

This noxious attitude has permeated our tech culture for the last couple of decades, from a half-decade of open-source devotees crying about Microsoft on Slashdot, on toward the last few years of Apple ascendency. It’s childish. It’s defeatist. And it shows a simultaneous fear to actually innovate and improve while spilling gallons of capitulative semen to a fatuous, dystopian cuckold wank-mare. […]

Apple is selling a product. They’ve chosen to keep it closed for demonstrably reasonable benefits. And—yes, okay!—several collateral benefits that come from controlling the marketplace that services their products.

Three weeks later, Joel Johnson, Gizmodo, 2010-02-23:

If you need another example of why the iTunes App Store’s walled garden is flawed, Apple has been only too happy to oblige, capriciously and arbitrarily removing an unknown number of “sexy” apps without warning. […]

With a closed ecosystem comes a lot of responsibility. Apple has taken on the heavy mantle of arbiter, ostensibly to manage quality. I can forgive them for that, even if I don’t like it. But the only reason to ban blue apps is taste. And if these apps were a matter of taste, why were they approved in the first place? What will the next set of apps be that Apple decides are inappropriate long after people have spent hundreds of hours creating and marketing them? […]

Apple has made a declaration: that sex and sexuality are shameful, even for adults. But only sometimes. And only when people complain.

Unfortunately, they’ve accomplished the opposite. The only thing I’m ashamed of is Apple.

Looks like Joel Johnson was fine when Apple was blocking things he didn’t care about, like open source software and apps he didn’t use; but when they started blocking stuff he cared about, like jiggling boobs, suddenly he started to have second thoughts.

He still doesn’t quite get it, though: He still likes having nanny tell him what he can run on his phone “to manage quality”; he just wants nanny to make only decisions that he agrees with. Good luck with that.