Off my trolley problem

People interested in self-driving cars keep discussing the “trolley problem” — that is, in the event of an imminent accident, how should the car decide who gets to live and who dies? The problem is, ethics is a really tricky area of philosophy. So I have A Modest Proposal: technology can enable a free-market solution instead. Each passenger in a vehicle, or pedestrian with a smartphone, can place accident compensation funds in escrow and be issued a digital certificate stating the amount.
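To make the satire concrete, here is a minimal sketch of the escrow-and-certificate scheme. Everything beyond "escrow funds, get a certificate stating the amount" is my own invention: the class names, the `holder_id` field, and the cents-based amounts are all hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AccidentCertificate:
    """Digital certificate stating the funds a party has placed in escrow.

    Hypothetical structure: the post only says each passenger or pedestrian
    escrows compensation funds and is issued a certificate of the amount.
    """
    holder_id: str
    escrowed_cents: int


class Escrow:
    """Toy escrow service: accepts deposits and issues certificates."""

    def __init__(self):
        self._balances = {}

    def deposit(self, holder_id, cents):
        # Accumulate the deposit and issue a fresh certificate for the
        # holder's total escrowed amount.
        self._balances[holder_id] = self._balances.get(holder_id, 0) + cents
        return AccidentCertificate(holder_id, self._balances[holder_id])


escrow = Escrow()
cert = escrow.deposit("pedestrian-42", 50_000)  # a $500.00 deposit
```

How the cars' collision-avoidance systems would then bid against these certificates is left, mercifully, as an exercise.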

On AI and existential risk, continued

Bret Victor expands on something I mentioned in my article on AI: I am generally on the side of the critics of Singularitarianism, but now want to provide a bit of support to these so-called rationalists. At some very meta level, they have the right problem — how do we preserve human interests in a world of vast forces and systems that aren’t really all that interested in us? But they have chosen a fantasy version of the problem, when human interests are being fucked over by actual existing systems right now.

On AI and Existential Risk

[Updated 2015-05-24] Discussions of the existential risk of artificial intelligence mostly center on the possible consequences of intelligence explosion. This is a hypothetical moment where a general-purpose AI works out how to reprogram itself to make itself more intelligent. This leads to a feedback loop, and before long the AI is hundreds of times more intelligent than any human. This, it is imagined, leads to disaster for humanity. It’s becoming quite trendy to worry about this scenario, particularly amongst members of the rationalist community.
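The feedback loop in this story is just compounding growth, which a toy model makes plain. The `gain` and `cycles` parameters below are purely illustrative, not claims about any real system.

```python
def intelligence_explosion(initial=1.0, gain=1.5, cycles=12):
    """Toy model of the hypothesized feedback loop: each self-improvement
    cycle multiplies capability by a fixed gain. Parameters are illustrative.
    """
    level = initial
    history = [level]
    for _ in range(cycles):
        level *= gain  # the AI uses its current capability to improve itself
        history.append(level)
    return history


levels = intelligence_explosion()
# Twelve cycles at a 1.5x gain compound to 1.5**12, roughly 130x the
# starting level — the "hundreds of times more intelligent" of the story
# is just a few more turns of the same loop.
```

The model also shows where the story smuggles in its conclusion: everything depends on `gain` staying above 1 indefinitely, which is exactly the contested assumption.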

A.I.

We finally got to watching A.I. Artificial Intelligence. We’re probably the last people alive who hadn’t seen it, so I trust you will allow me the indulgence of a few spoilers in the course of my criticism. Let’s start with the big issue: the movie has the most egregious deus ex machina ending I have seen in years of movie-watching. It’s so hideous that it could be used as the canonical example when educating future generations of movie makers in what not to do.