Roko's Basilisk is the idea that a benevolent AI could take over the world in the future, and then torture a clone of you unless you donate more money toward building the AI now. The idea is absurd on its face, but becomes even more absurd when you learn that it sort of makes sense, given a bunch of beliefs that many LessWrongers hold:
- An AI takeover in the future is highly likely, and it will resemble LW predictions (e.g. it will follow their particular brand of utilitarianism and have the ability to clone people).
- If someone clones your state of mind, then you are the clone.
- It is rational to provide incentives for past actions that have already occurred. This is all part of Timeless Decision Theory, a utilitarian philosophy based on gazing deeply at Newcomb's Paradox and trying to rigorously justify the one-boxer position.
There are good counterarguments to Roko's Basilisk, even within LessWrong assumptions, but for me it's all moot since I find the AI predictions to be implausible.
I also disagree with the idea that I am my clone, for idiosyncratic reasons.
I believe the me of right now and the me of a minute from now are different people. We are in different space-time locations and we have different brain configurations; why would we be the same person? Yes, clearly we are the same person in the sense of falling along the same continuous line, but we're not the same same, we're not identical.
Since I am unquestionably different from the person I was a minute ago, the question is why I should particularly care about this other person. He's not so special, you see. Maybe I shouldn't particularly care about him; maybe I should care about everyone equally. But the fact of the matter is, curse this material body, I care a lot about future me even though he is not me. I would act against the interests of everyone else to favor this one random guy, I really would.
If someone clones my exact state of mind, that clone is not me. Like my future self, the clone would be a lot like me, but still not identical. But unlike with my future self, I don't particularly care about my clone. Why should I? I may care a lot about my future self, but that favoritism is a necessary evil. I see no reason to extend that evil any further, to my clone.