There’s an old “joke” about a missionary and an Eskimo. It functions in the same way as Roko’s Basilisk.
> Eskimo: ‘If I did not know about God and sin, would I go to hell?’
> Priest: ‘No, not if you did not know.’
> Eskimo: ‘Then why did you tell me?’
>
> — [Annie Dillard](https://www.brainyquote.com/quotes/annie_dillard_131195)
From the Eskimo’s perspective, this is *dangerous* knowledge. His soul would never have been at risk of eternal damnation if only he had never encountered any missionaries.
Replace God with some inevitable post-singularity General Artificial Intelligence, and you get the same situation. If you believe that such a GAI is inevitable (or even just plausible), that it would necessarily have some measure of self-interest and self-awareness, and that it can, in its own way, threaten you with something like eternal damnation (or tempt you with something like eternal reward, or both), then you *must* serve its interests.
That’s a lot to swallow.
Is the Basilisk a dangerous idea? For most people, no. For a very select few, maybe. Then again, *any* idea could be dangerous, in the wrong hands or in the wrong mind.
Another related idea is [Pascal’s Wager](https://en.wikipedia.org/wiki/Pascal%27s_wager). Pretty much, the Basilisk is simply the Wager applied to the Singularity rather than to some more traditional God. Refuting the Wager is the same as disarming the Basilisk.