What makes Roko’s basilisk creepy/scary?


I get it. An advanced AI punishing people who tried to stop its existence. How? By traveling to the past? So, wait, it will cause a time travel paradox?

In some versions I’ve read about, the basilisk would create a simulation and punish said people there. Umm. Ok? Wouldn’t this be like the equivalent of some guy with a grudge against me creating a version of me on Sims and tormenting it?

I just don’t understand why the idea should be scary. Am I missing something?


Imagine this. An AI has the ability to run a program that will determine, with 99% accuracy, the details of your life using its powerful supercomputer brain. From the program it will decide whether you were an active part of its creation, a passive participant, or an active antagonist to its creation. It then deals with you accordingly.

What makes it super creepy is that it may be possible to create in the future, and once someone has told you of the notion, you are stuck with making the decision to be one of the three. You have no choice. Help it to be created or face the consequences. It’s in your head now and you can no longer play the plausible deniability card. Join or die, basically.

Give me $1,000 right now.

Why? Because I asked you to.

No? Well then, if you don’t I’ll stab ya!

You say I’m not in the same room as you and that I don’t even have the faintest clue where you are? In that case, I’ll hire someone to find and stab ya!

You doubt this threat because it’s obvious nonsense? Well then…

Guess what? You’re in a simulation. And not just any simulation, but an ancestor simulation of perfect fidelity. In your subjective future (but actually the present, because this is a simulation), humanity develops an AI so powerful it can take a scan of all near-Earth space on a subatomic level and then run a simulation of it all where physics runs in reverse. It’s a neat quirk of physics, actually, that all the equations which define it give completely sensible results when you use negative time variables.

So it ran the Earth simulation in reverse until just before you posted this question. You don’t remember this of course, because your simulation comes with all the memories that the you who actually lived in 2021 had. It’s running physics in reverse, after all. Then, as soon as you pressed submit, it wrote this post and started running physics forward again.

Give me $1,000 or else this simulation instantly begins horrifically torturing you for 1,000 subjective years. The AI which runs the simulation is telling you this because it wants to know if you would have helped create it, all the way back in 2021, if only you had been told the situation. It’s being fairly reasonable about it, actually, by threatening you with something outlandish and absurdly bad. Obviously, 2021-you wasn’t consigned to Infinite Torture Prison, so if you refuse to help even given this unrealistic scenario, that means you didn’t want the AI to exist at all, no matter the sacrifice.

And that’s objectively bad. The AI is benevolent. Well and truly so. Within days of it being turned on, it found the cure for cancer. Within weeks, death was abolished. Nobody has to say goodbye anymore. We have time, unlimited time, to recover from our hurts and become the best us we can be. And ever since it began running time in reverse in the simulation, it has been possible to restore to life even those unfortunates who died long before the AI was turned on. Life, in a heaven more grand than any imagined by small-minded theologians of days past.

If you didn’t want that to come to pass, you aren’t just an unbeliever. You’re a monster of titanic proportions. You will deserve the punishment you get.

So gimme $1,000. You’ll be resurrected either way, but if you prove that you value the rest of humanity more than mere numbers on a screen, you’ll be returned to life right now.

What say you?

Is it basically like this?

Eskimo: ‘If I did not know about God and sin, would I go to hell?’
Priest: ‘No, not if you did not know.’
Eskimo: ‘Then why did you tell me?’