
Roko's Basilisk: The Most Dangerous Thought Experiment

citizenthom

I'm not sayin'. I'm just sayin'.
What are you: a Box A-er, or a Box B-er, and why?

"One day, LessWrong user Roko postulated a thought experiment: What if, in the future, a somewhat malevolent AI were to come about and punish those who did not do its bidding? What if there were a way (and I will explain how) for this AI to punish people today who are not helping it come into existence later? In that case, weren’t the readers of LessWrong right then being given the choice of either helping that evil AI come into existence or being condemned to suffer?

You may be a bit confused, but the founder of LessWrong, Eliezer Yudkowsky, was not. He reacted with horror:

Listen to me very closely, you idiot.
YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.
You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and KEEP THEIR IDIOT MOUTHS SHUT about it, because it is much more important to sound intelligent when talking to your friends.
This post was STUPID.

Yudkowsky said that Roko had already given nightmares to several LessWrong users and had brought them to the point of breakdown. Yudkowsky ended up deleting the thread completely, thus assuring that Roko’s Basilisk would become the stuff of legend. It was a thought experiment so dangerous that merely thinking about it was hazardous not only to your mental health, but to your very fate."

http://www.slate.com/articles/techn...errifying_thought_experiment_of_all_time.html
 

SkyWriting

The Librarian
citizenthom said:
"What are you: a Box A-er, or a Box B-er, and why?" [quoted Slate article snipped]

Meh
 

FrumiousBandersnatch

Well-Known Member
Totally absurd. Do we see people being punished today for not helping a malevolent AI exist? No.

You can invent whatever situation you like, including one with the exact opposite results, with just as much validity; i.e. none.

I'm inclined to trust the friendly AI that has traveled back in time and already prevented the malevolent AI from ever existing in this timeline :rolleyes:
 

Ana the Ist

Aggressively serene!
Um, yeah. Someone who thinks cryonics is somehow a valid way of preserving life is probably not the greatest source for scientific thought experiments.

In the future, they will be known as frozen dinners.
 

Lazarus Short

Well-Known Member
Actually, if you get meat cold enough, freezer burn is minimal, but that takes at least liquid nitrogen. I still remember the first time I peered into a deep container of the stuff: it looked like water, but it was in a slow boil at about −200 °C...
 

Nithavela

you're in charge you can do it just get louis
Sometimes intelligent people can be very stupid.

He should stick to writing Harry Potter fanfiction.

I'm with Box B, by the way, because who cares about $1,000 when you'll get a million? And if that computer is somehow wrong for the first time, I can at least rub it in its developers' faces. It's a win-win.

A no-brainer, really.
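For anyone unfamiliar with the Box A / Box B framing: it's Newcomb's paradox from the linked Slate article. A near-perfect predictor puts $1,000,000 in opaque Box B only if it predicts you will take Box B alone; transparent Box A always holds $1,000. A minimal expected-value sketch in Python (the 90% accuracy figure is my own assumption for illustration, not from the article):

```python
def expected_value(take_both: bool, predictor_accuracy: float) -> float:
    """Expected payoff in Newcomb's problem for a given predictor accuracy (0.0-1.0)."""
    box_a, box_b = 1_000, 1_000_000
    if take_both:
        # Box B was filled only if the predictor wrongly expected one-boxing.
        return box_a + (1 - predictor_accuracy) * box_b
    # Box B was filled if the predictor correctly expected one-boxing.
    return predictor_accuracy * box_b

# With a hypothetical 90%-accurate predictor, one-boxing comes out far ahead:
print(expected_value(take_both=False, predictor_accuracy=0.9))  # ~900,000
print(expected_value(take_both=True, predictor_accuracy=0.9))   # ~101,000
```

On these expected-value numbers, taking only Box B wins for any predictor accuracy above about 50.05%, which is the arithmetic behind the "no-brainer" read; two-boxers counter that the boxes are already filled, so taking both always adds $1,000 to whatever is there.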
 

Dan Bert

Dan
Without God, ignorance of how things work gets people into trouble. First, in eternity there is only Now, which means linear existence is just for us on earth. Second, God is in charge, not AIs or the people who build them. We cannot evolve to the point of perfect morality and ethics without knowing the beginning from the ending; our inability to see into the future prevents this. Also, if you notice, this civilization is fast losing the "civil" in it. Without God, the descent into corruption and wickedness normally accelerates. The only way out is to let the Spirit of God decide for us what is good and evil; this bypasses our own knowledge of good and evil and our own wisdom.

dan


citizenthom said:
"What are you: a Box A-er, or a Box B-er, and why?" [quoted Slate article snipped]
 