Here's what was actually said by Yuval Noah Harari:
Go to minute mark ~7:30 to 9:00.
Sorry, I don't know what claims you're referring to.
No. The notion you have claimed is arrived at by a logical deduction that presupposes philosophically believed truths. These are untestable because they are beliefs. There is nothing in the scientific method that makes the claims you are making.
Of course they are, that was the point! Well, that, and getting you to understand the difference between philosophical and methodological naturalism.
Nope ... all those references are based on philosophical positions, not on objectively tested evidence.
Honestly, I don't know what you're talking about anymore.
They are inconsistent arguments when viewed from science's inference-based, objective method.
This is a crucial matter to understand about science.
Injecting philosophically held, untestable tenets about the existence of untestable truths into science is the most egregious violation of the principles of science that I can think of.
That's not what was said; what he said is in the link I gave. You linked to an entirely different lecture of his.
Ok. So, he said something similar. Did you bother to listen to the minute and a half that I marked?
Obviously, Harari isn't saying that an A.I. Bible WILL be made, but he was suggesting that A.I. could write a text that would be considered sacred and accurate. What that would be, I don't know. A "correct" version of the Bible? A new world Scripture above all religious texts?
It's not clear at this point.
Okay ... so he said AI "could" write a bible.
Are you saying that the World Economic Forum didn't call for it to be done?
And did you catch his implication that all religions are wrong, when he said, "In a few years there might even be religions that are correct"?
It's not clear at this point. But there's an open future possibility.
You do realize, do you not, that the IQ of a computer is zero?
As the link I gave before said: NO, it didn't call for it to be done. Not in the slightest.
Okay ... thanks.
They have not called for it saying, "Thou Shalt Be Done."
No. It has only been strongly suggested by Harari, in the mode of a "warning," that it might be attempted some day, bringing with it problems for humanity, to say the least.
In the event it does happen, would you endorse it?
Personally, I expect that what Harari warned about back in 2018 or so is more along the lines of what we have to worry about.
Same question.
In the event it does happen, would you endorse it?
Why should they, or anyone, endorse it? Is your faith so weak that a machine threatens it so easily?
There. I said it. I feel better.
Nope.
I don't even plan to be around then.