There are two big things that occupy my thoughts right now: artificial intelligence and the automation of jobs. I see superintelligent AI that is smarter than humans in most ways as likely to be created soon, and I think it's going to be terrible. If it's smarter than humans, there will be nothing we can do to stop it. In my view it will probably take over, or kill or enslave humans, or something like that. All I can do is think about the negative outcomes and ask: what does it mean when humans are no longer the smartest creatures on the planet?
I also worry about everyone's jobs being automated out of existence and that we will have nothing to do or derive meaning from. And this also seems imminent.
Is anyone else concerned about any of this?