Robots Have No Morals Other Than to Kill Humans. They Can’t Feel Pain or Right & Wrong. My Terminator Theory.

So this basically goes hand in hand with my Terminator Theory. A robot has no sense of right and wrong, which leads it to do what it naturally needs to do, and that is kill humans! They can’t feel pain, remorse, regret, or grief, so if they kill one human, what stops them from killing the next? They are bloodthirsty monsters that we need to find a way to punish, or in my opinion we should just throw them in the ocean.

“If you build artificial intelligence but don’t think about its moral sense or create a conscious sense that feels regret for doing something wrong, then technically it is a psychopath,” says Josh Hall, the scientist who wrote the book Beyond AI: Creating the Conscience of the Machine. Accordingly, robo-ethicists want to develop a set of guidelines that outline how to punish a robot, decide who regulates them, and even create a “legal machine language” that could help police the next generation of intelligent automated devices.

Read more here.

Via: WIE