Google is having their big yearly convention, and they demo'd Duplex. I'm kind of blown away. The mix of AI, Machine Learning, and Voice Recognition is pretty fucking incredible. Watch this video for an example: FOCUS: What do you think? Would you use it? How would YOU feel if you were on the receiving end? ALT-FOCUS: The ethics and morality of using robots and AI, and the SkyNet scenario actually coming true... what say you?
In the near-term, this will become routine. Just like how cameras on phones replaced actual cameras, AI will replace actual calls. In the long-term, Elon Musk's own words say it better than I ever could: Eventually we will create something that is smarter than us, and that is very scary.
All it takes is one really smart psycho. And even though we designed them, machines can already do things we can't: lift a hundred metric tonnes like it's nothing, throw vehicles into space, or do half a billion math calculations in less than a second. In a "Terminator World" there would be no John Connor, no revolution, no war. We'd be dead.
A lot of really smart people have started saying the reason we haven't found extraterrestrial life isn't that it's not there; it's that machines become the dominant life on a planet after the creation of AI, and we're not looking for planets that support machines. Google did a talk where they implemented a machine learning algorithm into the HVAC system on one of their server farms, and they saw a decrease in energy use of about 30% compared to when humans controlled it. This sort of optimization will spread into other aspects of our lives, and with the creation of AI the pieces will already be in place for the networks to take over things like power generation, farming, etc. The AI will simply want to protect its power generation, and we'll just get cut out of the equation, go back to being hunter-gatherers, and then simply cease to be, kinda like the Neanderthals. Don't worry though, it's just natural evolution.
Yeah, I love the Terminator movies, but the fundamental plot didn't make sense. Why bother creating a nuclear holocaust when the computer could just develop a germ that kills all human life? Sam Harris talks about this a ton on his podcast. The other scenario is that after the AI becomes intelligent to the point where humans can't even understand it anymore, it might be given a task like wiping out all disease and decide the most efficient way to do so is just wiping out all humans. Not exactly malevolent in intention, just working in terms of pure efficiency.
That's what happens when you give no quarter and have no emotion. Wiping out an entire species is like tying your shoe.
I'm looking forward to the first "deadly environment" where AI is in charge and then comes to the realization that humans are a parasite. Back in the '80s we got to see Kirk Douglas' junk when he starred in this movie with Farrah Fawcett (the hot version) and Harvey Keitel: While that was '80s sci-fi fermunda cheese at its finest, the real possibility exists that the Machine Learning and AI systems of some future Mars colony, tasked with controlling, say, environmental systems, could perform an optimum-efficiency analysis, conclude that humans are a net loss for the system as a whole, and act accordingly.
Now just imagine those things with motion-targeting sensors, ultra-sonic receivers and guns on every side of them.