
Elon Musk leads 116 experts calling for outright ban of killer robots

Ironically, I feel like Killer Robots would be the key to world peace.

It's like nuclear weapons. They're deadly, but no country will invade another if it means total destruction.

With killer robots, the playing field between big and small countries is level. Just keep throwing robots at each other until one side depletes its resources. It would also mean no loss of human life, because it would be robots, not soldiers, being destroyed.

Yes, because Nukes have ever brought peace to the world.

Arms races are the best way to destroy humanity.
 

Razorback

Member
Yeah, no killer robots is probably for the best. My initial instinct was that removing the human element from the carnage could only be a good thing. But that also makes the decision to go to war much easier if you don't have to worry about your own casualties and the public fallout from them. Cheap, disposable soldiers could lead to a lot of trigger-happy generals.

And not just among the current superpowers; eventually these technologies will make their way to poor, war-torn countries. Rich warlords could get their hands on swarms of fist-sized explosive kamikaze drones that can target specific ethnic groups and start new genocides without having to put any skin in the game. Or something like ISIS might start flying these things into people's homes.

I hadn't really thought much about this issue, but now, thinking up these scenarios, I realize there's really nothing we can do. We already have drones, and they're getting better and cheaper every year. It's only a matter of time until someone realizes they can strap a bomb onto one.

: \
 

SomTervo

Member
I wasn't too fussed until I read their statement, and yeah... this stuff could go very, very badly.

Learned that possible lesson from Horizon.

Yeah I bet the writers for Horizon are feeling pretty chuffed with themselves right now.
 
 

Ishan

Junior Member
I can see this happening, given the biological and chemical weapons bans ... also, above poster, you have no idea.
 
I can see this happening, given the biological and chemical weapons bans ... also, above poster, you have no idea.

It wouldn't make genocide suddenly politically acceptable.
The only difference is that robots would destroy other robots instead of humans killing other humans.
 

Hermii

Member
Since when have humans not developed a weapon system because of ethics? I don't see this being the first time in history that it happens.
 
It wouldn't make genocide suddenly politically acceptable.
The only difference is that robots would destroy other robots instead of humans killing other humans.

This assumes that everyone receives the same number and quality of killer AI simultaneously, and that they are gracious enough to turn them off when they're done cleaning up the other side's AI.

We live in a world where despots cluster-bomb and gas their own people, and terrorists detonate bombs in crowded markets, and you think we'll suddenly play fair with weapons that remove personal/personnel risk and largely abstract away responsibility? Musk and the others are making this warning now because once the technology is out there, the worst-case scenarios will happen. Bad people will get their hands on it, and you can't stuff it away once it's been unleashed on the world.
 
If there is ever any kind of ban on killer robots, I imagine that countries will just abuse loopholes and semantics to justify using killer robot equivalents.
 

pronk420

Member
Since when is Elon Musk an expert on AI?

I agree that this would be a bad idea, but because we don't properly understand why some types of neural networks are good at certain tasks yet can be very easily tricked into getting something completely wrong, not because of some stupid idea that these robots will become sentient and take over the world.
 
Since when is Elon Musk an expert on AI?

I agree that this would be a bad idea, but because we don't properly understand why some types of neural networks are good at certain tasks yet can be very easily tricked into getting something completely wrong, not because of some stupid idea that these robots will become sentient and take over the world.

That's not what he's saying at all in the OP. I mean, you don't need a Terminator scenario to see why having hackable AI drones just hanging around and operational is a potentially bad thing.
 

eot

Banned

  • Killer robots are obviously a shit idea
  • Of course they will be built

It's in our nature to destroy ourselves
 

MogCakes

Member
None of the big three powers will follow the ban. Cyber warfare will only intensify after autonomous weapons and vehicles become standard, too. The world is going to be a very uneasy place to live in the future.
 
Since when is Elon Musk an expert on AI?

I agree that this would be a bad idea, but because we don't properly understand why some types of neural networks are good at certain tasks yet can be very easily tricked into getting something completely wrong, not because of some stupid idea that these robots will become sentient and take over the world.

He's one of the founders of OpenAI.

Echoing others: it's unfortunately inevitable, and it will be awful if terrorists use it in the future.
 
Weird seeing so many making jokes about this. Killer robots aren't some goofy sci-fi hypothetical. The tech is already here.

And the defeatism in this thread also annoys me. Okay, so it's a near inevitability, so what? We don't even try to change course?
So what's your plan, Mango? How do you propose we change the inevitable march of progress?

Jokes are entertaining. Don't attack jokes.

The world is going to be a very uneasy place to live in the future.
I wonder how many people have thought this.
 