Autonomous killer robots 'could be developed in 20 years'
Fully autonomous robots that decide for themselves when to kill could be developed within 20 to 30 years, or 'even sooner', a report has warned.
Militaries across the world are said to be 'very excited' about machines that could be deployed alone in battle, sparing human troops from dangerous situations.
The U.S. is leading development in such 'killer robots', notably unmanned drones often used to attack suspected militants in Pakistan, Yemen and elsewhere.
Drones are remotely controlled by human operators and unable to kill without authorisation, but weapons systems that require little human intervention already exist.
Raytheon's Phalanx gun system, deployed on U.S. Navy ships, can search for enemy fire and destroy incoming projectiles all by itself.
The Northrop Grumman X-47B is a plane-sized drone able to take off from and land on aircraft carriers, carry out air combat without a pilot and even refuel in the air.
But perhaps closest to the Terminator-type killing machine portrayed in Arnold Schwarzenegger's action films is a Samsung sentry robot already being used
in South Korea.
The machine is able to spot unusual activity, challenge intruders and, when authorised by a human controller, open fire.
<iframe width="640" height="480" src="http://www.youtube.com/embed/v5YftEAbmMQ" frameborder="0" allowfullscreen></iframe>
The warnings come from a new report by Human Rights Watch, which insists that such Terminator-style robots be banned before governments start deploying them.
Calling them 'killer robots,' it urges 'an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons.'
Such machines would mean that human soldiers could be spared from dangerous situations, but the downside is that robots would then be left to make highly nuanced
decisions on their own, the most fraught being the need to distinguish between civilians and combatants in a war zone.
Steve Goose, director of the arms division at Human Rights Watch, said 'killer robots' do not exist as yet, but he warned of precursors and added that the best way to forestall an ethical nightmare is a 'preemptive, comprehensive prohibition on the development or production of these systems.'
- Militaries around the world 'very excited' about replacing soldiers with robots that can act independently
- U.S. leads the way with automated weapons systems, but drones still need remote control operator authorisation to open fire
- Human Rights Watch calls for worldwide ban on autonomous killing machines before governments start using them