Fully Autonomous Killer Robots...perhaps only a few years away.
#1
AMY GOODMAN: Killer robots? This sounds like science fiction, but a new report says fully autonomous weapons are possibly the next frontier of modern warfare. The report released Monday by Human Rights Watch and Harvard Law School is called "Losing Humanity: The Case Against Killer Robots." According to the report, these weapons would undermine the safety of civilians in armed conflict, violate international humanitarian law, and blur the lines of accountability for war crimes. Scholars such as Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield in England, also note that robots cannot distinguish civilians from combatants.

NOEL SHARKEY: There is nothing in artificial intelligence or robotics that could discriminate between a combatant and a civilian. It would be impossible to tell the difference between a little girl pointing an ice cream at a robot and someone pointing a rifle at it.

AMY GOODMAN: Fully autonomous weapons don't exist yet, but high-tech militaries are moving in that direction, with the United States spearheading these efforts. Countries such as China, Germany, Israel, South Korea, Russia and Britain are also experimenting with the technology. Human Rights Watch and Harvard Law School's International Human Rights Clinic are calling for an international treaty preemptively banning fully autonomous weapons. They're also calling for individual nations to pass laws to prevent the development, production and use of such weapons at the domestic level. For more, we're joined by two guests in Washington: Steve Goose, director of Human Rights Watch's Arms Division, which released the new report on killer robots, and Jody Williams, who won the Nobel Peace Prize in 1997 for her work with the International Campaign to Ban Landmines. She's also chair of the Nobel Women's Initiative. Her forthcoming book is called "My Name Is Jody Williams: A Vermont Girl's Winding Path to the Nobel Peace Prize." Steve Goose and Jody Williams, we welcome you to Democracy Now! Jody, just describe what these weapons are. Killer robots, what do you mean?

JODY WILLIAMS: Killer robots: when I first say that to people, they automatically think drones. Killer robots are quite different, in that there is no man in the loop. As we know with drones, a human being has to fire on the target. A drone is actually a precursor to killer robots, which will be weapons systems that can kill on their own, with no human in the loop. It's really terrifying to contemplate.

AMY GOODMAN: Steve Goose, lay them out for us and who is developing them.

STEVE GOOSE: As Jody says, the frightening thing about this is that the robots themselves, the armed robots, will be making the decisions about what to target and when to pull the trigger. It's a very frightening development. The U.S. is in the lead on this, in terms of research and development. A number of other countries have precursors and are pursuing this: Germany, Israel, South Korea, surely Russia and China, and the United Kingdom have all been doing work on this. It is for the future. Most roboticists think it will take at least ten, maybe twenty or thirty, years before these things might come online, although others think cruder versions could be available in just a few years.

AMY GOODMAN: I want to go to a clip from a video created by Samsung Techwin Company, which talks about weapons of the future.

SAMSUNG VIDEO: Samsung's surveillance and security system, with the ability of detecting and tracking as well as suppression, is designed to replace human-oriented guards, overcoming their limitations.

AMY GOODMAN: Steve Goose, explain.

STEVE GOOSE: Well, the system they're talking about is not yet fully autonomous, although it could become one. For that particular system, you still have the potential for a human to override the decision of the robotic system. Even there, there could be problems, because it is unlikely that a human will actually override a machine's decision. This is the kind of system we're looking at that could become fully autonomous, where you take the human completely out of the picture. They're programmed in advance but can't react to the variables that require human judgment.
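[A brief illustration may help here. The following Python sketch is hypothetical, invented for this discussion and not drawn from any real weapons system; it contrasts the design Goose describes, where an operator has a short window to veto the machine's choice, with a fully autonomous design where no veto point exists at all.]

```python
import time

def sensor_detects_target() -> bool:
    """Stand-in for a detection subsystem; always 'detects', for illustration."""
    return True

def human_override(window_s: float) -> bool:
    """Stand-in for an operator console. Returns True if a human vetoes within
    the window. Goose's point: the window is short and a rushed operator
    rarely countermands the machine, so we model 'no veto arrives'."""
    time.sleep(window_s)  # simulate waiting out the brief veto window
    return False

def engage_human_on_the_loop() -> None:
    # Semi-autonomous: the machine decides, but a human has a chance to say no.
    if sensor_detects_target() and not human_override(window_s=2.0):
        print("FIRE (machine's decision stood; no veto arrived in time)")

def engage_fully_autonomous() -> None:
    # Fully autonomous: the same decision, with the veto point deleted.
    if sensor_detects_target():
        print("FIRE (no human anywhere in the decision)")

engage_human_on_the_loop()
engage_fully_autonomous()
```

[Note that the two functions differ only by the override check: removing one conditional is all it takes to go from "human on the loop" to fully autonomous.]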

AMY GOODMAN: Who is driving this? Who benefits from killer robots, as you call them?

STEVE GOOSE: Well, no doubt there is a lot of money to be made in these developments. Military research labs, as well as universities and private companies, are all engaged so far. We know billions are going into the development of autonomous weapons systems, some fully autonomous and some semi-autonomous; even drones fall under that category. Certainly, there are going to be people in the military who see this as a great advantage, because you're taking the human off the battlefield, so you are reducing military casualties. The problem is that you're putting civilians at greater risk, shifting the burden of war from the military, which is trained to do this, onto civilians.

AMY GOODMAN: I want to turn to Tom Standage, the digital editor at The Economist. He points out there might be a moral argument for robot soldiers, as you've pointed out, Steve.

TOM STANDAGE: If you could build robot soldiers, you'd be able to program them not to commit war crimes, not to commit rape, not to burn down villages. They wouldn't be susceptible to human emotions like anger or excitement in the heat of combat. So, you might actually want to entrust those decisions to machines rather than to humans.

AMY GOODMAN: Let's put that to Jody Williams.

JODY WILLIAMS: You just quoted Noel Sharkey, who is a roboticist himself, and he talks about the ridiculousness of thinking that robotic weapons can make those kinds of decisions. He contends that for a machine it is a simple yes or no. I think another part of the equation that people don't think about when they want to promote developing robots is that robots cannot feel compassion or empathy. Oftentimes in war, a soldier will factor in other circumstances in making a decision whether or not to kill. It's very unlikely a robot would be able to do that. Another point, though, for me is this: if we proceed to the point of having fully autonomous killer robots that can decide who, what, where, when and how to kill, then technology will no longer be serving humans; humans will be serving technology. When American soldiers no longer have to face death on the battlefield, how much easier will it be for this country to go to war, when we already go to war so easily? It really frightens me.

AMY GOODMAN: Jody, you won the 1997 Nobel Peace Prize for your work around landmines. You headed up the International Campaign to Ban Landmines. Do you see a trajectory here over these, what, fifteen years?

JODY WILLIAMS: In terms of getting rid of weapons, or in terms of...?

AMY GOODMAN: In terms of the development of weapons and also the movement that resists it.

JODY WILLIAMS: Yes, I do. It's obvious that weapons are going to continue to be researched and developed unless organizations like Human Rights Watch, like the Nobel Women's Initiative and others around the world are willing to come together and try to stop them. Those of us who are starting to speak out against killer robots envision a campaign very similar to the campaign that banned landmines, which, by the way, also received the Peace Prize in 1997. We are already working to bring organizations together to create a steering committee for a campaign to stop killer robots, strategizing at the national, regional and international levels with partner governments, just as we did with landmines, and as Steve Goose, Human Rights Watch and other organizations did with their successful bid to ban cluster munitions in 2008.

AMY GOODMAN: Certainly the U.S. is at the forefront of robotic weapons, and certainly drones fit into that category. We're only beginning to see the movement, as people have laid their bodies on the line at Creech and at Hancock in upstate New York, the places where the drones are controlled from. But what about the U.S. role in all this? Let me put that question to Steve Goose.

STEVE GOOSE: You raised drones again, and Human Rights Watch has criticized how drones have been used by the Obama administration, criticized it quite extensively, but we draw a distinction here. We think of the fully autonomous weapons, the killer robots, as being beyond drones. With drones, you do have a man in the loop who makes decisions about what to target and when to fire, but with fully autonomous weapons, that is going to change. It will be the robot itself that makes those determinations, so it is the nature of the weapon itself that is the problem, which is not really the case with drones. From the United States, we're getting mixed signals. Clearly, the U.S. is putting a lot of money into this, and a lot of the DOD's planning documents, the military's planning documents, show that there are a great many who believe that this is the future of warfare, who envision moving ever more into the autonomous realm and who see fully autonomous weapons as both desirable and the ultimate goal.

AMY GOODMAN: Explain how a robot makes this decision.

STEVE GOOSE: It has to be programmed. It would be programmed in advance: they will give it sensors to detect various things and give it an algorithm of choices to make. But it won't be able to exercise the kind of human judgment that we think is necessary to comply with international humanitarian law, where you have to be able to determine, in a changing, complex battlefield situation, whether the military advantage might exceed the potential cost to civilians, the harm to civilians. You have to be able to make the distinction between combatants and civilians. Something as simple as that could be very difficult on a murky battlefield. So, we don't think these weapons could meet that standard.
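[To make the "algorithm of choices" concrete: the following Python sketch is purely illustrative, with every name and rule invented for this discussion rather than taken from any actual system. It shows how a pre-programmed rule table reduces the combatant/civilian distinction to whatever features the sensors can measure, which is exactly where Sharkey's ice-cream example breaks it.]

```python
# Hypothetical illustration only; invented names, not any real system's code.
from dataclasses import dataclass

@dataclass
class SensorReading:
    carrying_elongated_object: bool  # a rifle? an ice cream cone? a crutch?
    moving_toward_unit: bool
    inside_designated_zone: bool

def choose_action(reading: SensorReading) -> str:
    # A fixed rule table: every situation must have been anticipated in
    # advance. There is no judgment here, only feature matching.
    if reading.carrying_elongated_object and reading.moving_toward_unit:
        return "ENGAGE"  # Sharkey's girl with an ice cream lands here too
    if reading.inside_designated_zone:
        return "TRACK"
    return "IGNORE"

# A child approaching with an ice cream produces the same sensor features
# as a combatant approaching with a rifle:
print(choose_action(SensorReading(True, True, False)))  # -> ENGAGE
```

[The proportionality weighing Goose mentions never appears in such a table, because it cannot be reduced to sensor features fixed in advance.]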

AMY GOODMAN: Steve Goose, what about hacking?

STEVE GOOSE: Hacking can be a problem. Beyond that, if a killer robot sustains damage on the battlefield, that might affect how it is able to respond, and there could be countermeasures put forward by the enemy as well.
"Let me issue and control a nation's money and I care not who writes the laws. - Mayer Rothschild
"Civil disobedience is not our problem. Our problem is civil obedience! People are obedient in the face of poverty, starvation, stupidity, war, and cruelty. Our problem is that grand thieves are running the country. That's our problem!" - Howard Zinn
"If there is no struggle there is no progress. Power concedes nothing without a demand. It never did and never will" - Frederick Douglass
Reply
#2
Somewhere, Sydney Gottlieb is screaming, "That was MY idea!"
#3
I can think of nothing more sickening than this notion - not new, but nearing operational reality. A robot, or an army of them, could be programmed to define the 'enemy' any way the programmer wants - by ethnicity, language, location, beliefs, associations, saying certain words verbally or via the internet, political beliefs, voting pattern, height, weight, sex, you-name-it. They would like you to believe these benign bots would only replace soldiers in defending against invaders - but they can just as easily be used offensively as assassins, murderers, torturers, arresting bots or genocidal warriors. Larger ones can also be programmed to destroy not only humans or other living things, but entire areas, cities, even countries.

Without a moral shift in paradigms, I see the end of humanity coming down the pike rather quickly... we seem to be almost at that point already. While only a few would imagine and produce such things, that is all it takes to destroy human society. Sadly, several nations other than the USA [Russia, China, Israel, UK, perhaps others] are pursuing this horrible technology. Some will fly, some will walk, some will be tank-like things, others will have forms we have not seen. Their effect will all be the same. It is bad enough now with drones; with killer robots it will be infinitely worse. They will, of course, be built in large numbers and programmed to defend themselves. Sick is the only word that comes to mind. The ultimate weapon of death and destruction.

The fact that drones are accepted by most of society without debate makes me very afraid that killer robots with total autonomy could also be accepted. I pity those with children and grandchildren if they are not fighting against such notions and horrors. Would YOU trust those in power in any major nation today to have such things? I'd not... and never want them produced. They are possible today, and being developed and tested.
"Let me issue and control a nation's money and I care not who writes the laws. - Mayer Rothschild
"Civil disobedience is not our problem. Our problem is civil obedience! People are obedient in the face of poverty, starvation, stupidity, war, and cruelty. Our problem is that grand thieves are running the country. That's our problem!" - Howard Zinn
"If there is no struggle there is no progress. Power concedes nothing without a demand. It never did and never will" - Frederick Douglass
Reply
#4
'Killer robots' should be banned, say human rights groups

Autonomous drones that could attack without human intervention would make war easier and endanger civilians, says report

Richard Norton-Taylor
guardian.co.uk, Monday 19 November 2012 17.19 GMT

Pakistanis protest against US drone attacks in 2009. The report says the use of robots would undermine checks on the killing of civilians. Photograph: Banaras Khan/AFP/Getty Images

The use of autonomous drones, or "killer robots", that could fire weapons with no human control must be prohibited by international treaty, human rights campaigners and lawyers have said.

Weapons being developed that could choose and attack targets without human intervention should be pre-emptively banned because of the danger they would pose to civilians in armed conflict, they said.

Losing Humanity: The Case Against Killer Robots, a 50-page report by Human Rights Watch (HRW), warns that fully autonomous weapons would lack the human qualities that provide legal and non-legal checks on the killing of civilians.

"Giving machines the power to decide who lives and dies on the battlefield would take technology too far," said Steve Goose, the HRW arms division director. "Human control of robotic warfare is essential to minimising civilian deaths and injuries."

The New York-based campaign group said its report was based on extensive research into the law, technology and ethics of the proposed weapons. It was published jointly with Harvard Law School's International Human Rights Clinic.

They called for an international treaty, backed by national legislation, which would prohibit absolutely the development, production, and use of fully autonomous weapons.

Such weapons do not yet exist, and major powers, including the US, have not decided to deploy them. But precursors are already being developed. The US, China, Germany, Israel, South Korea, Russia and Britain are engaged in researching and developing such weapons. Many experts predict that full autonomy for weapons could be achieved in 20-30 years or sooner, according to the report.

"It is essential to stop the development of killer robots before they show up in national arsenals," Goose said. "As countries become more invested in this technology, it will become harder to persuade them to give it up."

Fully autonomous weapons would be unable to distinguish adequately between soldiers and civilians on the battlefield, or to apply the human judgment necessary to evaluate the proportionality of an attack, that is, whether civilian harm outweighs military advantage.

The robots would also undermine non-legal checks on the killing of civilians, the report warns. Fully autonomous weapons could not show human compassion for their victims, and autocrats could abuse them by directing them against their own people.

While replacing human troops with machines could save military lives, it could also make going to war easier, which would shift the burden of armed conflict on to civilians, says the report, echoing concerns already expressed by officials in Britain's Ministry of Defence.

The use of fully autonomous weapons also raised questions of accountability, which would erode another established tool for civilian protection, HRW said. Given that such a robot could identify a target and launch an attack using its own power, it would be unclear who should be held responsible for any unlawful actions it commits.

"Action is needed now, before killer robots cross the line from science fiction to feasibility," Goose said.
"Let me issue and control a nation's money and I care not who writes the laws. - Mayer Rothschild
"Civil disobedience is not our problem. Our problem is civil obedience! People are obedient in the face of poverty, starvation, stupidity, war, and cruelty. Our problem is that grand thieves are running the country. That's our problem!" - Howard Zinn
"If there is no struggle there is no progress. Power concedes nothing without a demand. It never did and never will" - Frederick Douglass
Reply

