Fully automatic "smart" killer robots face ban
#1
And rightly so.

Shades of The Terminator or what...

Quote:'Killer Robots' could be outlawed

'Killer Robots' could be made illegal if campaigners in Geneva succeed in persuading a UN committee, meeting on Thursday and Friday, to open an investigation into their development

[Image: TARANIS_2733423b.jpg] BAE Systems' Taranis, a semi-autonomous unmanned warplane that will use stealth technology and can fly intercontinental missions and attack both aerial and ground targets. Photo: HANDOUT


[Image: Harriet-Alexander__1816135j.jpg]
By Harriet Alexander

7:00AM GMT 14 Nov 2013

The first steps towards the outlawing of "killer robots" could be taken on Thursday, as a UN committee meets to decide whether to investigate banning the controversial technology.

Campaigners are hoping that representatives from 117 states gathering for a two-day annual meeting in Geneva will agree to an inquiry into the development of the machines, which they say pose a serious threat to the world.

"People initially accused us of being in some kind of fantasy world," said Noel Sharkey, professor of artificial intelligence and robotics at Sheffield University, and one of the founders of the Stop the Killer Robots coalition. "But now they have realised that significant developments are already under way.

"At the moment we already have drones, which are supervised by humans I have a lot of issues with these, but they can be used in compliance with international law.

[Image: x-47b_2733409c.jpg][SUP]The X-47B (GETTY IMAGES)[/SUP]

"What we are talking about however is fully-automated machines that can select targets and kill them without any human intervention. And that is something we should all be very worried about."
The UN Convention on Certain Conventional Weapons (CCW) brings together representatives to discuss issues such as the use of chemical gases and landmines.
France is currently chair of the organisation, and campaigners are hopeful that Ambassador Jean-Hughes Simon-Michel, chairman of the CCW, will persuade delegates to support an inquiry. A single veto of the proposal, however, would prevent it from being discussed.
No country has admitted to developing this kind of technology, although Oliver Sprague, Amnesty International UK's Arms Programme Director, said that Britain, the US and Israel were the countries thought to be furthest down the road of development.
"We are not talking about Terminator-style robots," said Mr Sprague. "It is most likely to be a drone or something even more mundane, like a row of computer banks that look through the data, find the target and then call in the order for an attack.
"The UK has said that we would never develop systems that operate without a level of human control. But what does that mean? It could be as little as someone keeping a vague eye on a series of computer monitors."
The campaigners maintain that there is a well-founded fear that computer-controlled devices could "go rogue" or be hacked, jammed or copied by terrorists. They also say that we should not hand decisions over whether something is right and wrong to machines.
[Image: iron-dome_2733434c.jpg][SUP]Israel's Iron Dome system (GETTY IMAGES)[/SUP]
Professor Sharkey and his team point to the British development of BAE Systems' Taranis, which was tested last month: a semi-autonomous unmanned warplane that will use stealth technology and can fly intercontinental missions and attack both aerial and ground targets.
Named after the Celtic god of thunder, Taranis will follow a set flight path using on-board computers to perform manoeuvres, avoid threats and identify targets. Only when it needs to attack a target will it seek authorisation from a human controller, and experts fear that this human authorisation could eventually be dispensed with.
South Korea has invested in developing a robot, the SGR1, equipped with machine guns and grenade launchers, which it has tested to guard its border with North Korea. The robot uses infrared sensors to detect targets from up to two miles away and shoot them, although the manufacturer, Samsung Techwin, says that it remains under human control.
"The robots are not being deployed to replace or free up human soldiers," said Huh Kwang-hak, a spokesman for Samsung Techwin. "Rather, they will become part of the defence team with our human soldiers. Human soldiers can easily fall asleep or allow for the depreciation of their concentration over time.
"But these robots have automatic surveillance, which doesn't leave room for anything resembling human laziness."
To illustrate their point, Samsung Techwin produced a Hollywood-style six-minute advert for the technology, featuring invading "baddies" clad in face paint being identified by the SGR1 and halted by the voice of operators working from the base.
The United States Navy, meanwhile, has since the 1980s used a radar-controlled gun system on its ships, which campaigners say could become totally automated.
Raytheon, the manufacturer, describes the Phalanx Close-In Weapon System as "a self-contained package" which "automatically carries out functions usually performed by multiple systems including search, detection, threat evaluation, tracking, engagement, and kill assessment."
[Image: close-in_2733410c.jpg][SUP]The Phalanx Close-In Weapon System (ALAMY)[/SUP]
Last month more than 270 engineers, computer experts and robotics scientists from 37 countries signed a statement demanding a ban on the further development of automated systems on such robots.
They said in a joint statement that given the limitations and unknown future risks of autonomous robot weapons technology, their development and deployment should be made illegal.
"Decisions about the application of violent force must not be delegated to machines," they said.
And in April, the All Party Parliamentary Group on Weapons and Protection of Civilians heard a briefing from the campaigners.
Admiral Lord West, who was at the briefing, said: "I find the idea of artificial intelligence doing targeting and weapon delivery quite abhorrent, and I believe we need to do something to make that illegal globally. I think it is extremely dangerous."
A Ministry of Defence spokesperson said: "The MoD has absolutely no intention of developing any weapons systems that do not require any human involvement.
"Remotely-piloted aircraft systems used to protect ground troops in Afghanistan are all controlled by highly-trained military pilots and the rules of engagement are the same as for conventional weapons. There are no plans to replace skilled military personnel with fully autonomous systems."
If France succeeds in getting an agreement in the meeting this week, a group of experts will be convened to assess the technology and report back on their recommendations. Campaigners are hopeful that the technology could be banned in a similar way to the 1995 CCW prohibition of lasers which can be used to blind people.
"We are a way off a total ban, but this would be a vital first step," said Professor Sharkey. "There are a lot of tripwires to be stepped over first and obstacles to be navigated first.
"We are not calling for the ban of all robots in the military. We just want to see the end to those which, without any human intervention whatsoever, can choose who lives or dies."



The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14
#2
The scary thing for me is that once something can be done, technologically speaking, it will happen regardless of what we think about it. The PRISM surveillance revelations being a case in point. The predator zone is a moral half-way house to killer robots. The UN can make treaties banning them but will the US sign up? I doubt it. We really are heading full steam ahead into a cheap B-movie sci-fi dystopia.
#3
More of the same, but in more detail.

Imagine the concept: "death by algorithm".

Quote:

Robowar: The next generation of warfare revealed - a general's dream, but are they also humanity's nightmare?

[Image: 16.navyplane.GT2.jpg]
The armed forces of the West are close to perfecting a new generation of killing machines: autonomous robots that know neither pity nor fear


CAHAL MILMO


Friday 15 November 2013
Rather like a dog with a rubber bone, the Crusher likes to toy with its prey. After first sizing it up, it leaps, rolling and gripping its target until it is sufficiently chewed up and "dead". Unlike a dog, the Crusher is capable of performing this feat on a line of parked cars.




More disturbingly, this six-tonne, six-wheeled monster developed for America's Department of Defence and otherwise known as the Unmanned Ground Combat Vehicle, or UGV, can pounce without the intervention or say-so of a human operator. It is an ability which, in theory, can stretch to firing the machine-gun mounted on its roof.
The UGV is a forerunner of what many in the defence world believe is the next quantum leap in warfare: a generation of fully autonomous weapons which would be capable of crossing one of the great Rubicons of modern conflict by "deciding" for themselves when to take human life. In the words of one US general, they are the harbingers of an age where "death by algorithm is the ultimate indignity".
But as a result of a decision taken in a Geneva conference room today, Crusher and its cousins, dubbed "killer robots", could now be equally on the road to extinction after a vote by the United Nations Convention on Certain Conventional Weapons (CCW) which would pave the way to a global ban on autonomous weapons.
The unanimous vote means a multi-nation assessment of the technology will now begin, with the aim of yielding a "pre-emptive" prohibition before prototypes such as the Crusher become fully authorised weapons rolling off production lines from Texas to Beijing.
The decision by the CCW was greeted with relief by the Campaign to Stop Killer Robots, a coalition of human rights groups and campaigners who argue that there is only a narrow window before the world's competing powers are sucked into an arms race to produce machines capable, quite literally, of outgunning each other.
Mary Wareham, of Human Rights Watch, said: "This is a small but important step on a ladder which we are saying to governments we want them to climb and create a treaty."
Robotic military systems with varying degrees of lethality are under development and in some cases already deployed by the US, South Korea, China, Israel and the UK, as defence budgets around the world respond to the forces of austerity which demand greater capability for less money.
As one might expect with its vast defence budget, the US leads the sector and has advanced programmes to develop not only land-based robots like the Crusher but also the next generation of airborne drones such as the X-47B, a futuristic bat-shaped aircraft with far greater ability to "fly" itself than the Reapers and Predators used to pick off terrorist leaders in Yemen and Pakistan.
But other countries have already gone further in finding practical uses for robotic weaponry. South Korea and Israel have deployed armed sentries on their disputed borders with North Korea and the Palestinian territories, respectively.
Both systems, arrays of sensors, loudspeakers and guns capable of delivering a lethal shot over two miles, have a mode to fire automatically on an intruder, although each country insists the option to attack remains, for now, under direct human control.
Professor Noel Sharkey, the eminent roboticist at Sheffield University and co-founder of the International Committee on Robot Arms Control, told The Independent: "There was once a time when the world recognised the dangers and immorality of the aerial bombardment of cities. Shortly after that, the Second World War broke out and we all know what that resulted in.
"We must not allow the same tit-for-tat process to start with robotic weaponry. There is an absolute red line here which is that a machine must never be delegated the decision to kill a human."
He added: "There are such machines out there, but they are very far from being able to correctly discern the point at which to apply lethal force. We have been working on artificial intelligence since the 1950s but the difficulties are immense. A machine might be able to tell the difference between a ship and a tank, but it may well struggle to tell the difference between a tank and a civilian lorry with a plank of wood sticking out of it."
Professor Sharkey, who said the oft-cited Hollywood example of killer robots in the Terminator film series was "unhelpful" in explaining the reality of the technology, said thought also had to be given to the risk of the weaponry falling into the hands of totalitarian regimes or terrorists.
Proponents of the technology argue that, if properly fettered by software so advanced that it could tell the difference, for example, between a large child with a toy gun and a small adult with an AK-47, it could have a role in future wars. A robot cannot rape, nor can it be motivated by cruelty or vengeance, and it can crunch data to avoid civilian casualties at a rate no human could compete with, or so the argument goes.
But opponents say the delivery of death by a machine violates the first law of robotics as laid out in 1942 by the science fiction writer Isaac Asimov, that a robot's primary duty is to protect humans, and even within the military there are concerns that such scenarios cross a fundamental boundary.
A former US Air Force general made an impassioned plea earlier this year for action on a treaty to ban the killer machines. Major General Robert Latiff wrote: "Ceding godlike powers to robots reduces human beings to things with no more intrinsic value than any object. When robots rule warfare, utterly without empathy or compassion, humans retain less intrinsic worth than a toaster, which at least can be used for spare parts."
Governments have not been deaf to such qualms. Britain's Ministry of Defence, which is developing a "super-drone", has acknowledged that autonomous weapons meeting legal requirements are theoretically possible, but says the development of such systems would be expensive and difficult.
The US Defence Department issued a directive last year requiring that the decision to deploy lethal force must always remain with a human. But Human Rights Watch warned: "The policy of self-restraint [the directive] embraces may also be hard to sustain if other nations begin to deploy fully autonomous weapons systems."
Killing machines: Next generation of warfare
Crusher
Developed for the research arm of the Pentagon, the Crusher is an autonomous robotic armoured vehicle capable of picking its way across a battlefield using an array of sensors and crushing parked civilian cars. During tests in Texas it was fitted with a machine gun, but the vehicle remains a prototype.
X-47B
Billed as the answer to American generals' dreams of a generation of "super-drones" capable of being launched from aircraft carriers, this bat-like jet is far more autonomous than the current crop of pilotless vehicles being used in Afghanistan and Pakistan. Although prototypes are unarmed, it is capable of carrying weaponry.
Taranis
Named after the Celtic god of thunder, Taranis is the British answer to the X-47B and presages an age when combat aircraft will be pilotless. Taranis, which is built by BAE Systems, made its maiden flight in Australia last month and has been described as being capable of "full autonomy".
"Invisible Sword" and SKAT
Little is known about Chinese and Russian research into "killer robots" other than that it is going on. Both countries have unveiled pilotless military jets similar to those being developed by Britain and the US. The Chinese prototype, called Anjian or "Invisible Sword", is considered to be an air-to-air combat plane.
SGR-1
Developed by South Korea to watch the border with North Korea, this fixed robot sentry is capable of shooting without human command. Its sensors can detect a human from as far away as two miles and it can fire a machine gun or a grenade launcher. Israel has deployed similar technology, but both countries insist the robots will only fire after being given human orders.
Guardium
An Israeli robotic armed vehicle, it has been developed to patrol borders and sensitive sites such as airports. A promotional video describes a scenario where the vehicle automatically transmits co-ordinates for a missile strike to a pilotless drone.



The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14
#4
I wonder if they could be hacked to turn against their owners.
#5
The USA may be the only nation to have the big, bad ones now...but that 'killer-drone gap' is expected to be closed by MANY nations in just a year or two!....then will be the nightmare scenario just waiting to happen. The one with the stealth technology, the one Dave uploaded a photo of, can carry cameras, lasers and missiles, but can also carry nukes!...but they didn't want to tell you that. Hmmm....wonder why.
"Let me issue and control a nation's money and I care not who writes the laws. - Mayer Rothschild
"Civil disobedience is not our problem. Our problem is civil obedience! People are obedient in the face of poverty, starvation, stupidity, war, and cruelty. Our problem is that grand thieves are running the country. That's our problem!" - Howard Zinn
"If there is no struggle there is no progress. Power concedes nothing without a demand. It never did and never will" - Frederick Douglass

