2010-10-21

Stop the war bots before they decide to kill us all, Australian bioethicist warns

Vatic Note: Is this something we have to worry about now, or only in the future? Well, this link was included on the page of the article to show how this could well be a danger in the hands of psychos, or of those proven unreliable in the ethics department. Then this link shows that they are already being put to use in a limited capacity. The robotic drones, however, are a different story, as we have read numerous times: they have erred repeatedly, been hacked by the enemy, and ended up killing civilians and schoolhouses full of children. Another good point made in the argument is that "diplomacy by other means" would become the choice of first resort instead of last resort. Why? (Or I should say it would have in the past, when we had humans running our government instead of robots: another good argument for shelving them.)

Because war would then be too easy, with no risk for the offender or aggressor. It could easily become the first choice for profit, revenge, or power. These decisions should be made now, before it is too late and too many civilians are harmed or die for lack of them. Remember, we are already faced with serious violations of war ethics as it is, even when the costs are high. The only reason war ethics no longer apply at this point in our history is that those leading the effort place no value on human life and see those fighting as nothing more than robots, cattle, expendable fodder for their war/profit marches toward controlling resources. In other words, psychos will do the same whether robots are an issue or not.

Stop the war bots before they decide to kill us all, Australian bioethicist warns
http://www.news.com.au/technology/australian-bioethicist-warns-of-robopocalypse/story-e6frfro0-1225934861279
By Helen Davidson, From: news.com.au, October 06, 2010
Provided to Vatic Project by Gypsy Flame, Australia

The International Committee for Robot Arms Control hopes autonomous war robots stay on the big screen.

AN Australian lecturer has warned of the dangers to humanity if we continue developing military robots.

Dr Robert Sparrow, senior lecturer for the Centre for Human Bioethics at Monash University, says that unmanned weapons systems encourage war and can give the "illusion of a god-like power".

Dr Sparrow is part of the International Committee for Robot Arms Control (ICRAC), a group dedicated to halting the development of robot weapons.

Their online mission statement states: "Machines should not be allowed to make the decision to kill people."

The committee held an "Expert Workshop" conference last month in Berlin to discuss the issues surrounding armed tele-operated and autonomous robot systems.

It's headed by controversial British scientist Professor Noel Sharkey, a professor of Artificial Intelligence, Robotics and Public Engagement at Sheffield University.

Professor Sharkey has previously written about the dangers of autonomous war robots and their increasing decision-making capabilities, imagining "a little girl being zapped because she points her ice cream at a robot to share".

A press release on the ICRAC website says the meeting was attended by government officials, representatives of human rights organisations, arms control experts, philosophers, scientists and engineers. (VN: I wonder if anyone was there from the United States?)

The committee calls for restrictions on the research and use of robotic weapons in their Statement of the 2010 Expert Workshop. (VN: And you just know that Israel will honor any such global policy restrictions; they proved that with nuke weapons, even further by stealing the technology and the fuel from the US.)

They hope that these parameters will help "to prevent the nightmare, so often foretold, of the loss of human control over the maintenance of security, the use of lethal force and the conduct of war, and of its surrender to an armed, autonomous technology".

However, the committee is not yet as concerned about Skynet waging war on the planet as they are about current robotic weapons.

Dr Sparrow told news.com.au that they would obviously be concerned if someone created and armed an Artificial Intelligence, but that the committee's main concern is with existing systems and "the way in which they have lowered the threshold of conflict".

He said the current robotic weapons "encourage governments to go to war where they otherwise wouldn’t be willing to because they think they can do it without incurring casualties in their own services".

"It gives an illusion of a god-like power that they can kill all the ‘bad’ people and there are few if any political problems that can be solved that way."

Citing Predator and Reaper drones used by the US military and the CIA against targets in Pakistan, Dr Sparrow said that the psychological separation provided by robot weapons between the attacker and their target is "complicated".

"In some ways, the people flying the Predator or the Reaper (the main drones), they see more of the consequences of their actions than someone who’s firing a cruise missile or dropping a bomb," he said.

"They actually see footage of the mangled corpses they leave behind.

"At the same time what they're doing looks perilously close to playing a video game."

Robot weapons systems such as the drones have a reputation for deadly accuracy, but Dr Sparrow disputes this, saying that they cause significant rates of civilian casualties.

"If someone thinks these are really discriminate weapons systems, the question I ask is 'would you be happy to have them operating over the city of Melbourne?'"

The ICRAC's central members are Dr Sparrow, Professor Sharkey, German physicist and peace researcher Jürgen Altmann and American Professor Peter Asaro.



The article is reproduced in accordance with Section 107 of title 17 of the Copyright Law of the United States relating to fair-use and is for the purposes of criticism, comment, news reporting, teaching, scholarship, and research.
