Miscellaneous

Killer Robots - Drone Strikes of the Future?

Published on Nov 23, 2012

"Despite a lack of public awareness and public debate a number of governments, including European states, are pushing forward with the development of fully autonomous weapons - also known as killer robots. These are weapon systems that will function without any human intervention. The armed robot itself will select its target and will determine when to fire. This is a frighteningly dangerous path to follow in terms of the need to protect civilians during armed conflict.

Killer robots would be unable to distinguish adequately between combatants and civilians in the increasingly complex circumstances of modern battlefields, and would be unable to make proper proportionality determinations. That is, whether the military advantages of an attack exceed the potential harm to civilians. Giving machines the power to decide who lives and dies on the battlefield would take technology too far. Killer robots would lack the human qualities necessary to protect civilians and comply with international humanitarian law. They would lack the ability to relate to humans and to apply human judgment."*

Ana Kasparian and John Iadarola (Host, TYT University) discuss "killer robots," the deadly autonomous weapons now in development. What will happen if these go unregulated? Will they become the drone warfare of the future?

*Read more from Steve Goose / Human Rights Watch:
"The Future of Global Warfare: Killer Robots"

November 20, 2012
 

The Dawn of Killer Robots

Published on Apr 16, 2015

In INHUMAN KIND, Motherboard gains exclusive access to a small fleet of US Army bomb disposal robots—the same platforms the military has weaponized—and to a pair of DARPA’s six-foot-tall bipedal humanoid robots. We also meet Nobel Peace Prize winner Jody Williams, renowned physicist Max Tegmark, and others who grapple with the specter of artificial intelligence, killer robots, and a technological precedent forged in the atomic age. It’s a story about the evolving relationship between humans and robots, and what AI in machines bodes for the future of war and the human race.
 

Autonomous Weapons: Information Technology and the Arms Race
April 27, 2015

Mark Gubrud, PhD, a member of the International Committee for Robot Arms Control, delivered a talk titled “Autonomous Weapons: Information Technology and the Arms Race” at 6 p.m. April 7 in Manning Hall 209. The talk was sponsored by the UNC School of Information and Library Science (SILS), the UNC Curriculum in Peace, War, and Defense (PWAD), and the Triangle Institute for Security Studies (TISS). For more information, visit sils.unc.edu/events/2015/Gubrud-AWS.
 

War machines are developing faster than our ability to regulate them

Published on Jun 5, 2015

Where the 20th century was an era dominated by organizational hierarchies, the 21st century is all about networks. Chris Fussell is a co-author of the McChrystal Group's best-selling book Team of Teams: New Rules of Engagement for a Complex World.

Read more at BigThink.com: bigthink.com/videos/the-future-battlefield

Transcript - The 20th century was all about hierarchies. If you want to create something, whether you want to start a country, create a product, whatever it is, your goal is to create a highly efficient hierarchical model and scale it, because that's what the competition's doing. And whoever does that the largest and with the most efficiency will eventually dominate the market, or be the dominant country, however you want to look at it. Everyone played some version of this game. The 21st century is dominated by networks, because with the introduction of the information age we can suddenly create free-flowing, globally distributed, organically shaped networks of individuals. It's a radically different environment for everyone. That translates into any space that you can imagine, really. Everyone's wrestling with some version of this, because we grew up in the bureaucratic model, and so we're trying to change not just the way we act but our psychology and how we view the world. And it's going to change the battlefield as well. You know it's inevitable: the technology curve continues to grow exponentially. One of the major areas where we're seeing that is the debate around unmanned vehicles.

So is a completely robotic battlefield out of the question at some point? No, I think it's out of the question not to think about that as a possible end state. We're so on the front edge of these debates that it'll be laughable, I imagine, 100 years from now. But the fascinating part is that if you look at the discussions around this type of technology, for the most part our nation states are still trying to solve it through their traditional bureaucratic thinking. How do I legislate for this? What does it look like, et cetera, et cetera. And there's going to be an exponential change in how this has real effects on the ground as the technology continues to grow. So now we have, you know, a single Predator-type overhead aircraft, unmanned, that can do, you know, X, Y, Z. A very, very significant jump over the past 20 years. Fast-forward that 20 years, and as the technology scale continues to increase exponentially, that could be a single aircraft with a network of thousands around it that are monitoring in real time on the ground, in the air, in buildings, whatever the case may be. Where the technology is pushing conflict is moving so much faster than our systems' ability to adapt and regulate it that it's going to be a real challenge for us over the next 10 to 15 years.
 

Published on Jan 28, 2014

The US military and the Defense Advanced Research Projects Agency (DARPA) are working on next-generation lethal robots to use in war. But what happens when the robots can decide when to kill on their own?
 

Robot Warriors, Terrorists, & Private Contractors: What Future for the Laws of War?
December 13, 2013

Professor Noam Lubell, from our School of Law, looks at the 'war on terror', the rising use of private contractors, and dramatic technological advances, explores how these developments are challenging our ability to regulate armed conflicts, and examines whether the laws of war are capable of fulfilling their purpose.
 

Fireside: Escalating Drone War Unnoticed

Uploaded on Nov 4, 2011

A Wall Street Journal article titled "US Tightens Drone Rules" says the CIA has made a series of secret concessions in its drone campaign after military and diplomatic officials complained that large strikes were damaging the fragile US relationship with Pakistan. But one of the biggest problems is that the CIA doesn't even publicly acknowledge its drone strikes. And for the most part, very few in the US notice, because records of strikes are kept secret by the government. The WSJ reports that during the summer it was decided that there should be new rules for drone strikes and that they should be launched more selectively. But where's the proof of any of those changes?
 

America's ex-drone pilot

Published on Aug 6, 2015

Brandon Bryant, a former drone pilot and sensor operator for the US Air Force, quit his job after five years in the drone program left him emotionally traumatized.

In this episode of Transmissions, Motherboard speaks with Brandon about his feelings of responsibility for the remote killing of people with Predator drones, the program's connection to Germany, and why drone warfare ultimately makes us lose our humanity.
 

Letting robots kill without human supervision could save lives

Published on Nov 13, 2017

Calls to ban killer robots ignore the fact that human soldiers can make lethal mistakes. If driverless cars will save lives, perhaps armed machines can as well.

Article "Letting robots kill without human supervision could save lives"

by David Hambling
November 8, 2017
 

No country would be safe from fully autonomous weapons

Published on Apr 5, 2018

The Campaign to Stop Killer Robots and thousands of artificial intelligence experts call for a ban on fully autonomous weapons. Such weapons would be able to identify, select, and attack targets without further human intervention. They are just around the corner. We must act now, before it's too late.
 

Does drone warfare reduce harm? Maybe not. | Abigail Blanco | Big Think

Feb 3, 2020

There has been a huge increase in drone usage since the war on terror. Proponents of drone warfare claim it reduces civilian casualties and collateral damage, that it's cheaper than conventional warfare tactics, and that it's safer for U.S. military personnel.

The data suggests those claims may be false, says scholar Abigail Blanco. Drones are, at best, about equivalent to conventional technologies, but in some cases may actually be worse.

Blanco explains how skewed US government definitions don't give honest data on civilian casualties. Drone operators can also suffer worse psychological repercussions than conventional pilots following a strike, because of factors such as the intimacy of prolonged surveillance and heat-sensing technology, which lets the operator observe the heat leaving a dying body to confirm a kill.
----------------------------------------------------------------------------------
ABIGAIL BLANCO

Abigail R. Hall is an Assistant Professor of Economics at the University of Tampa. She is the co-author of Tyranny Comes Home: The Domestic Fate of U.S. Militarism (2018, Stanford University Press). She is also an Affiliated Scholar with the Mercatus Center at George Mason University, an Affiliated Scholar with the Foundation for Economic Education, and a Research Fellow at the Independent Institute.
----------------------------------------------------------------------------------
TRANSCRIPT: People have often pointed to technology as a means of harm reduction, in particular if we look at the expansion of unmanned aerial vehicles, colloquially known as drones, in the war on terror. So we see a huge increase in the use of drones in foreign conflict. And typically we see that proponents of this type of technology make a variety of different claims as to its benefits. So things like: it reduces civilian casualties and collateral damage; it's cheaper in a monetary sense than conventional warfare tactics. But then they also make claims like, well, it's safer or preferable for U.S. military personnel. And while we don't have a robust amount of data on this topic, what we do have suggests that on all of these margins, drones are at best about equivalent to conventional technologies, but in some cases may actually be worse.
So UAVs have a higher failure rate than conventional aircraft, for example, as opposed to being surgically precise, which is often the terminology that's used by leaders. This technology is only as good as the intelligence that drives it. And that intelligence is often very poor. And so the data surrounding things like civilian casualty rates are not robust. They're not reliable at all. The U.S. government, for instance, has made claims that only a handful of civilian casualties have occurred as the result of drone strikes. However, you run into problems when you find out things like they define a militant as any military-aged male within a strike zone. So that is roughly, like, ages 15 to 65. So, of course, you're going to have civilian casualty rates that look relatively low if that's the case.

What's most interesting, I think, if people are really focused on the supposed benefits to U.S. military personnel, is the following data. Unmanned aerial vehicles actually take more personnel on the ground to operate than a conventional military aircraft. That is because, at this point, they require a number of individuals within the range in which they're operating. They also have to be guarded when they're not flying, and so this places a variety of personnel within harm's way, as opposed to a conventional military aircraft, which you can launch from an aircraft carrier. There are also some really interesting studies being conducted in psychology looking at the psychological effects of the use of UAVs on UAV pilots, and they're actually finding comparable or even higher rates of things like post-traumatic stress disorder and a variety of other psychological problems, because of the way that drone warfare is conducted as opposed to conventional warfare.

If you are a UAV pilot, you are watching your target for a prolonged period of time. And so you observe that target, you can see when he's going to the grocery store, or you observe him with his family. And then the strike is conducted. But when the strike is conducted, the drone doesn't leave. You're talking about technology that, from 30,000 feet, can take a clear photograph of something really small, like a coffee cup three feet off the ground. It's remarkable technology in that way. So they're watching these individuals for a prolonged period of time, but then, after the strike occurs, they're interested in having additional information. And so they watch.
 