
The Guardian

Opinion

Google's march to the business of war must be stopped

We stand with thousands of Google employees, demanding an end to its contract with the US Department of Defense

Lucy Suchman, Lilly Irani and Peter Asaro

Wed 16 May 2018 17.33 BST
Last modified on Wed 16 May 2018 18.28 BST

‘Should Google proceed despite moral and ethical opposition by several thousand of its own employees?’ A US remotely piloted aircraft in Iraq, 2015. Project Maven uses machine learning and artificial intelligence to analyse the vast amount of footage shot by US drones. Photograph: Cory Payne/USA/Rex/Shutterstock

Should Google, a global company with intimate access to the lives of billions, use its technology to bolster one country’s military dominance? Should it use its state-of-the-art artificial intelligence technologies, its best engineers, its cloud computing services, and the vast personal data that it collects to contribute to programs that advance the development of autonomous weapons? Should it proceed despite moral and ethical opposition by several thousand of its own employees?

Gizmodo reported this week that more than a dozen Google employees have resigned over Google providing AI support to a Pentagon drone program called Project Maven, which aims to improve the ability of drones to identify humans. This follows a public letter, signed by 3,100-plus Google employees who say that Google should not be in the business of war.

We agree with and support those employees, and we are joined by more than 700 academic researchers who study digital technologies. We support their demand that Google terminate its contract with the US Department of Defense (DoD), and that the company commit neither to weaponize the personal data it collects nor to support the development of autonomous weapons. We also urge Google’s executives to join other artificial intelligence (AI) and robotics researchers and technology executives in supporting an international treaty to prohibit autonomous weapon systems.

Google has long sought to organize and enhance the usefulness of the world’s information, and along the way it has taken responsibility for collecting our most intimate information, from our personal correspondence to our calendars, to our location data, to our private photos. Being entrusted with such personal information comes with the responsibility to protect it, and to use it carefully, in ways that respect the global makeup of those who contribute these records of their lives.

Given this grave responsibility, news of Google’s involvement in the defense department’s Project Maven alarmed many of us who study digital technologies. Maven is a US military program that applies AI to drone surveillance videos for the purpose of detecting “objects of interest”, which are flagged for human analysts. Google is providing not only AI technologies (potentially built in part on the personal data that Google collects), but also engineers and expertise to the DoD. Maven is already being used “in the Middle East” and the project is slated to expand by next summer, eventually being used on blanket surveillance footage from “a sophisticated, hi-tech series of cameras … that can view entire towns”.

Reports on Project Maven currently emphasize the role of human analysts, but the DoD’s ambitions are clear. These technologies are poised to automate the process of identifying targets, including people, and directing weapons to attack them. Defense One reports that the DoD already plans to install image analysis technologies onboard the drones themselves, including armed drones. From there, it is only a short step to autonomous drones authorized to kill without human supervision or meaningful human control. We already lack sufficient oversight and accountability for US drone operations. It’s unlikely that we would know when the US military takes Maven across the threshold from image analysis assistance to fully autonomous drone strikes.

Even without automated targeting, the US drone program has been extremely controversial, with many arguing that targeted killings violate US and international law. Targeted killings include “personality strikes” against known individuals named on “kill lists”, and “signature strikes” based on “pattern-of-life analysis”, which target people based only on their appearance and behavior in surveillance imagery. As a result, not only are bystanders frequently killed in strikes, but social gatherings of civilians, such as weddings, are sometimes directly targeted. “Every independent investigation of the [drone] strikes,” the New York Times reported in 2013, “has found far more civilian casualties than administration officials admit.”

The fact that military funding supported the early development of computing technology does not mean that it must determine the field’s future, particularly given the current power of the tech industry. With Project Maven, Google joins hands with the arguably illegal US drone program, and advances the immoral practice of statistically and algorithmically targeted killings. Google, a global company, has aligned itself with a single nation’s military, developing a technology that could potentially put its users, and their neighbors, at grave risk.

We are at a critical moment. Two months ago, Stanford professor and Google Cloud AI director Fei-Fei Li wrote an op-ed in the New York Times titled How to Make AI That’s Good For People. We call on Google’s leadership to live up to its ethical responsibilities by listening to people who challenge Google to expand its definition of “good”. We call on Google to expand its definition of “people” to include those already subjected to illegal drone strikes and data surveillance.

This week, in response to a question at the I/O developer conference, Google AI chief Jeff Dean stated that he opposes using AI to build autonomous weapons. We call on Google to support ongoing international efforts at the United Nations to ban the development and use of autonomous weapons. We call on Google to respect employees’ right to refuse work they find immoral or unethical. Google’s employees asked their company to leave money on the table and stay true to its words: “Don’t be evil.” Nowhere is this more urgent than in deciding whether to build systems that decide who lives and who dies.

We ask Google to:

    Terminate its Project Maven contract with the DoD.
    Commit not to develop military technologies, nor to allow the personal data it has collected to be used for military operations.
    Pledge to neither participate in nor support the development, manufacture, trade or use of autonomous weapons; and to support efforts to ban autonomous weapons.

• Lucy Suchman is a professor of anthropology of science and technology in the department of sociology at Lancaster University. Lilly Irani is assistant professor of communication, science studies and critical gender studies at the University of California San Diego. Peter Asaro is associate professor in the school of media studies at the New School in New York City