Monday, 9 April 2018

The Guardian/Benjamin Haas: 'Killer robots': AI experts call for boycott over lab at South Korea university

Academics around the world voice ‘huge concern’ over KAIST’s collaboration with defence company on autonomous weapons

Benjamin Haas in Seoul @haasbenjamin

Thu 5 Apr 2018 09.20 BST
First published on Thu 5 Apr 2018 04.25 BST

More than 20 countries have already called for a total ban on killer robots ahead of a UN meeting next week on autonomous weapons. Photograph: Stephen Curry for the Guardian

Artificial intelligence researchers from nearly 30 countries are boycotting a South Korean university over concerns a new lab in partnership with a leading defence company could lead to “killer robots”.

More than 50 leading academics signed a letter calling for a boycott of the Korea Advanced Institute of Science and Technology (KAIST) and its partner, defence manufacturer Hanwha Systems. The researchers said they would not collaborate with the university or host visitors from KAIST over fears it sought to “accelerate the arms race to develop” autonomous weapons.

“There are plenty of great things you can do with AI that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern,” said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales. “This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms.”

The boycott comes ahead of a United Nations meeting in Geneva next week on autonomous weapons, and more than 20 countries have already called for a total ban on killer robots. The use of AI in militaries around the world has sparked fears of a Terminator-like situation and questions have been raised about the accuracy of such weapons and their ability to distinguish friend from foe.

Hanwha is one of South Korea’s largest weapons manufacturers, and makes cluster munitions, which are banned in 120 countries under an international treaty. South Korea, along with the US, Russia and China, is not a signatory to the convention.

Walsh became concerned when a Korea Times article described KAIST as “joining the global competition to develop autonomous arms”. He promptly wrote to the university with questions but did not receive a response.

KAIST’s president, Sung-Chul Shin, said he was saddened to hear of the boycott. “I would like to reaffirm that KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots,” Shin said in a statement.

“As an academic institution, we value human rights and ethical standards to a very high degree,” he added. “I reaffirm once again that KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control.”

KAIST opened the research centre for the convergence of national defence and artificial intelligence on 20 February, with Shin saying at the time it would “provide a strong foundation for developing national defence technology”.

The centre will focus on “AI-based command and decision systems, composite navigation algorithms for mega-scale unmanned undersea vehicles, AI-based smart aircraft training systems, and AI-based smart object tracking and recognition technology”, the since-deleted announcement said.

South Korea’s Dodaam Systems already manufactures a fully autonomous “combat robot”, a stationary turret capable of detecting targets up to 3km away. Its customers include the United Arab Emirates and Qatar, and it has been tested on the highly militarised border with North Korea, but company executives told the BBC in 2015 that “self-imposed restrictions” required a human to deliver a lethal attack.

The Taranis military drone built by the UK’s BAE Systems can technically operate entirely autonomously, according to Walsh, who said killer robots made everyone less safe, even in a dangerous neighbourhood.

“Developing autonomous weapons would make the security situation on the Korean peninsula worse, not better,” he said. “If these weapons get made anywhere, eventually they would certainly turn up in North Korea and they would have no qualms about using them against the South.”