UN / KILLER ROBOTS

21-Oct-2019 00:02:48
Aiming for the creation of a new ban treaty to establish the principle of meaningful human control over the use of force, Mary Wareham, of the Campaign to Stop Killer Robots, said, “we believe the time is ripe, but very soon it will be too late, which is why there is a huge urgency to this.”

Available Language: English
Description
STORY: UN / KILLER ROBOTS
TRT: 02:48
SOURCE: UNIFEED
RESTRICTIONS: NONE
LANGUAGE: ENGLISH / NATS

DATELINE: 21 OCTOBER 2019, NEW YORK CITY / FILE

SHOTLIST:

FILE – RECENT, NEW YORK CITY

1. Wide shot, exterior United Nations headquarters

21 OCTOBER 2019, NEW YORK CITY

2. Wide shot, press conference dais
3. Close up, robot prop
4. SOUNDBITE (English) Mary Wareham, Campaign to Stop Killer Robots:
“There is no country that is vigorously advocating the development of lethal autonomous weapon systems. Even the countries who are investing, most of them will say ‘we have no plans to move down this path.’ Then, well, if you have no plans to develop killer robots, then what’s the problem with creating the treaty? So, we believe the time is ripe, but very soon it will be too late, which is why there is a huge urgency to this.”
5. Med shot, cameras
6. SOUNDBITE (English) Liz O’Sullivan, International Committee for Robot Arms Control:
“If we allow autonomous weapons to deploy and selectively engage with their own targets, we will see disproportionate false fatalities and error rates with people of colour, people with disabilities, anybody who has been excluded from the training sets by virtue of the builders’ own inherent bias.”
7. Med shot, journalists
8. SOUNDBITE (English) Jody Williams, Nobel Peace Laureate:
“Drones started out, you know, as surveillance equipment, and then suddenly they stuck on some Hellfire missiles, and they were, you know, killer weapons. I think that they were hoping, and really expecting that the larger community would not find out about the research and development of killer robots.”
9. Wide shot, presser
10. SOUNDBITE (English) Jody Williams, Nobel Peace Laureate:
“We need to step back and think about how artificial intelligence robotic weapons systems would affect this planet and the people living on it.”
11. Med shot, journalists
12. SOUNDBITE (English) Mary Wareham, Campaign to Stop Killer Robots:
“At the last diplomatic meeting in August, Russia and the United States were the key problems. They did not want to see any result, from what we could tell. Other countries that are investing heavily into ever increasingly autonomous weapon systems include China, South Korea, Israel, the United Kingdom to some extent; perhaps Turkey, perhaps Iran.”
13. Wide shot, press
14. SOUNDBITE (English) Liz O’Sullivan, International Committee for Robot Arms Control:
“We are exerting what little power we have against tech executives, who have a lot more. And these big companies do have a way of competing with each other to get large government contracts. There is a tonne of money in it, and it’s difficult to turn down a ten-billion-dollar contract that involves, you know, DOD computing for the cloud. These are the kinds of things that we are seeing, which makes it that much more troubling for all of us who don’t want to participate in killer robots or autonomy in weapons systems at all.”
15. Wide shot, end of presser

STORYLINE:

Aiming for the creation of a new ban treaty to establish the principle of meaningful human control over the use of force, Mary Wareham, of the Campaign to Stop Killer Robots, today (21 Oct) said “we believe the time is ripe, but very soon it will be too late, which is why there is a huge urgency to this.”

Wareham told reporters at UN Headquarters that “there is no country that is vigorously advocating the development of lethal autonomous weapon systems,” but added that, “if you have no plans to develop killer robots, then what’s the problem with creating the treaty?”

Liz O’Sullivan, of the International Committee for Robot Arms Control, said, “if we allow autonomous weapons to deploy and selectively engage with their own targets, we will see disproportionate false fatalities and error rates with people of colour, people with disabilities, anybody who has been excluded from the training sets by virtue of the builders’ own inherent bias.”

For her part, Jody Williams, who received the Nobel Peace Prize in 1997 for her work to ban landmines, noted that “drones started out, you know, as surveillance equipment, and then suddenly they stuck on some Hellfire missiles, and they were, you know, killer weapons.”

She said states “were hoping, and really expecting that the larger community would not find out about the research and development of killer robots.”

Williams said, “we need to step back and think about how artificial intelligence robotic weapons systems would affect this planet and the people living on it.”

Wareham pointed out that during meetings at the UN in Geneva in August, “Russia and the United States were the key problems” as they “did not want to see any result” towards the drafting of a ban treaty.

She said, “other countries that are investing heavily into ever increasingly autonomous weapon systems include China, South Korea, Israel, the United Kingdom to some extent; perhaps Turkey, perhaps Iran.”

O’Sullivan, who is a tech executive at a start-up firm, said, “we are exerting what little power we have against tech executives, who have a lot more. And these big companies do have a way of competing with each other to get large government contracts. There is a tonne of money in it, and it’s difficult to turn down a ten-billion-dollar contract that involves, you know, DOD computing for the cloud. These are the kinds of things that we are seeing, which makes it that much more troubling for all of us who don’t want to participate in killer robots or autonomy in weapons systems at all.”

Today’s press encounter was sponsored by the Permanent Mission of Austria to the United Nations and the Campaign to Stop Killer Robots.
Creator: UNIFEED
Alternate Title: unifeed191021d
Asset ID: 2480805