OHCHR / INTERNET CONTENT SHUTDOWNS

14-Jul-2021 00:04:43
Briefing reporters in Geneva, a UN Human Rights Office (OHCHR) official said, “We can, and should, make the internet a safer place, but it doesn’t need to be at the expense of fundamental rights.” UNTV CH
Size / Format:
680.44 MB – 1080p/29.97
680.02 MB – 1080i/29.97
677.38 MB – 1080i/25
DESCRIPTION
STORY: OHCHR / INTERNET CONTENT SHUTDOWNS
TRT: 04:43
SOURCE: UNTV CH
RESTRICTIONS: NONE
LANGUAGE: ENGLISH / NATS

DATELINE: 14 JULY 2021, GENEVA, SWITZERLAND
SHOTLIST
1. Wide shot, alley of flags Palais des Nations
2. Wide shot, podium room 27
3. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“We want to emphasize that we have the same rights online as offline. But when we look at the digital landscape, and you see a digital world that is unwelcoming and frequently unsafe for people trying to exercise their rights. You also see a host of government and company responses that risk making the situation worse. Recent developments in countries including India, Nigeria, the UK, the US and Vietnam have spotlighted these issues, and will be influential in how our online space evolves.”
4. Wide shot, participants
5. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“Discussions on how to address “lawful but awful” speech online tend to devolve into finger-pointing between States and companies, with political or economic interests often eclipsing public interests.”
6. Wide shot, participants
7. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“We have one overarching message to bring to this debate and that is the critical importance of adopting human rights based approaches to confronting these challenges. It is of course the only internationally agreed framework that allows us to do that effectively. We need to sound a loud and persistent alarm, given the tendency for flawed regulations to be cloned, and bad practices to flourish.”
8. Med shot, participants

9. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“To give you an idea, about 40 new laws relating to social media have been adopted worldwide in just the last 2 years, and another 30 are under consideration.”
10. Close up, participants
11. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“Virtually every country that has adopted laws relating to online content has jeopardized human rights in doing so. This happens both because governments respond to public pressure by rushing in with simple solutions for complex problems; and because some governments see such legislation as a way to limit speech they dislike and even silence civil society or other critics.”
12. Wide shot, podium reflection
13. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“Now, in the wake of abhorrent abuse of England football players, there are demands to get that legislation in place more quickly, as if the bill could have somehow protected the players from the racism they faced.”
14. Wide shot, conference room
15. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“There is a better way. We can, and should, make the internet a safer place, but it doesn’t need to be at the expense of fundamental rights.”
16. Close up, hand writing
17. SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“We’ve outlined five actions that could make a big difference: First, focus on process, not content. Look at how content is being amplified or restricted. Ensure actual people – not algorithms – review complex decisions. Second, ensure content-based restrictions are based on laws, are clear and narrowly tailored, and are necessary, proportionate and non-discriminatory. Third, be transparent. Companies should be transparent about how they curate and moderate content and how they share information with others. States also should be transparent about their requests to take down content or access users’ data. Fourth, ensure users have effective opportunities to appeal against decisions they consider to be unfair, and make good remedies available for when actions by companies or states undermine their rights. Independent courts should have the final say over the lawfulness of content. Fifth, make sure civil society and experts are involved in designing and evaluating regulations. Participation is essential.”
18. Wide shot, briefing room
19. SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“Of course, the ultimate tool to control online speech are internet shutdowns, including blocking specific apps and the partial or complete shutdown of internet access.”
20. Wide shot, briefing room
SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“Shutdowns don’t just interfere with speech - people rely on the internet for their jobs, their health and their education. The impact of shutdowns on elections – when open and safe spaces for public debates and public protests are vital - is particularly serious.”
21. Med shot, participants
22. SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“We face competing visions for our privacy, our expression, our lives, spurred on by competing economies and competing businesses. Companies and states alike have all agreed to respect human rights. Let’s start holding them to that.”
23. Wide shot, briefing room
STORYLINE
Briefing reporters in Geneva, a UN Human Rights Office (OHCHR) official said, “We can, and should, make the internet a safer place, but it doesn’t need to be at the expense of fundamental rights.”

A press briefing was held today (14 Jul) in Geneva by UN Human Rights Office on social media content moderation and internet shutdowns.

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“We want to emphasize that we have the same rights online as offline. But when we look at the digital landscape, and you see a digital world that is unwelcoming and frequently unsafe for people trying to exercise their rights. You also see a host of government and company responses that risk making the situation worse. Recent developments in countries including India, Nigeria, the UK, the US and Vietnam have spotlighted these issues, and will be influential in how our online space evolves.”

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“Discussions on how to address “lawful but awful” speech online tend to devolve into finger-pointing between States and companies, with political or economic interests often eclipsing public interests.”

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“We have one overarching message to bring to this debate and that is the critical importance of adopting human rights based approaches to confronting these challenges. It is of course the only internationally agreed framework that allows us to do that effectively. We need to sound a loud and persistent alarm, given the tendency for flawed regulations to be cloned, and bad practices to flourish.”

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“To give you an idea, about 40 new laws relating to social media have been adopted worldwide in just the last 2 years, and another 30 are under consideration.”

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“Virtually every country that has adopted laws relating to online content has jeopardized human rights in doing so. This happens both because governments respond to public pressure by rushing in with simple solutions for complex problems; and because some governments see such legislation as a way to limit speech they dislike and even silence civil society or other critics.”
Let’s take Vietnam’s 2019 Law on Cybersecurity, whose prohibited conduct includes “distorting history, denying revolutionary achievements,” and “providing false information, causing confusion amongst the citizens, causing harm to socioeconomic activities.”
That legislation has been used to force the deletion of posts, and many of those voicing critical opinions have been arrested and detained. Facebook initially challenged government orders to take down content, but reportedly has now agreed to restrict “significantly more content”, apparently as a condition for continuing to do business in Vietnam. In June, Vietnam adopted a new Social Media Code that prohibits posts that, for example, “affect the interests of the state.”
Laws in Australia, Bangladesh, Singapore and many other locations include overbroad or ill-defined language of this sort.

And the list keeps growing. In May, the United Kingdom tabled its draft Online Safety Bill, which sets a worryingly overbroad standard that makes the removal of significant amounts of protected speech – i.e. speech that international law says should be permitted – almost inevitable.

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“Now, in the wake of abhorrent abuse of England football players, there are demands to get that legislation in place more quickly, as if the bill could have somehow protected the players from the racism they faced.”

These laws generally suffer from the same five problems: poor definitions of what constitutes unlawful or harmful content; outsourcing of regulatory functions to companies; over-emphasis on content take-downs and the imposition of unrealistic timelines; powers granted to state officials to remove content without judicial oversight; and over-reliance on artificial intelligence and algorithms.

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“There is a better way. We can, and should, make the internet a safer place, but it doesn’t need to be at the expense of fundamental rights.”

SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“We’ve outlined five actions that could make a big difference: First, focus on process, not content. Look at how content is being amplified or restricted. Ensure actual people – not algorithms – review complex decisions. Second, ensure content-based restrictions are based on laws, are clear and narrowly tailored, and are necessary, proportionate and non-discriminatory. Third, be transparent. Companies should be transparent about how they curate and moderate content and how they share information with others. States also should be transparent about their requests to take down content or access users’ data. Fourth, ensure users have effective opportunities to appeal against decisions they consider to be unfair, and make good remedies available for when actions by companies or states undermine their rights. Independent courts should have the final say over the lawfulness of content. Fifth, make sure civil society and experts are involved in designing and evaluating regulations. Participation is essential.”

The EU is currently considering what promises to be a landmark law in this space – its Digital Services Act – and the choices made in that legislation could have ripple effects worldwide. The draft being debated has some very positive elements: it is grounded in human rights language, it is being developed through a participatory process, and it contains clear transparency requirements for platforms. Yet some contradictory signals remain, including risks of imposing over-broad liability on companies for user-generated content and of limited judicial oversight.

SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“We are also concerned about other approaches to regulating companies including requirements for legal representation mixed with threats of criminal liability, data storage and access, and taxation.”

India has faced serious incidents of incitement to violence online, clearly a factor in recent attempts to regulate online space. In February, India unveiled new Guidelines for Intermediaries and a Digital Media Ethics Code. This new law introduces some useful obligations for companies relating to transparency and redress, but a number of provisions raise significant concerns, including those empowering non-judicial authorities to request quick take-downs, obliging platforms to identify originators of messages and stipulating that companies must appoint local representatives whose potential liability could threaten their ability to protect speech, and even to operate. The threat of limiting protected speech and privacy has already surfaced, including in legal disputes with both Twitter and WhatsApp in the past month, which are now before the Indian courts.

A number of other countries have introduced or are considering extensive requirements for platforms to operate, including amendments adopted last year in Turkey, a ministerial regulation in Indonesia, and a recent proposal in Mexico; the impact of these measures varies significantly based on the political and legal context in which they are enforced.

SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“Of course, the ultimate tool to control online speech are internet shutdowns, including blocking specific apps and the partial or complete shutdown of internet access.”

Access Now’s #KeepItOn campaign has documented 155 shutdowns in some 30 countries during 2020, and 50 shutdowns already through May this year. The High Commissioner has expressed concerns on shutdowns in Belarus, Myanmar, Tanzania, and Iran.

In June, the Nigerian Government announced the “indefinite suspension” of Twitter after the platform deleted a post from President Buhari’s account that Twitter said violated its policy on abusive behaviour. Within hours, the country’s major telecommunications companies blocked millions from accessing Twitter, and later Nigerian authorities threatened to prosecute Nigerians who bypassed the ban. The ECOWAS Court of Justice has reportedly ordered Nigeria to refrain from such prosecutions pending the outcome of a case filed against the Twitter ban by a Nigerian civil society organization.

SOUNDBITE (English) Marcelo Daher, Human Rights Officer, UN Human Rights Office (OHCHR):
“Shutdowns don’t just interfere with speech - people rely on the internet for their jobs, their health and their education. The impact of shutdowns on elections – when open and safe spaces for public debates and public protests are vital - is particularly serious.”

In the face of all these challenges, social media companies have become something of a punching bag for everything that is wrong online – they are harshly criticized for failing to take down harmful content, and often face equally severe criticism when they do.

Much of this criticism is justified – the companies open themselves up for such complaints by their ill-defined and opaque policies and processes.

Nowhere was this clearer than in the Oversight Board’s review of Facebook’s decision to subject former US President Trump to an “indefinite suspension” of his account. The Board’s decision led Facebook to admit that it had not applied its “political figures” exception to President Trump’s account, to explain a “cross-checks” process that had not previously been known to exist, and to provide information on how its “strikes” policy works.

Another huge gap relates to the opaque and insufficient avenues people have to challenge company content moderation decisions. For example, when conflict began in Israel and Palestine in May, critics alleged that Palestinian voices were disproportionately affected by social media company actions. Facebook acknowledged problems with its automated moderation systems, including inadvertently classifying content referring to the Al Aqsa Mosque as associated with terrorism because the Al Aqsa Brigades is listed by the US as a terrorist group.

The actions companies take should be proportionate to the severity of the risk. Their options include a range of measures – not just take-downs but also flagging content, limiting amplification and attaching warning labels. Companies need to do much more to be transparent and to actively share information about their actions, policies and processes.

Social media companies also need to grapple with how they address content moderation issues globally. Context is essential to understanding the potential of speech to incite violence. While Facebook’s experience in Myanmar led to greater investment in what it calls other “at-risk countries”, social media companies’ capacity to understand language, politics and society in many locations across the globe remains limited.

The UN Guiding Principles on Business and Human Rights also make clear that governments have a duty to protect against human rights abuses by companies. Given the propensity for online harm, States need to put in place effective guardrails for company actions, including by requiring greater transparency and accountability. But the dangers of overstepping the mark are painfully apparent, as the countless people currently detained for online posts containing protected speech attest. State regulation, if done precipitously, sloppily or with ill intent, can easily consolidate undemocratic and discriminatory approaches that limit free speech, suppress dissent, and undermine a variety of other rights.

SOUNDBITE (English) Peggy Hicks, Director of Thematic Engagements, UN Human Rights Office (OHCHR):
“We face competing visions for our privacy, our expression, our lives, spurred on by competing economies and competing businesses. Companies and states alike have all agreed to respect human rights. Let’s start holding them to that.”
Alternate Title: unifeed210714d