Volume 2 - October 29, 2020
On November 4 (10am-11am PT), join Clara and Adelin (TSPA co-founders) for a webinar hosted by Marketplace Risk as part of their "Everything you need to know about..." series. Clara and Adelin will discuss the top 10 myths in content moderation. Register to attend here.
Trust & Safety in the Wild

Facebook’s Oversight Board Opens for Business: Here’s what you need to know

CNET first reported on September 25, 2020 that Facebook had officially launched its Oversight Board, a group of about 20 independent lawyers, former judges, journalists, and professors from over 30 countries. Facebook and Instagram users can submit cases to the board for review if they disagree with the social network's removal of their content. In the future, the board will also be able to review content that was left up by the social network but may be generating user complaints. Board members can decide to uphold or overturn Facebook's decisions, and the social network will be bound by those rulings.
The board will not act as content moderators, who decide whether individual posts comply with the social network's rules; rather, it acts as a court, hearing appeals from users who feel their posts were removed improperly. The board exists to support the "right to free expression" of Facebook's 2.7 billion users.
There have been many questions about whether the decisions the company makes serve its users or itself. Making the board independent of Facebook should, the company expects, give people confidence that its decisions are being made on the merits of each case, not on the basis of the company's interests.

For the full story, click here.

For more top Trust & Safety news, check out the links below:

YouTube Brings Back More Human Moderators After AI Systems Over-monitor

Facebook’s Content Moderation Errors Are Costing Africa Too Much

A Misinformation Test for Social Media

Facebook, Google, Twitter CEOs Clash with Congress in Pre-election Showdown
Case Studies

Suppressing Content to Try to Stop Bullying
August 26, 2020 - Mike Masnick/Copia Institute

The Trust & Safety Foundation is partnering with the Copia Institute to publish case studies showing how difficult it is to make online trust and safety decisions. One recent case study focused on TikTok limiting the visibility of content created by certain types of users to protect them from being bullied.

Click here for the full case study.

Handling Off-Platform Bullying on Platform
September 2, 2020 - Mike Masnick/Copia Institute

This case study focused on Twitch expanding its scope of investigating harassment to include abuse that happens off-Twitch, leading to suspensions of some users.

Click here for the full case study.

The Internet & Jurisdiction Policy Network is a multistakeholder organization addressing the tension between the cross-border Internet and national jurisdictions. They recently published a document that highlights operational approaches to cross-border legal challenges — relevant research for trust and safety professionals seeking to understand the current regulatory context. I&J continues to do in-depth analysis of how different norms, actors, and technologies interact. To read the full document, click the link below.
Content & Jurisdiction Program: Operational Approaches, Norms, Criteria, Mechanisms
April 2019 - Internet & Jurisdiction Policy Network
The Center for Democracy and Technology (CDT) has worked on content moderation and related issues, including intermediary liability and the role of artificial intelligence and automated decision-making systems, since its inception. Learn more about the past and current work they're doing to give input, shape policy, and push for greater transparency around practices that impact user rights.
Community Corner

The Internet Law and Policy Foundry (ILPF) is a collaborative organization for Internet law and policy professionals. Check out the ILPF job board for a list of open trust and safety opportunities here.

The Technology Coalition, formed in 2006, comprises tech industry leaders represented by individuals who specialize in online child safety issues. They are currently seeking candidates for their open Executive Director position; for more details, click here.
We Need Your Help

Feeling creative? Help us come up with a name for our monthly newsletter! Fill out the Google form here with your suggestion.

Later this fall, we will be hosting a virtual event focused on "How T&S Policies and Processes are Developed." This event will be available to both TSPA members and non-members. Please fill out the form here if you are interested in attending!

On November 19 (5pm PT), TSPA, the Internet Law and Policy Foundry (ILPF), and the SCU Internet Law Student Organization (ILSO) will co-present a virtual panel. This discussion will focus on careers in Trust and Safety and content moderation with an emphasis on JD Advantage roles and opportunities for students graduating with a law degree. Registration details to follow.
Election Workshop Recap

Earlier this month, TSPA hosted a 1.5-hour workshop with 45 professionals from TSPA founding corporate supporters, focused on "2016 vs. 2020: Protecting Democracy During an Era of Disinformation." The roundtable featured expert speakers including:

Renee DiResta, Stanford Internet Observatory Researcher 
David Agranovich, Global Threat Disruption Lead at Facebook
Lee Foster, FireEye Influence Operations Lead 
Camille Francois, Disinformation Expert 
Micah Schaffer, Founding member of YouTube’s Content Policy Team 
Rob Schaul, US Countering Foreign Influence Task Force
Graham Brookie, Director of the Digital Forensic Research Lab 

Six breakout groups discussed the following topics:
  • How Concerned Should We Be About Russia/Iran/China
  • Lessons Learned from Political Advertising, Content Labeling, and Current Events
  • Moderating Disinformation from Political/Influential Figures
  • Should We Be Worried About Deep Fakes and Other Emerging Threats During Elections
  • Federal and Local Responses to Election Security
  • Incorporating Open Source Researchers and Outside Experts in Countering Election Disinformation
TSPA will be sharing a more detailed event recap and resource summary in the weeks to come.
Visit Our Website
Copyright © 2020 Trust & Safety Professional Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.

Email Marketing Powered by Mailchimp