View this email in your browser
Volume 3 - December 3, 2020
We are thrilled to announce that Charlotte Willner will be the founding Executive Director of the Trust & Safety Professional Association (TSPA) and its sibling organization, the Trust & Safety Foundation Project (TSF)!

We are excited for Charlotte to lead TSPA and TSF because of her pioneering work shaping the field of online trust and safety for more than a decade. Charlotte joins TSPA and TSF from Pinterest, where she has been the Head of Trust and Safety Operations, overseeing online safety, law enforcement response, and intellectual property matters. Before that, at Facebook, she led international support and built out the company's first Safety Operations team. Charlotte has made a huge impact in her in-house roles in trust and safety, and we’re excited to see her continue to shape the broader community too. Click here for the full announcement.
We are also excited that Fatima Alam has joined us as a new advisor!

Fatima is a Global Compliance Manager at Netflix, where she works on content classification across a Global Content Regulation and Standards Board portfolio. Fatima previously worked in various roles on the Trust and Safety and Public Policy teams at Google in the US and India, where she supported and led the research, development, and enforcement of product policies related to controversial and harmful online content. Before that, she was a Human Rights and Technology Fellow at the Harvard Carr Center.


TSPA co-founders and board members Adelin Cai and Clara Tsao recently sat down with WebPurify to discuss the basics of content moderation. Click the links below to check out the full articles.

Best Practices for Trust and Safety Officers: An Interview with the TSPA’s Co-founders

How Content Moderation Can Make Online Spaces Safer for Your Child

Trust & Safety in the Wild
Can Social Media Platforms Stop Electoral Disinformation and Respect Free Speech?

Human Rights Watch discussed the challenges of online content moderation and elections in a report published on October 30, 2020.

Facebook, Twitter, and YouTube, among others, have expanded and refined their policies to fight foreign interference and stem the spread of misinformation and disinformation intended to suppress the vote and delegitimize election results. These changes include removing content and accounts that violate policies, as well as labeling misleading content and linking to corrective information from fact-checkers.

One of the challenges is that policies are not uniform across platforms, which leaves too much open to interpretation: some misinformation remains visible, while some legitimate political expression is removed.

Click here for the full story.

For more top Trust & Safety news, check out the links below:

YouTube, Facebook and Twitter Align to Fight Covid Vaccine Conspiracies

The Perils of Moderating Depression on Social Media

TikTok Lures Facebook Content Moderators to its Trust and Safety Hubs

Facebook Content Moderators Demand Better Coronavirus Protections
Sensitive Mental Health Information is also a Content Moderation Challenge
August 19, 2020 - Mike Masnick/Copia Institute

The Trust & Safety Foundation is partnering with the Copia Institute to publish case studies showing how difficult it is to make online trust and safety decisions. One recent case study focused on Talkspace, whose business model raises questions about how the self-described data-focused company manages sensitive user information, especially when that information is reviewed by humans.

Click here for the full case study.
Responds After a Teen’s Suicide is Linked to Bullying on the Site
August 19, 2020 - Mike Masnick/Copia Institute

This case study focused on the public outcry over a teen's suicide, linked to harassment by anonymous accounts on the site, which resulted in product updates and structural management changes.

Click here for the full case study.
Community Corner

The Internet Law and Policy Foundry (ILPF) is a collaborative organization for Internet law and policy professionals. Check out the ILPF job board for a list of open trust and safety opportunities here.

Tech Against Terrorism Webinar: Content Moderation: Alternatives to Content Removal
On December 16th, join Tech Against Terrorism for an e-learning webinar organized in partnership with the Global Internet Forum to Counter Terrorism (GIFCT), focused on strategies tech companies deploy to moderate their platforms efficiently and appropriately without relying solely on content removal. The workshop will examine the effectiveness and challenges of content removal and de-platforming for terrorist and violent extremist material and actors, weighing those approaches against other moderation strategies. Speakers will include representatives of the Trust and Safety Professional Association, Facebook's Oversight Board, and Minds. Click here to register.

Workshop: How Trust & Safety Policies and Processes are Developed
Join us at 9:00 AM PT on December 17th for a one-hour workshop focused on how trust and safety policies and processes are developed. Click here to register.
JD Advantage Careers in Trust & Safety Recording

The SCU Internet Law Student Organization (ILSO) recently hosted a discussion about JD Advantage careers in Trust and Safety, co-presented by the Internet Law and Policy Foundry (ILPF) and the Trust and Safety Professional Association (TSPA). The conversation explores some typical JD Advantage roles in Trust and Safety, explains how these roles differ from traditional counsel roles, and offers meaningful next steps for candidates pursuing legal-adjacent careers in Trust and Safety. Click here to check out the recorded discussion.
Visit Our Website
Copyright © 2020 Trust & Safety Professional Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.

Email Marketing Powered by Mailchimp