Phil is currently the Global Lead of Trust & Safety at TaskUs, a global digital outsourcer headquartered in Texas that provides content moderation services and technology to strengthen trust and safety on online platforms for high-growth technology companies across categories including social media, gaming, dating, and fintech.
After 15 years in the space, Phil holds a rare perspective, having worn several hats in risk and fraud and managed various customer-facing business operations at companies such as Cognizant and Twitter.
For Phil, who grew up in Johannesburg, everything started when he was 16, with South Africa's 1994 election, which saw Nelson Mandela become the country's first democratically elected leader. This landmark event coincided with the advent of the internet in the region, popping the information bubble South Africa had been trapped in and opening its citizens up to the rest of the world.
As he recalls in the conversation with Tiffany, “The internet consisted of chats, and forums and bulletin boards. And that's where I kind of spent my time.” On those platforms, Phil quickly noticed the huge amount of eye-opening information in circulation. In South Africa, “dangerous” information had meant anything that contradicted the government’s propaganda, so this new flow of information was a revelation for its population. “It just blew my brain, I fell in love with the idea that information and knowledge could transform an individual and transform society and that was the key to accessing education. At its core, the internet is access to information and equitable access to information.”
However, Phil could also immediately see the downside of that freedom: with the unfettered ability to provide and receive information, platforms and online communities can quickly become black holes of abuse, exploitation, and attack.
Some of these dynamics became evident during his early role as a content moderator at Twitter. “The big problems we were trying to solve in those days were really spam and porn.” Over the years, though, trust and safety threats have grown more radical and more complex, not least because they must be navigated in the many languages beyond English. The motivations of bad actors have also changed: once primarily financial, they now include distorting the truth to push personal or political agendas, and online speech can quickly turn into real-world violence. The spread of internet access into parts of the world where free speech has traditionally been restricted has also played a role in amplifying those episodes of violence.
As technology and accessibility have evolved, so has the format of content, which means there are more dark corners of the internet and the definition of toxicity is constantly changing. Moderation is now needed across live audio, memes, short-form video, and community rooms, among other formats. As Tiffany adds, “it has since become an arms race out there.”
Because the definition and manifestation of toxicity change across time and place, finding the right content moderation approach is all the more important. Today, effective content moderation means striking the right balance between technology and human input as the core mechanisms of your efforts.
For example, while AI can undoubtedly provide the tools and the scalability that moderation requires, it is human beings who are the real judgment-makers, and who can and should take a hyperlocal approach to laws and customs. It is far too risky to simply copy and paste a Western model of content moderation onto behavior in other jurisdictions.
Phil also adds: “You need to make sure that the staff are employed within markets where you operate. For example, if you have folks in India doing content moderation, they should be speaking to psychologists and mental health professionals in their market, because they understand the local health care rules, they understand the local ethical approach to mental health in the workplace, they understand the cultural nuances.”
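To make that division of labor concrete, here is a minimal, hypothetical sketch of a human-in-the-loop moderation flow in Python. Nothing here reflects TaskUs's or any platform's actual system; the thresholds, the toy keyword "classifier," and names like `moderate` and `ContentItem` are invented for illustration. The automated layer resolves clear-cut cases at scale, while everything ambiguous is routed to a human reviewer in the content's own market.

```python
from dataclasses import dataclass

# Illustrative thresholds; a real system would tune these per policy area.
AUTO_REMOVE = 0.95  # confidence above which content is removed automatically
AUTO_ALLOW = 0.05   # confidence below which content is allowed without review

FLAGGED_TERMS = {"spam", "scam"}  # toy stand-in for a trained model's signal

@dataclass
class ContentItem:
    text: str
    locale: str  # e.g. "en-US" or "hi-IN", used to route to in-market reviewers

def classifier_score(item: ContentItem) -> float:
    """Toy stand-in for an ML model returning P(content violates policy)."""
    words = item.text.lower().split()
    if not words:
        return 0.0
    return sum(w.strip(".,!?") in FLAGGED_TERMS for w in words) / len(words)

def moderate(item: ContentItem, queues: dict[str, list[ContentItem]]) -> str:
    score = classifier_score(item)
    if score >= AUTO_REMOVE:
        return "removed"  # AI handles unambiguous violations at scale
    if score <= AUTO_ALLOW:
        return "allowed"  # ...and unambiguous non-violations
    # Everything in between goes to a human reviewer in the content's own
    # market, reflecting the hyperlocal approach Phil describes.
    queues.setdefault(item.locale, []).append(item)
    return "queued_for_human_review"

queues: dict[str, list[ContentItem]] = {}
print(moderate(ContentItem("a perfectly ordinary post", "en-US"), queues))  # allowed
print(moderate(ContentItem("spam scam spam scam", "en-US"), queues))        # removed
print(moderate(ContentItem("is this maybe spam?", "hi-IN"), queues))        # queued_for_human_review
```

The interesting design choice is the middle band: widening it sends more content to humans (higher cost, better judgment), while narrowing it leans harder on the model. Where those thresholds sit is itself a policy decision, which is exactly Phil's point about humans remaining the judgment-makers.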
Technology, though, brings a unique set of challenges of its own. A key issue facing the industry right now is data flow: a client working with multiple partners may control and hold back its data, resulting in a disconnected system where critical information cannot be shared or acted on. The more closely that ecosystem can work together in a symbiotic way, the better trust and safety can be implemented.
Finally, you have to have people looking beyond what some would consider the “mainstream” online communities to understand what might be coming next. Keep an eye on, for example, Telegram and Discord channels, where emerging trends may surface first.
The conversation concludes with Tiffany and Phil sharing an important message of support for the digital well-being of moderators. Day in and day out, their role centers on exposure to toxic and traumatic content. It is extremely important for employers to support the trust and safety team by understanding their needs and providing them with specialized resources.
In addition, with global layoffs in the tech industry, the constant emergence of new threats, and the arrival of a whole new set of Web 3.0 governance questions, the next wave of industry innovation will bring new challenges, but also new opportunities to get things right.
Watch the full interview here or listen and subscribe to the Brand Safety Exchange podcast on Apple Podcasts and Spotify.