Our current media ecosystem is facing critical trust and safety issues. If it is to achieve healthy and sustainable business growth while continuing to preserve the safety of its online communities, consistent industry standards are necessary.
Matt Soeth serves as the Head of Trust and Safety at Spectrum Labs, a tech company whose mission is to make the Internet a safer and more valuable place for all through AI-powered technology for content moderation. He joins the Brand Safety Exchange to share his perspectives on how organizations can utilize those standards in their operating principles.
In this episode, we also welcome our newly appointed host and managing director of Oasis Consortium, Erica DeLorenzo. The pair discuss ways to foster better, more positive user interactions within digital platforms and games, and the role of industry-wide collaboration in removing toxic content and bad actors at scale.
A Personal Journey From Protecting Students To Spearheading Industry Efforts
After spending time as a teacher in the early 2000s, Matt witnessed firsthand the bullying and harassment his students experienced with the rapid rise of the smartphone and the social media apps that followed. It is reported that 37% of young people aged 12-17 have been bullied online, with only 1 out of 10 victims informing a parent or trusted adult of their online abuse. This catalyzed Matt to launch the #ICANHELP non-profit to prevent cyberbullying and other forms of digital harassment through education and the promotion of student leadership. To date, the #ICANHELP movement has trained over 2 million students and educators to be a core part of the solution.
The outstanding success of this movement saw Matt join TikTok in 2020 as one of the earliest members of its Global Trust and Safety Team, helping to build the tools and resources necessary to monitor content and interactions on the fastest-growing social media platform of our time.
Eager for the next challenge after setting up frameworks at TikTok, Matt wanted to apply his expertise across multiple platforms to maximize his impact. The next stage of his journey brought him to lead the Trust and Safety team at Spectrum Labs, where he currently works across multiple platforms, from gaming to dating and social apps, advising them on policy ideas and best practices for educating users to respect those environments. Matt dubs it a “club for trust and safety professionals.” Spectrum Labs currently keeps 1 billion users safe online.
If Safety Standards Aren't Met, Profits Suffer Too
In the current policy landscape, a lot of great conversations are happening about what trust and safety online should look like, but they often lack the accompanying resources and knowledge to answer “how do we do that?”
Matt’s work, along with that of other professionals in this space, is to build those standards in a simplified way, so there are no barriers for organizations to implement solutions. From defining the norms of ‘good online behavior’ to building specialized content moderation teams and challenging built-in design choices, keeping communities safe on the web has to be an intentional effort.
An example analyzed on the Brand Safety Exchange podcast is that of Twitter, where the recent layoffs after Elon Musk’s takeover disproportionately affected the content moderation team, drastically reducing brand safety resources.
As Erica points out, profitability and maintaining high standards of online safety are interlinked. While Twitter’s service continued without disruption during the layoffs, its revenue operations plummeted, with revenue dropping 35% from the previous year. This is especially concerning when ads account for 90% of the company’s revenue. Major brands such as General Motors and General Mills pulled the plug on their ad spend over concerns about adequate online safety, especially after Elon Musk fired the entire Human Rights Team. Other platforms, such as Facebook, have seen similar results when safety teams are cut.
“If a platform doesn't have good safety practices, no one is going to want to advertise there,” explained Matt. “As long as you have certain types of content on your platform, even credit card companies won't allow transactions to happen if certain safety standards aren't met.”
Advertisers want to spend their dollars where users spend their time, and are therefore looking for places making the best effort to keep people safe. According to the Pareto principle, a common business and economics phenomenon also known as the 80/20 rule, roughly 80% of outcomes can be traced back to 20% of the causes. The same dynamic presents itself in digital communities, Matt says, where the vast majority of problems are usually caused by only a small percentage of the user base.
So, it comes down to putting in place the right tools from a user safety perspective and having dedicated teams. This includes thinking about how the workload could be automated, which technologies can drive efficiency, and how those teams could be empowered.
Working On The Trust Component Via A Collaborative Approach
Communicating those efforts to the end consumer is also an essential part of this process to foster loyalty, retention, and advocacy. Users now want to spend time on platforms that align with their values and are paying close attention to that. According to an IAB poll, 77% of respondents said that, as consumers, they look at brand safety as a key priority.
This leads many of the conversations back to how brands can rebuild trust. Matt believes transparency is the answer. He advocates for coming to a common consensus on what problematic behaviors look like, how the industry can collectively define them, and how brands can communicate with greater clarity on the actions taken to ensure their platforms remain accountable.
A collaborative and cross-functional approach to the challenge is what could really make a difference. The gaming industry is a great example of where we are witnessing a conscious effort to look at how to promote positive interactions within the platforms, rewarding the good, and acknowledging the risks so that players can feel more connected and empowered during those experiences.
As we embark on a new stage of industry standards in the upcoming era of Web3 and increasingly interconnected communities, it’s important for the biggest companies in the space to be very strategic in their implementation of both user and brand safety practices and policies. The best trust and safety approaches are constantly evolving and require investment in talent and expert business partners. TikTok, for example, is developing new initiatives with mental health experts specifically dedicated to reducing exposure to potentially harmful content around issues like suicide and eating disorders. This includes encouraging access to helpful resources for its audience and implementing tools that can prevent the monetization of content that violates its guidelines and standards.
Matt foresees a multitude of obstacles that must be addressed in the future, including the advancement of artificial intelligence and its associated ethical considerations, as well as ensuring inclusivity and safety as foundational aspects of design. Faith in the technology sector’s ability to incorporate these changes from the outset and embed them into the development of new digital infrastructure is what holds the potential for improved industry standards.
Key Trust & Safety Lessons
- Understanding brand and user safety standards is key for business growth, as advertisers and consumers increasingly choose to spend their time with platforms and companies that they can trust.
- Communicating trust & safety efforts to both customers and clients is an essential part of a sustainable business model. Accountability requires transparency.
- Keep up with the evolution of the Oasis User Safety Standards to ensure you have the best policies and solutions in place to tackle emerging issues.
Watch the full interview here or listen and subscribe to the Brand Safety Exchange podcast on Apple Podcasts and Spotify.