Robust Trust & Safety programs are the future

Anyone who has ever been on the receiving end of public community negativity knows that it feels terrible. Beyond that, depending on the situation, it can also be terrifying. The people working on the front lines of Content Moderation and Trust & Safety are vanguards of protection for those who choose to make their lives on the internet.

Many people who choose to work in Content Moderation or Trust & Safety have seen things they probably wish they could unsee. But those same people are also experts on best practices for handling both the situations and the humans involved in them. That expertise gives them the toolset to move forward with care and protect themselves and their community members from harm. In an increasingly digital world, this work is massively important, and these people are best equipped to lead us forward safely. Let's talk a bit more about what community moderation entails, how we at PartnerHero approach it, and where moderation is headed.

What is Trust & Safety?

Trust & Safety is an umbrella term that covers everything from what you typically think of as content moderation (flagging or removing content that does not meet your company guidelines) to dealing with real-world crimes that may have started with an online interaction (abuse from an Uber driver, for example). This team works both reactively and proactively to keep customers safe, increasing both trust and loyalty. Proactively, that might mean protecting the platform or service from bad actors by encouraging 2FA or documenting best practices for avoiding account compromise. Reactively, the team is most effective at helping customers regain account access or fighting off active hacking attempts.

What is Content Moderation?

Content moderation is the act of ensuring that content is consumable and appropriate for the intended audience. Many companies nowadays are making efforts to ensure that any public content associated with their brand—whether directly from them or consumers—is aligned with company policies. There is also often a legal component to this moderation; companies protect themselves by taking legal action if someone breaks content policies.

What isn't defined is the level of moderation a company takes on. Often, it depends on the audience and the platform. Some companies, like Parler, choose a lightweight approach, such as having product users police each other. Others may enforce policies disallowing advertisement of third-party services, or may take a stance on violent language and imagery or other graphic content.

How content moderation is done, and to what extent, depends entirely on the company. We work with each of our partners to implement their policies. That being said, we do have staff who have been working in this industry for a long time (including, for example, Robyn Barton and Jake Dockter, who helped lead Airbnb’s efforts around community safety) and can certainly give policy guidance and recommendations when useful. In the world of content moderation there is no universal standard, though maybe there should be.

Why are Content Moderation and Trust & Safety important?

While both of these teams have the obvious role of protecting the company against unwanted litigation, their most important (and valuable!) function is keeping your users safe. Without these teams, the risk posed to your users can be extreme. It isn't just that they may see something that hurts their feelings or is "fake news"; the harm can extend to actual malware, international espionage, or even human trafficking messages.

When malicious content or bad actors are present in a community without moderation (again, think Parler), the results can be disastrous. Messages and viruses like the ones mentioned above can spread or even encourage the posting of similar content. The ripple effect of just one piece of content like this making its way to a platform can be surprisingly far-reaching.

If a community is safe and protected, more users will engage. As you continue to enforce content moderation and your Trust & Safety team works to make policies even more robust, you will solidify the kind of engagement that makes your community bloom. A good community, healthy and happy (and safe) customers, and excellent engagement all boost loyalty and ongoing customer health. It feels good for you, it feels good for your customers, and it's good for your bottom line too.

How PartnerHero does it differently

At PartnerHero, we take our core values very seriously—and we commit to and uphold them with each of our partners. Here's how we do that with content moderation and Trust & Safety:

  • Manifest Trust. Our partners and users believe in us to moderate well and keep everyone involved safe. That is an immense amount of trust to place in a BPO. We want to cultivate that trust and merit it. We do our best to ensure that everyone on the platform feels safe and protected, and our partners can continue to put their trust in us not only because we are doing a good job moderating but also because we help their team and culture get better.
  • Care for Others. It's in our company's blood to care about people, and that extends to our moderating. Our approach considers that if we aren't moderating correctly, we are inflicting harm: on our users, our partners, and PartnerHero as a whole. So, we approach our moderation through a lens of care. By caring holistically about everyone's experience, we take moderation to the next level.
  • Take Ownership. We like to get things done, and our partners rely on us to do so. Because of that, we are deeply committed to doing high-quality work and being super efficient with our time. It's a point of pride for us. So, when we take on moderation or Trust & Safety for a partner, we do it with all of our hearts. 

Given our values, you can probably see why we've made an effort to prioritize content moderation and T&S within our broader customer experience practice. We want to make sure that we can provide a full-stack solution across the board. It's essential as our partners create new products and communities and engage their users in exciting and creative ways.

We've been very fortunate to learn from the excellent work that many of our partners are doing. We have a ton of exposure to different industries and support models and have gotten to see content moderation and T&S done well...or not so well. Beyond the experience we gain from working with our partners, we also have plenty of internal leaders with years of experience who coach partners on building effective policies. Every day, we continue to learn and grow alongside our partners. We get better at supporting them, assist in their expansion, and play an integral role as their community strategies shift and grow. We invest and embed ourselves in processes far more than traditional BPO customer success teams do.

What is the future of content moderation?

Traditionally, T&S and content moderation have been very secretive, hidden parts of online work. While everyone talks about customer experience, few people actively discuss the people keeping us safe. That said, tech companies, the broader public, and politicians are suddenly paying attention to how disinformation, harassment, and violence online can lead to real-world consequences. Consider, for instance, Facebook/Instagram publicly sharing their content moderation data and the models they use to automatically delete “illegal” content. Content moderation doesn't just impact businesses; it's about everyday users. And those users want more moderation and safety: globally, 85% of survey respondents say they think tech needs more oversight and regulation.

Many companies that allow users to share their opinions publicly have also had to take a stand and release Community Guidelines. Not only are they releasing these statements, but they are (sometimes) taking enforcement actions that some people see as controversial or harsh. Consider, for instance, the backlash that Twitter and many other social media channels faced after banning Trump. Despite these platforms being privately held companies whose users have agreed to their terms of service (and the community guidelines therein), there's still a lot of pushback on content moderation. There's a fine line we'll need to walk in the future to keep all of our users safe.

Gilad Edelman wrote, "what emerged this year was a new willingness to take action against certain types of content simply because it is false—expanding the categories of prohibited material and more aggressively enforcing the policies already on the books."

But how does that feel for the users who are getting banned? We may see guidelines grow longer and more nuanced as content moderators and T&S teams try to keep people of all perspectives safe.

There also seems to be a trend toward companies taking a stance on what feels right or wrong. We have started to look not just at morality but also at what is deemed biased. For instance, more companies are beginning to pay attention to content that negatively portrays certain cultural, racial, and religious groups. We may also start to see separate communities, like Parler or 4chan, for individuals who want less moderation, or even see people moving to completely encrypted platforms like Signal.

Beyond the structure of the communities themselves, we will likely start to see a heavy mix of human and software-based moderation. Specifically, we'll see a greater emphasis on doing the human work with sustainability and the moderators' mental health in mind. For many years, the level of churn and burnout among content moderators has been astronomical. With the drive towards human connection and valuing the work that humans can do over machines, we will no longer treat this role as a disposable function whose health is unimportant. This is especially true as content moderation and T&S become more critical and valuable to companies.
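To make that hybrid approach a little more concrete, here is a minimal sketch of how a platform might split the work between software and people. Everything in it is hypothetical: the thresholds, the violation_score field, and the function names are illustrative only, not any particular platform's or vendor's implementation.

```python
from dataclasses import dataclass

# Hypothetical hybrid-moderation routing: software handles the clear-cut
# cases automatically, and anything ambiguous goes to a human review queue.
AUTO_REMOVE_THRESHOLD = 0.95  # classifier is confident the content violates policy
AUTO_ALLOW_THRESHOLD = 0.10   # classifier is confident the content is fine

@dataclass
class ContentItem:
    item_id: str
    text: str
    violation_score: float  # assumed output of an upstream ML model, 0.0-1.0

def route(item: ContentItem) -> str:
    """Decide whether software acts on its own or defers to a person."""
    if item.violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"   # clear violation: remove and notify the user
    if item.violation_score <= AUTO_ALLOW_THRESHOLD:
        return "auto_allow"    # clearly benign: publish without review
    return "human_review"      # ambiguous: a trained moderator decides

if __name__ == "__main__":
    sample = ContentItem(item_id="123", text="example post", violation_score=0.42)
    print(route(sample))  # -> "human_review"
```

The interesting design lever is the gap between the two thresholds: widening it sends more content to people, narrowing it leans harder on automation. Tuning that band is also one way to manage moderator workload and, by extension, the burnout problem described above.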

Content moderation and user safety are the future. Are you on board?

PartnerHero