The world of social media is huge — 4.2 billion social media-sharing Earthlings huge.1 Each has a voice of their own, using text, image, video, audio, or whichever combination they feel most comfortable with. And each harbors their own beliefs regarding how they ought to behave in a digital world.
That’s why there’s a need for guardrails of content moderation (CoMo) on this global winding road of social sharing.
The growth curve of daily content volume is staggering. More than 500 hours of video content is uploaded to YouTube every minute,2 more than 95 million photos and videos are uploaded to Instagram each day3 and over 500 million tweets are tweeted per day.4
What does it take to intelligently moderate this massive volume of content? We’ve identified three Big Factors.
First Big Factor: Moderating digital content requires humanity.
So, just how does bringing our humanity to the task of content moderation make a difference?
Technology often reflects the views of those who build it, and collective social preferences can introduce bias into algorithm design. More and more, we’re learning that the most effective CoMo makes the digital human: experienced human moderators working in concert with AI-driven moderation. In the years ahead, this human/tech synergy is going to be the gold standard for content moderation.
Content diversity and complexity are growing exponentially. So much so that it’s increasingly difficult to train and configure artificial intelligence to manage the torrents of content cascading across multiple media platforms without necessary human oversight.
That said, AI-automated moderation can be highly effective when decisions are rule-based, such as identifying and flagging profanity, some pornography, common hate speech, certain abusive phrases and the like. While we wait for AI to close the comprehension gap, human moderators will remain in the driver’s seat. Trained professionals can study user-generated content at a deeper level, considering intent, context, nuance, idiom, cultural norms and all the other subtle shades that make human communication, well, human.
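To see why rule-based automation handles the easy cases but not the nuanced ones, consider a toy sketch of such a filter. The blocklist entries and function names here are hypothetical placeholders; a production system would use curated, locale-specific lists and far richer models:

```python
import re

# Hypothetical blocklist -- real systems use curated, locale-specific lists.
BLOCKED_PATTERNS = [
    r"\bbadword\b",
    r"\bslur\b",
]

def flag_content(text: str) -> list[str]:
    """Return the blocklist patterns matched by `text` (case-insensitive)."""
    return [p for p in BLOCKED_PATTERNS if re.search(p, text, re.IGNORECASE)]

def needs_human_review(text: str) -> bool:
    """Route any flagged post to a human moderator for context-aware review."""
    return bool(flag_content(text))
```

A filter like this catches literal matches instantly and at scale, but it has no notion of intent, irony or idiom — which is exactly the gap human moderators fill.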
Second Big Factor: Deliver globally. Speak locally.
With so many social content consumers and creators spanning the globe, it’s obvious that a one-size-fits-all moderation system isn’t feasible. There are simply too many different languages, cultures and idioms to moderate from a “CoMo Central.”
A global delivery model that leverages the talents of moderators who can speak the languages of the geographies they cover appears to be the most effective solution to ensure accurate and appropriate moderation. Locally versed moderators are better able to inherently understand nuances specific to their regions’ cultures. A successful global content moderating strategy will likely involve partnering with a global service provider who can deliver customized solutions focused on business and geographical needs.
Another factor to consider is regulatory oversight, which may require greater agility in moderating local user behavior. For instance, local content moderation could involve case-specific judgment weighing local laws against what’s delineated in the Universal Declaration of Human Rights. It’s a delicate balancing act.
Third Big Factor: Finesse, feel and focus — now more than ever.
The social media landscape continually shifts under our feet. We’ve moved well beyond text-only communication into new territories.
Video is becoming the dominant medium, comprising more than 80 percent of consumer internet data traffic.5
The top five gaming livestream sessions on Twitch draw almost 13 million hours of viewing monthly.6
Live streaming exploded almost 100 percent between April 2019 and April 2020.7
If there’s a will to communicate, media will find new ways. Unfortunately, that includes delivering people’s opinions about the truth: fake news. It takes an extraordinary level of finesse from moderators supported by AI to navigate these uncharted media waters.
Mixed media can enhance our online lives, or do the opposite. That’s why it’s imperative that content moderators apply an extra measure of nuance and sound judgment when deciding what goes and what stays. It may sound simplistic to call this “feel,” but often that’s what’s required.
Obviously with so many grey areas to deal with, CoMo can be a high-stress field, often with individual moderators left out on an “island” with minimal support when making what are quite often snap decisions. That’s why maintaining the mental and physical well-being of content moderators has become a top priority for social media enterprises and their third-party CoMo providers.
Maintaining a safe and enjoyable online experience for billions of users is no mean feat. It’s going to demand a relentless focus on the part of the moderation community to not just keep pace with emerging media trends but also stay one step ahead. After all, in the world of social media, the one constant is that it never stays still.
Want to know more? We’d love to talk.
1 Datareportal.com 2021 https://datareportal.com/social-media-users
2 Statista 2019 https://www.statista.com/statistics/259477/hours-of-video-uploaded-to-youtube-every-minute/
3 Internet Live Stats via 99 Firms 2021 https://99firms.com/blog/instagram-marketing-statistics/#gref
4 Internet Live Stats 2021 https://www.internetlivestats.com/twitter-statistics/
5 Internet Digital/Futuresource 2021 https://www.streamingmedia.com/Articles/ReadArticle.aspx?ArticleID=144177
6 Streamhatchet 2021 https://streamhatchet.com/2021/03/31/top-live-stream-sessions-march-2021/
7 Statista 2021 https://www.statista.com/statistics/1030795/hours-watched-streamlabs-platform/