WASHINGTON, November 10, 2021 – Some experts say they are concerned about a lack of diversity in content moderation practices across the technology industry because some companies may not be well-served – and could be negatively affected – by uniform policies.
Many say that following the lead of other influential platforms, such as by banning accounts, could do more harm than good when it comes to protecting free speech on the internet.
Since former President Donald Trump was banned from Twitter and Facebook for allegedly stoking the January 2021 Capitol riot, debate has raged about what Big Tech platforms should do when certain accounts cross the line from generally protected free speech into promoting violence, disobedience, or other illegal behavior.
But at a Knight Foundation event on November 2, participants heard that standardized content moderation policies imply a one-size-fits-all approach that would work across the tech spectrum. In fact, experts say, it won't.
Lawmakers have been calling on social media companies to commit to content and platform policies, including stronger protections for minors online. But representatives from Snapchat, TikTok, and YouTube who sat before members of the Senate Commerce Subcommittee on Consumer Protection last month declined to make that commitment.
Facebook itself has an Oversight Board that is independent of the company; the Board earlier this year upheld Trump’s ban from the platform but recommended the company set a standard for the penalty (Trump was banned indefinitely).
Among the proposed solutions for many platforms is a move toward decentralized content regulation, with more moderation delegated to individuals who are not employed by the platforms. Some have even suggested offering immunity from certain antitrust regulation as an incentive for platforms to adopt decentralized structures.
Costs of content moderation
At an Information Technology and Innovation Foundation event on Tuesday, experts suggested a level of decentralization built on user tools, as opposed to plowing money into employing content moderators.
Experts noted the expense of hiring content moderators. Global social media platforms must hire employees capable of moderating content in every language and dialect they serve, and the accumulation of these hiring costs has the potential to be crippling for many platforms.