Bias by moderators in online chatrooms can be identified and fixed, new research finds

If you hoped to escape a deluge of partisan political promotion during the last days of this election season, your social media feeds probably won't provide much solace.

A Wall Street Journal report last week found "new X users with interests in topics such as crafts, sports and cooking are being blanketed with political content and fed a steady diet of posts that lean toward Donald Trump and that sow doubt about the integrity of the Nov. 5 election."

While leading political accounts on X (formerly known as Twitter) have seen their audiences dwindle before the election, a separate Washington Post analysis found some of their tweets are still "going mega-viral -- virtually all of them from Republicans." The Post also discovered that "Republicans have also seen huge spikes in follower counts over the Democrats, and their tweets have collectively received billions more views."

Those findings come as the University of Michigan last month released a new study probing how political bias in content moderation on social media creates so-called echo chambers.

Justin Huang, assistant professor of marketing at the University of Michigan Ross School of Business, and his collaborators, Ross School Ph.D. graduate Jangwon Choi and U-M graduate Yuqin Wan, examined the social media site Reddit to explore how its chatrooms, known as subreddits, are overseen by moderators, and the impact of those moderators' decisions to remove certain content based on their own biases.

"The type of user-driven content moderation we study is present on all of the major social media platforms, including Facebook, TikTok, Instagram, YouTube and X," Huang said in an interview posted on the University's website. "These platforms give users ownership and moderation control over online spaces such as groups or the comment sections of content they create, and there are practically no platform guidelines or oversight on how a user moderates."

Huang said his research has documented political bias in user-driven content moderation. In other words, moderators on Reddit remove comments that run afoul of their own political views.

"This bias creates echo chambers, online spaces characterized by homogeneity of opinion and insulation from opposing viewpoints," he said.

Researchers said this phenomenon can leave users with a distorted view of what's normal.

In some cases, they said, this could subvert democratic norms by radicalizing individuals, allowing misinformation to spread and reducing trust in election outcomes that run counter to the skewed worldview formed in online chatrooms.

Researchers said, however, that there are potential solutions.
