Meta’s New Content Strategy Could Unleash Chaos and Hate Speech Online
As the digital landscape continues to evolve, the topic of misinformation and the regulation of online content has never been more pressing. On January 15, 2025, CommPRO’s Publisher, Fay Shapiro, will host a pivotal Communications Town Hall, with a star-studded lineup of industry experts to discuss Meta's controversial decision to end its third-party fact-checking program and shift to a crowdsourced moderation system. Among the featured guests is Neil Foote, founder and CEO of Foote Communications, a full-service public relations, marketing, and communications-consulting firm specializing in social media strategies, public affairs, multicultural marketing, and content management.
In advance of the event, we sat down with Neil to get his insights on this timely and crucial topic.
The Implications of Meta’s New Moderation Approach
Meta's decision to roll back its third-party fact-checking initiative has been met with concern from many corners of the communications and social media world. With this shift, Meta will now rely more heavily on user-generated content to flag and moderate misinformation. While some argue that crowdsourced moderation could lead to more democratic decision-making, Neil Foote raises serious concerns about its potential dangers.
"Free expression is great," Neil begins, acknowledging the value of open communication. "But free expression without any fact-checking translates into the potential for unleashing hate speech and trolling at record levels."
He points to the history of Facebook, which originally implemented fact-checking measures after users expressed frustration over the rampant spread of hate speech and misinformation on the platform. According to Foote, Meta's decision to replace third-party fact-checkers with crowdsourced moderation could have significant ramifications on the safety and integrity of the content shared on the platform.
“Zuckerberg seems to believe in the goodness and kindness of humanity to protect each other,” Neil notes. “That’s a naive approach in a world where demeaning people, or creating fictional narratives based on political talking points, is becoming the norm.”
This skepticism stems from Foote's view that the existing flaws in Meta’s moderation system—such as accounts being wrongfully flagged for certain keywords or tones—could be exacerbated under the new crowdsourced approach. He compares it to the early days of Wikipedia, which allowed users to edit content freely, yet was often riddled with inaccuracies. “The aggressive approach Wikipedia employs to challenge publishers for third-party validation is one way it has balanced ‘free expression’ and fact-checking,” Foote explains.
The Intersection of Free Expression and Public Safety
As the Communications Town Hall discussion will no doubt highlight, the delicate balance between protecting free speech and ensuring the safety and accuracy of content shared online is at the forefront of the conversation. Neil warns that Meta’s decision to roll back its fact-checking program and reduce its focus on diversity, equity, and inclusion (DEI) initiatives could further contribute to the divisiveness already apparent on social media platforms.
“Meta’s policy change comes at a time when their commitment to DEI is already under scrutiny,” Foote observes. “The unfortunate timing sends a message that Facebook may now be embracing a less inclusive, more divisive space—one that could have dire consequences for the platform’s reputation.”
His remarks underscore an ongoing concern: As social media platforms like Facebook continue to grow, the balance between ensuring open dialogue and curbing the spread of harmful, false content becomes more complex and more critical. Neil questions whether it’s even possible to foster harmony between voices advocating for diverse causes and those seeking to propagate hate.
What’s Next for Meta and the Digital Landscape?
With millions of users leaving platforms like X in recent months due to dissatisfaction with how their content is being managed, Meta’s new approach to content moderation may be setting the stage for its own challenges. Foote suggests that Meta could find itself back at square one, wrestling with the same issues it faced in 2017 and once again searching for a better solution.
"I’m afraid Facebook may find itself back at square one, [in] 2017, trying to find a better solution," Foote concludes. "If it hasn’t lost millions of users like X has over the past several months, Zuckerberg’s video and his current strategy could come off as tone-deaf."
This Town Hall promises to be an insightful and thought-provoking event as experts from across the communications field weigh in on Meta's evolving approach to content moderation, and the broader implications for social media platforms and society at large.
Join the Conversation
The Communications Town Hall will take place on January 15, 2025, from 11:00 AM to 12:00 PM ET. Hosted by Fay Shapiro, this session will feature a range of prominent thought leaders, including Jason Damata, CEO & Founder of Fabric Media; Nati Katz, VP of Strategic Communications at Futurum; Dominic Calabrese, contributing editor at CommPRO; Helio Fred Garcia, President of Logos Consulting Group; Linda Descano, Global Chief Integration & Marketing Officer at Havas Red; Tiffany Guarnaccia, CEO & Founder of Kite Hill Public Relations; Johna Burke, CEO & Global Managing Director at AMEC; Rida Bint Fozi, President of The TASC Group; and Neil Foote, President of Foote Communications, LLC.
Don't miss this chance to engage in a high-level discussion about the future of digital content moderation and the challenges ahead for platforms like Meta.