From Instagram’s Toll on Teens to Unmoderated ‘Elite’ Users, Here’s a Breakdown of the Wall Street Journal’s Facebook Revelations



A series of investigative reports being rolled out by the Wall Street Journal is putting a spotlight on Facebook’s behind-the-scenes actions. Ranging from rule exemptions for high-profile users to Instagram’s toll on teens’ mental health, the Journal’s “Facebook Files” expose internal Facebook research that appears to show just how aware the company was of the platform’s “ill effects.”

The Journal’s reports, three so far, are based on a review of internal Facebook documents, including research reports, online employee discussions and drafts of presentations to senior management. They reveal that the company has allegedly failed to fix numerous problems that it’s long known about, and in some cases made them worse.

“Time and again, the documents show, Facebook’s researchers have identified the platform’s ill effects. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them,” the Journal reports. “The documents offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the chief executive himself.”

Read More: Facebook Banned a Hindu Extremist Group—Then Left Most of Its Pages Online for Months

Here’s what to know about the Journal’s ongoing revelations about Facebook.

The XCheck program

Due to an internal program known as “cross check” or “XCheck,” high-profile Facebook users are exempt from some or all of the platform’s rules, according to the debut installment of the “Facebook Files.”

Published Monday, the Journal’s inaugural report examines how the program, originally intended to quality-control actions taken against “elite” users like celebrities, politicians and journalists, has actually allowed these users to avoid moderation. Reportedly, those protected are either “whitelisted,” meaning they aren’t subject to enforcement actions from Facebook, or allowed to post rule-violating content pending subsequent review by Facebook employees. However, according to a 2019 internal document reviewed by the Journal, fewer than 10% of the posts flagged to XCheck were actually reviewed.

The program reportedly protected at least 5.8 million people as of 2020, including former President Donald Trump, Donald Trump Jr., Sen. Elizabeth Warren and Candace Owens, and has allowed misinformation, harassment, calls to violence and revenge porn to remain on the platform.

“We are not actually doing what we say we do publicly,” a 2019 internal review reportedly read. “Unlike the rest of our community, these people can violate our standards without any consequences.”

Read More: Facebook Says It Supports Internet Regulation. Here’s an Ambitious Proposal That Might Actually Make a Difference

Facebook spokesman Andy Stone told the Journal that criticism of XCheck is warranted and that Facebook has been working to address the problems with the program, which is meant to “accurately enforce policies on content that could require more understanding.”

Instagram’s toll on teens’ mental health

Facebook, which owns Instagram, has known for years that the photo-sharing platform is harmful to the mental health of a significant percentage of young users, and particularly teen girls, according to a Journal report published Tuesday.

Citing internal studies that Facebook conducted over the past three years, the report details how the company downplayed its own research on the negative effects, ranging from eating disorders to depression to suicidal thoughts, that Instagram has on the millions of young people who make up nearly half of its user base. One internal Facebook presentation reviewed by the Journal showed that among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the issue to the photo-sharing app.

The report states that Facebook’s own research found that teenage girls are particularly susceptible to the mental toll of Instagram, where highly filtered and edited photos promoting unrealistic and often unattainable body standards run rampant. In a 2019 presentation, researchers said that Instagram makes body image issues worse for one in three teen girls. A March 2020 presentation reviewed by the Journal reported that 32% of teen girls who felt bad about their bodies felt worse when they looked at Instagram.

“Comparisons on Instagram can change how young women view and describe themselves,” one slide read.

Teen boys don’t escape the mental health consequences of Instagram use, either. Facebook’s research reportedly showed that 40% of teen boys experience negative social comparison while using Instagram, and that 14% of boys in the U.S. said Instagram made them feel worse about themselves.

Despite these findings, Facebook never made its research public or available to academics or lawmakers who asked for it, the Journal reported. Researchers also found that some of the features that play a key role in Instagram’s success and addictive nature, like the Explore page, which serves users curated posts based on their interests, are among the most harmful to young people. “Aspects of Instagram exacerbate each other to create a perfect storm,” the research reportedly said.

In response to a request for comment on the Journal’s report, an Instagram spokesperson pointed TIME to a Tuesday blog post stating that while the story “focuses on a limited set of findings and casts them in a negative light,” the company stands by the research as it “demonstrates our commitment to understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues.”

Read More: Facebook’s New Zoom Competitor Adds Virtual Reality to Conference Calls. Here’s What It’s Like

The blog post goes on to say that Instagram’s research mirrors external research showing that “the effects of social media on people’s well-being is mixed” and that “[s]ocial media isn’t inherently good or bad for people. Many find it helpful one day, and problematic the next. What seems to matter most is how people use social media, and their state of mind when they use it.”

The Journal’s report comes in the wake of Facebook reaffirming its intention to continue developing an Instagram for kids under 13, despite pressure from lawmakers to abandon the plan.

When asked about the connection between children’s declining mental health and social media platforms at a May congressional hearing, Facebook CEO Mark Zuckerberg said the research isn’t “conclusive.”

An algorithm that rewards divisive content

When Facebook overhauled its News Feed algorithm in 2018 to promote “meaningful social interactions,” the change instead produced a system that amplifies divisive content on the platform, the Journal reported Wednesday.

While Zuckerberg said the goal of the change was to “strengthen bonds between users and to improve their well-being,” Facebook staffers reportedly warned that the algorithm’s heavy weighting of reshared material was pushing polarizing and false content that drew outrage, and therefore more comments and reactions, to the forefront of users’ News Feeds.

“Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” researchers said in an internal memo reviewed by the Journal.

Zuckerberg also reportedly resisted some proposed fixes to the issue, like eliminating a boost the algorithm gave to content most likely to be reshared by long chains of users, out of concern that it would lead to decreased user engagement.
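
To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of an engagement-weighted feed ranking. The weights, field names and sample posts are invented for illustration rather than taken from Facebook’s actual formula; the point is only to show how counting reshares and comments heavily tends to surface the most reaction-provoking posts, and why lowering the weight given to reshared material was among the proposed fixes.

```python
# Hypothetical illustration only: the weights and sample posts are invented,
# not drawn from Facebook's actual "meaningful social interactions" ranking.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    reshares: int
    comments: int
    reactions: int


# Weighting reshares and comments far above passive reactions means posts
# that provoke replies and resharing outrank calmer content.
WEIGHTS = {"reshares": 30, "comments": 15, "reactions": 5}


def engagement_score(post: Post) -> int:
    return (WEIGHTS["reshares"] * post.reshares
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["reactions"] * post.reactions)


feed = [
    Post("A", "Calm, accurate local update", reshares=2, comments=5, reactions=40),
    Post("B", "Inflammatory, misleading claim", reshares=25, comments=60, reactions=30),
]

# Sorting by engagement score puts the divisive post first; cutting the
# reshare weight is the kind of change staffers reportedly proposed.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(engagement_score(post), post.author, post.text)
```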

In response, Lars Backstrom, a Facebook vice president of engineering, told the Journal that any algorithm can promote harmful content and that the company has an “integrity team” working to mitigate these issues. “Like any optimization, there’s going to be some ways that it gets exploited or taken advantage of,” he said.
