A day after The Wall Street Journal published a blockbuster story about Meta’s own dismal findings on teenage girls’ mental health on Instagram, CEO Mark Zuckerberg wondered whether Meta should change the way it studies the potential harms of its platforms.
“Recent events have made me wonder if we should change our approach to research and analysis of social issues,” Zuckerberg wrote in a Sept. 15, 2021, email to top executives, including then-COO Sheryl Sandberg and global affairs chief Nick Clegg. The day before, the Journal had published a story based on documents obtained from a whistleblower, later revealed to be Frances Haugen, which showed that the company’s own research found that “32 percent of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse.” The subject line of Zuckerberg’s email read: “Research and Analysis of Social Issues – Privileged and Confidential.”
The 2021 email was unsealed Thursday as part of a case brought by New Mexico Attorney General Raúl Torrez alleging that Meta falsely marketed its products as safe for teens while it was aware of harmful designs that, the state says, addicted children and allowed child predators to thrive. The AG’s office argued in the complaint that disclosing the harms Meta identified on its platforms “would remedy the misleading and deceptive nature of its public statements that its platforms are ‘secure.’” Meta spokesman Andy Stone told The Verge in a statement that the company “is proud of our ongoing commitment to transparent, industry-leading research. As we have been for years, we continue to use these insights to make meaningful improvements, such as introducing teen accounts with built-in protections and giving parents tools to manage their teen’s experience.”
The email is just one example of the kind of internal conversations that are expected to come to light during this lawsuit and in a number of cases with similar claims in California. Opening statements in the New Mexico case are expected to begin next week.
In the email, Zuckerberg wrote that Meta’s peers appeared to have sidestepped public criticism on social issues by doing far less proactive research into the harms of their platforms. “Apple, for example, doesn’t seem to be studying any of this,” he wrote. “As I understand it, they don’t have anyone doing ranking or content moderation, and they don’t even have a feed in iMessage. They took the approach that people are responsible for what they do on the platform, and by not taking that responsibility, Apple didn’t create a team or a plethora of studies to look at the trade-offs of their approach. It’s worked surprisingly well for them.”
“When Apple tried to do something with CSAM, they were heavily criticized for it”
While Apple appeared to avoid criticism, Zuckerberg wrote, Meta instead “faced more criticism” for reporting more child sexual abuse material (CSAM), which made it “seem like there’s more of that behavior on our platforms.” On the other hand, he noted, “When Apple tried to do something with CSAM, they were heavily criticized for it, which may encourage them to double down on their original approach.” Zuckerberg may have been referring to Apple’s announcement earlier that year of new features designed to protect children, including scanning users’ iCloud photos for CSAM. Privacy advocates worried the move would create a giant backdoor for surveillance of user accounts, and Apple later shelved the proposals. Apple did not immediately respond to an email request for comment.
Apple and Meta have long clashed, both publicly and privately, over differing approaches to policy issues such as privacy and age verification. But Zuckerberg made similar observations about Meta’s other peers. “YouTube, Twitter and Snap take a similar approach, to a lesser extent,” he wrote. “YouTube seems to be deliberately burying its head in the sand to stay under the radar and out of the limelight. Twitter and Snap may not have the resources to do this kind of research.” Several of those platforms have publicly shared platform safety research and initiatives over the years, including YouTube’s Youth and Families Advisory Committee of independent experts, which guides teen well-being on the platform, and Snap’s Digital Well-Being Index, launched in 2022.
“I think we should be commended for the work we do to study, understand and improve social issues on our platforms”
Zuckerberg seemed to believe that the public reaction to Meta’s internal research was unfair. “I think we should be commended for the work we do to study, understand and improve social issues on our platforms,” he wrote. “Unfortunately, the media is more likely to use any research or recommendation to say we’re not doing all we can (implying this is for craven reasons) than to credit us for taking these problems more seriously than anyone else in our field by studying them and finding solutions, not all of which are reasonable to implement because everything has trade-offs.”
In response to the email, at least a few senior executives supported continuing some level of research on social issues, despite the risks to public perception. “Leaks suck and will continue to happen unless we find a way to eradicate them,” wrote then-VP of Central Products Javier Olivan. “That being said — is there still value in trying to understand these issues? I think it’s the responsible thing to do / I’d like to see us continue to try to understand how we can make our products better for everyone, but maybe we should scope it to those areas where we at least see some clear degree of correlation between the use of our products and a particular problem.” Then-VP of Choice and Competition David Ginsberg said that “after a lot of wrestling with this myself over the last few days” he largely agreed with Olivan. “I think internal work is important to deliver a good product and a good user experience – separate and apart from any societal goals.”
A few days later, Guy Rosen, then Meta’s VP of Integrity, shared several potential options for changing the company’s internal and external research organization, including the pros and cons of each. Rosen wrote that it was only a “preliminary/discretionary exercise” to understand the “spectrum of possibilities.” The options ranged from centralizing teams that research highly sensitive topics, in an effort to better control access to materials, to the most extreme option of disbanding those teams and outsourcing the work as needed. Ultimately, executives recommended the least extreme option of centralizing research teams and planned to announce it shortly after Instagram head Adam Mosseri’s upcoming congressional testimony. Mosseri, newly added to the email thread, noted that “Reporting this after my testimony (sic) is worse than before and we talked about (it). It will leak and make it look like I’m hiding something.” Meta ultimately announced the changes ahead of Mosseri’s testimony, saying it would continue to study sensitive topics such as teen well-being.
In his opening email, Zuckerberg lamented that leaks of internal documents were making that work more difficult. “That may be part of why the rest of the industry has taken a different approach to these issues,” he wrote.