Meta, the parent company of Facebook and Instagram, has announced a partnership with the Center for Open Science, a non-profit organization focused on increasing transparency in academic research, according to TechCrunch. The collaboration comes just before a significant Congressional hearing addressing concerns about children's online safety.
As part of the pilot program, Meta has committed to sharing “privacy-preserving social media data” with selected academic researchers. This initiative aims to deepen the understanding of how social media usage may affect users’ well-being, particularly among younger demographics. Curtiss Cobb, Meta’s VP of Research, emphasized the company’s dedication to contributing to scientific knowledge while ensuring user privacy.
The announcement is timely, considering the increasing urgency in Congress regarding the impact of social media on mental health. In November, Meta expanded access to its Meta Content Library, a transparency tool that allows researchers to analyze public data like posts, comments, and reactions more effectively.
This week, Meta founder and CEO Mark Zuckerberg is scheduled to testify before Congress, alongside executives from other major social platforms, on the topic of children’s online safety. In anticipation of this, Meta has introduced new messaging restrictions on Facebook and Instagram, designed to protect users under 16 from unsolicited adult interactions. The company has also implemented controls to limit exposure to harmful content related to self-harm, suicide, and eating disorders.
Escalating scrutiny and industry-wide safety initiatives
Meta’s latest initiatives arrive amid heightened scrutiny over its historical approach to child safety on its platforms. Unredacted documents from an ongoing lawsuit have revealed concerns about Meta’s commitment to protecting young users.
In response to the growing focus on online safety, other social platforms are also ramping up their efforts. X, for instance, is expanding its "Trust and Safety" team and engaging in discussions with Congress about how it handles child sexual exploitation. The company recently faced a significant moderation challenge when nonconsensual, pornographic deepfake images of Taylor Swift spread on the platform, prompting public outcry and calls from the White House for legislative action.