Arturo Bejar, a former employee of Meta, testified before a US Senate subcommittee, shedding light on the social media giant’s awareness of harassment and other challenges faced by teens on its platforms.
Bejar worked on well-being for Instagram from 2019 to 2021 and earlier served as director of engineering for Facebook’s Protect and Care team from 2009 to 2015. He made the revelations during a hearing on social media’s impact on teen mental health.
In his testimony, Bejar emphasized the need for transparency and tools that empower young users to report and combat online abuse. He stated, “It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse.”
Bejar’s disclosures coincide with a bipartisan effort in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.
During his tenure at Meta, Bejar’s role focused on shaping the design of Facebook and Instagram to encourage positive behaviors among users and provide resources for young individuals to manage unpleasant online experiences.
Meta responded to Bejar’s testimony with a commitment to safeguarding young people online. The company highlighted its support for user surveys, similar to the ones referenced by Bejar, and the creation of tools such as anonymous notifications to report potentially hurtful content. The statement from Meta emphasized ongoing efforts to ensure the safety of young users on their platforms.
Bejar revealed that he had regular meetings with senior executives at Meta, including Chief Executive Mark Zuckerberg, and initially believed that they were supportive of the work. However, he later concluded that the company’s leadership had consistently chosen not to address the issue of teen harm online.
In a 2021 email sent to Zuckerberg and other top executives, Bejar presented internal data indicating that 51% of Instagram users had reported a negative experience on the platform within the previous seven days. Notably, 24.4% of users aged 13-15 reported receiving unwanted sexual advances, and a further data point showed that 13% of all 13-to-15-year-old Instagram users surveyed had experienced unwanted advances.
Bejar also recounted a personal experience: his 16-year-old daughter had encountered misogynistic comments and inappropriate content on Instagram, yet lacked adequate tools to report the incidents to the company. The email’s existence was first reported by The Wall Street Journal.
During his testimony, Bejar described his disappointment on learning that Meta’s Chief Product Officer, Chris Cox, had detailed statistics on teen harms at his fingertips: it suggested the company was aware of the issues yet failed to take substantial action.
Bejar’s revelations have fueled discussion among senators backing the Kids Online Safety Act, who say the testimony indicates that Meta executives may have overlooked the harm inflicted on young users of the company’s platforms. The hearing underscores the pressing need for accountability and for measures to protect teens online.