Another Meta whistleblower steps up

For years, Meta's Instagram platform has exposed its youngest and most vulnerable users to sexual exploitation, harassment, pro-eating disorder content, and a host of other harms. In 2021, whistleblower Frances Haugen revealed that Meta executives were fully aware that their platform design was making teens feel worse about themselves and cultivating body image issues among teen girls. Faced with that disturbing evidence, Meta executives didn't act on it; they ignored it.

Now, two years later, another Meta whistleblower has spoken out, and his revelations paint an even more damning picture of the company. According to former Meta engineer Arturo Béjar, Meta executives, including Mark Zuckerberg, know exactly how to make their platforms safer for kids, but they are choosing not to. In his November 7 testimony before Congress, Béjar described how Meta's Well-Being team collected and presented top executives with research and data showing that the company's rules, automated systems, and metrics for determining children's safety were simply inadequate and failing to keep kids safe. He also detailed the design solutions the team proposed to fix these problems.

Béjar testified not only as a whistleblower, but also as a concerned father. His own teenage daughter experienced unwanted sexual advances from users on Instagram, an experience he described as deeply disturbing for both parent and child. Most shockingly, Béjar revealed that Adam Mosseri, the head of Instagram himself, acknowledged and agreed with the child safety issues the data pointed to. And yet, Meta ultimately resolved none of these problems and continued to allow these online harms to persist. Meta is blatantly ignoring the evidence of how its platform design is harming kids.
Meta and other Big Tech companies have been playing by their own rules for far too long, and kids like Mr. Béjar's daughter are getting hurt in the process. This is exactly why the Kids Online Safety Act (KOSA) must be passed to hold Big Tech accountable. We need a new rulebook for social media companies, one that creates a duty of care for the young people on their platforms and significantly mitigates these online harms. It's time for Congress to act in the best interest of children and teens by passing KOSA!
Shocking revelations from Meta lawsuit

Last month, 42 attorneys general sued Meta, alleging that Facebook and Instagram's design and business practices harm young people. The Massachusetts Attorney General's complaint says Meta leadership repeatedly declined to make changes and implement safety features that would protect kids and teens on Instagram. For example, internal research demonstrated that "plastic surgery" filters and like counts on posts harm young users. Yet when staff asked Meta leadership to change these features to protect youth, Mark Zuckerberg denied those requests, the complaint says. It also highlights that Meta counts on young Instagram users to spend hours on the app every day. Massachusetts' allegations confirm what we already know: when it comes to kids and teens, Meta will consistently put profits over wellbeing.
Inspire young leaders

Are you a professional working to improve the digital landscape? Consider applying to mentor a young digital wellness leader! NextGen Connect is a 12-week program designed to harness the expertise of Fairplay's Action Network members and partners to support the digital wellness projects of the young adults who have experienced digital stresses the most. As a mentor, you'll gain first-hand understanding of, and influence on, the next generation's ideas about tech accountability, advocacy, and screen time education; join a peer group of other professionals; and gain exposure to other thought leaders in the tech accountability space. Youth applications are in, and now we need experienced mentors like you to join us. Email nextgen@fairplayforkids.org for more information.
Avoid these toys!

Last week, PIRG Education Fund released its 38th annual Trouble in Toyland report, which warns consumers about toys and smart devices marketed toward children. The report prominently features the Meta Quest 3 headset, which is now rated "safe" for ten-year-olds despite a lack of research supporting the change. Fairplay's Rachel Franz contributed to the report, detailing her disturbing experiences with the VR headset. Rachel and our Campaign Director, David Monahan, each spoke at separate events about the report on Thursday, 11/16.
Our latest and greatest

Fairplay's FY23 Annual Report is here! Check out the many highlights of our past year, including the progress we've made on our campaign to pass the Kids Online Safety Act and the work of the courageous survivor parents in our movement to hold Big Tech accountable for putting profits ahead of kids. The report also showcases the latest updates from our Screen Time Action Network and how our work led to the FTC laying down the law with tech giant Amazon.
Screen-Free Week

Each year, many friends of Fairplay celebrate Screen-Free Week as a chance to unplug and reconnect! But some people have found that the event's dates or style didn't work for them. This year, we've made some changes that let you celebrate Screen-Free Week your way. In 2024, we're encouraging people around the globe to celebrate when they want, how they want... they can even change the name! Read more in our blog here.
Contact Us

Fairplay
89 South St., Suite 403
Boston, Massachusetts 02111
617-896-9368
info@fairplayforkids.org