Our Work to Help Provide Young People with Safe, Positive Experiences

Today, Meta CEO Mark Zuckerberg testified before the U.S. Senate Judiciary Committee alongside industry peers. The hearing focused on one of the technology industry’s most important challenges: keeping children safe online. Meta has spent more than a decade working on these issues and has developed more than 30 tools, features, and resources to support teens and their parents. We have around 40,000 people overall working on safety and security, and we have invested over $20 billion since 2016. This includes around $5 billion in the last year alone.

Child exploitation is a horrific crime, and online predators are determined criminals. We’ll continue to work diligently to fight this abhorrent behavior both on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it. In the written testimony below, submitted to the Committee, Mark provided an overview of Meta’s longstanding investment not only in helping keep young people safe on its services, but also in developing and sharing technology to help protect teens across the many apps and websites they use.

Mark also reaffirmed Meta’s support for federal legislation that protects teens and empowers parents online. Specifically, we support federal legislation requiring app stores to get parents’ approval whenever teens under 16 download apps. This way, parents can oversee and approve their teens’ online activity in one place, and can help ensure their teens are not accessing adult content or apps, or any apps they simply don’t want their teens to use.

In today’s hearing, Mark said,

“I don’t think that parents should have to upload an ID or prove that they’re a parent in every single app that their child uses. I think the right place to do this, and a place where it would be very easy to do this, would be in the App store itself. My understanding is that Apple and Google, or at least Apple, requires parental consent when a child [makes] a payment in the app, so it should be trivial to pass a law that requires them to make it so that parents have control any time a child downloads an app (…) The research we’ve done shows that the vast majority of parents want that, and that’s the type of legislation (…) that would make it a lot easier for parents.”

Mark also empathized with the families in attendance, saying,

“I’m sorry for everything you have all been through. No one should go through the things that your families have suffered, and this is why we invest so much and we are going to continue doing industry leading efforts to make sure no one has to go through the things your families have had to suffer.”

These are complex issues, but we’re optimistic we can continue to collaborate with lawmakers and our industry peers to help create safe, positive experiences for teens online. This work is never done, but it always has been — and will remain — our priority. 

HEARING BEFORE THE UNITED STATES SENATE
COMMITTEE ON THE JUDICIARY

January 31, 2024

Testimony of Mark Zuckerberg
Founder and Chief Executive Officer, Meta

I.  Introduction

Chairman Durbin, Ranking Member Graham, and members of the Committee: 

Every day, teenagers and young people go online to stay connected to their friends and family, find community, and get support. Teens do amazing things on our services. They use our apps to feel more connected, informed, and entertained, as well as to express themselves, create things, and explore their interests. Overall, teens tell us this is a positive part of their lives. But some still face challenges online, and we work hard to provide support and controls to reduce potential harms. 

Being a parent is one of the hardest jobs in the world. Technology gives us new ways to communicate with our kids and feel connected to their lives, but it can make parenting more complicated, too. It’s important to me that our services are positive for everyone who uses them. We’re focused on building controls to help parents navigate the reality of raising kids today, including tools that enable them to be more involved in their kids’ decisions. 

We want teens to have safe, age-appropriate experiences on our apps, and we want to help parents manage those experiences. That’s why in the last 8 years we’ve introduced more than 30 different tools, resources, and features to help parents and teens. These include controls that let parents set limits on when and for how long their teen can use our services, see who they’re following, and know if they’ve reported anyone who might be bullying them. For teens, these tools include nudges that remind them when they’ve been using Instagram for a while or when it’s late and they might want to go to sleep, and the ability to hide words, topics, or people from their experience without those people finding out. 

With so much of our kids’ lives spent on mobile devices and social media, it’s important to ask questions and think about the effects on teens—especially on mental health and well-being. This is a critical issue, and we take it seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent report from the National Academies of Sciences evaluated results from more than 300 studies and determined that the research “did not support the conclusion that social media causes changes in adolescent mental health at the population level.” It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore, and connect with others. We’ll continue to monitor research in this area and remain vigilant against any emerging risks. 

Keeping young people safe online has been a challenge since the start of the internet. As threats from criminals evolve, we have to evolve our defenses. We work closely with law enforcement to find and stop bad actors. Still, no matter how much we invest or how effective our tools are, this is an adversarial space. There is always more to learn and more improvements to make. We remain ready to work with members of this Committee, the industry, and parents to strengthen our services and make the internet safer for everyone. 

I’m proud of the work our teams have done to improve online child safety, not just on our services but across the entire internet. We have around 40,000 people overall working on safety and security, and we have invested over $20 billion since 2016. This includes around $5 billion in the last year alone. We’ve built and shared tools for removing bad content across the internet, and we look at a wide range of signals to detect problematic behavior. We go beyond legal requirements and use sophisticated technology to proactively seek out abusive material, and as a result, we find and report more inappropriate content than anyone else in the industry. As the National Center for Missing and Exploited Children (NCMEC) put it just this week, Meta goes “above and beyond to make sure that there are no portions of their network where this type of activity occurs.” 

I hope we can have a substantive discussion that drives improvements across the industry, including new legislation that delivers what parents say they want most: a clear system for age verification and parental control over what apps their kids are using. For example, 3 out of 4 parents favor introducing app store age verification, and 4 out of 5 parents want legislation requiring app stores to get parental approval whenever teens download apps. We support this. Parents of teens under 16 should have the final say on what apps are appropriate for their children. This approach would leverage the parental approval system for purchases that app stores already provide today, so there’d be no need for parents and teens to share a government ID or other personal information with every one of the thousands of apps out there. We’re also in favor of setting industry standards on age-appropriate content, and of limiting the signals used for advertising to teens to age and location, not behavior. We’re ready to work with any member of this Committee who wants to discuss legislation in these areas, and with any of our peers across the industry, to help move this forward. 

II.  Our Work 

Teen well-being and child safety are extremely important to us. We have many teams dedicated to these issues, and we lead the industry in a lot of the areas we’re here to discuss. 

We’ve built more than 30 tools, resources, and features to help protect teens and give parents oversight and control over how teens are using our services, including: 

  • Parental supervision tools, which let teens or their parents set daily limits for the total time that teens can spend on Instagram, Facebook, Messenger, Quest, and Horizon. Teens and parents can also set scheduled breaks that block access during specific hours of the day, such as during school or dinner time. So far, over 90% of U.S. teens are still using daily limits 30 days after initial adoption. (A simplified sketch of how such limits and breaks could be enforced follows this list.)
  • Take A Break notifications, which show full-screen reminders to leave the Instagram app. 
  • Quiet Mode prompts, which encourage teens to turn on Quiet Mode when they’ve been on the app for a specific amount of time at night. Quiet Mode turns off notifications and automatically replies to messages. 
  • Nudges, which include alerts that notify teens that it might be time to look at something different if they’ve been scrolling on the same topic for a while, or that it’s getting late and might be time to close the app for the night.
  • Age verification technology on Instagram to confirm a teen’s age when they change their birthday from under 18 to over 18. 
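
As a rough illustration of the kind of logic behind these limits, here is a minimal sketch of a client-side check that combines a parent-set daily limit with scheduled break windows. The names, defaults, and thresholds are illustrative assumptions, not Meta’s actual implementation.

```python
from dataclasses import dataclass, field
from datetime import time


@dataclass
class SupervisionSettings:
    """Hypothetical parent-set limits; names and defaults are illustrative."""
    daily_limit_minutes: int = 60
    # Scheduled breaks as (start, end) local times, e.g. school or dinner.
    scheduled_breaks: list[tuple[time, time]] = field(default_factory=list)


def is_access_blocked(settings: SupervisionSettings,
                      minutes_used_today: int,
                      now: time) -> bool:
    """Return True if the teen's session should be blocked right now."""
    # Block once today's cumulative usage reaches the daily limit.
    if minutes_used_today >= settings.daily_limit_minutes:
        return True
    # Block during any parent-scheduled break window.
    for start, end in settings.scheduled_breaks:
        if start <= now < end:
            return True
    return False


# Example: a 60-minute daily limit plus a break during school hours.
settings = SupervisionSettings(
    daily_limit_minutes=60,
    scheduled_breaks=[(time(8, 30), time(15, 0))],
)
print(is_access_blocked(settings, minutes_used_today=45, now=time(12, 0)))  # True: school break
print(is_access_blocked(settings, minutes_used_today=45, now=time(17, 0)))  # False
print(is_access_blocked(settings, minutes_used_today=60, now=time(17, 0)))  # True: limit reached
```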

We also provide special protection for teen accounts: 

  • Accounts for people under 16 (or under 18 in certain countries) are defaulted to private, so teens can control who sees or responds to their content. 
  • Teens are defaulted into the most restrictive content and recommendation settings, making it more difficult to come across potentially sensitive content or accounts. Both globally and in the U.S., 99% of teens who are defaulted into these settings are still using them a year later.
  • We recently announced additional steps to help protect teens from unwanted contact, turning off their ability to receive DMs from anyone they don’t follow or aren’t connected to on Instagram—including other teens—by default.
  • We prompt teens to review and restrict their privacy settings. 
  • We offer the option to hide like counts, so people don’t have to show like counts on their own posts or see them on other people’s posts.

In addition to these teen-specific protections, we hide search results for terms related to suicide, self-harm, and eating disorders for everyone on Instagram, instead offering access to expert resources. 
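
As a rough illustration of this kind of search intervention, the sketch below checks a query against a small blocklist of sensitive terms and returns expert resources instead of ordinary results. The term list, resource text, and function names are illustrative assumptions, not the actual system.

```python
# Hypothetical sketch of a search intervention: queries matching sensitive
# terms return expert resources instead of ordinary search results.
SENSITIVE_TERMS = {"suicide", "self-harm", "eating disorder"}  # illustrative, not the real list

EXPERT_RESOURCES = [
    "If you or someone you know is struggling, help is available.",
    "Contact a local helpline or a trusted organization for support.",
]


def search(query: str, run_normal_search) -> list[str]:
    """Hide results for sensitive queries; offer expert resources instead."""
    normalized = query.lower()
    if any(term in normalized for term in SENSITIVE_TERMS):
        return EXPERT_RESOURCES
    return run_normal_search(query)


# Example usage with a stand-in search backend.
results = search("self-harm support", run_normal_search=lambda q: [f"results for {q!r}"])
print(results)  # Expert resources, not ordinary results
```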

Parents and guardians know what’s best for their teens, so we also make it easy for them to be involved in their teens’ online experiences with supervision tools and expert-backed resources: 

  • Parents can decide when, and for how long, their teens use Instagram, see who their teens are following, and be notified when their teens block someone or report something. 
  • On Facebook, parents can see insights like time spent, schedule breaks for their teens, and access expert resources on managing their teens’ time online.
  • Over 90% of guardians and teens in the U.S. who choose supervision experiences on Facebook or Instagram are still using them 30 days after initial adoption.
  • We’ve implemented similar parental supervision tools across our apps. 

We’ve built tools and policies specifically to help young people manage interactions with adults:

  • As noted above, we turn off teens’ ability to receive messages from anyone they don’t follow or aren’t connected to on Instagram by default. If a teen is already connected with a potentially suspicious adult, we send the teen a safety notice.
  • We restrict adults over the age of 19 from messaging teens who don’t follow them, and we limit anyone messaging a person who doesn’t follow them to a single text-only message.
  • We use prompts or safety notices to encourage teens to be cautious in conversations with adults they’re already connected to, and give them an option to end the conversation, or to block, report, or restrict the adult. 
  • We’ve made it easier to report content with a new dedicated option to prioritize a report if it “involves a child” on Facebook and Instagram.

We build technology specifically to help tackle some of the most serious online risks, and we share it to help our whole industry get better: 

  • We built the technology behind Project Lantern, the only program that allows apps to share data about people who break child safety rules. 
  • We were a founding member of Take It Down, the service that enables young people to prevent their nude images from being spread online. This is an important tool that a teen can use to protect against the threat of sextortion. (A simplified sketch of the hash matching that underpins services like this follows this list.)
  • In 2020, we joined Google, Microsoft, and 15 other member companies of the Technology Coalition to launch Project Protect, a plan to combat online child sexual abuse. 
  • We work closely with safety advisors and professionals, as well as leading online safety nonprofits and NGOs to combat child sexual exploitation and aid its victims. 
  • We’ve partnered with child-safety organizations and academic researchers to complete child-safety research that has helped move the industry forward. For example, we recently partnered with the Center for Open Science on a pilot program to share privacy-preserving social media data with academic researchers to study well-being. 

We also work to find, remove, and report child sexual abuse material and disrupt the networks of criminals behind it: 

  • We developed technology that identifies potentially suspicious adults, reviewing over 60 signals to proactively find and restrict potential predators. We deploy machine learning to proactively detect accounts engaged in certain suspicious patterns of behavior by analyzing dozens of combinations of metadata and public signals, such as a teen blocking or reporting an adult. (A simplified sketch of this kind of signal-based scoring follows this list.)
  • When we identify these accounts, we limit their ability to find, follow, or interact with teens or each other, and we automatically remove them if they exhibit a number of these signals. 
  • As required by law, we report all apparent instances of child exploitation identified on our site from anywhere in the world to NCMEC, which coordinates with law enforcement authorities from around the world.
  • We respond to valid law enforcement requests for information with data, including email addresses and phone numbers, and traffic data, like IP addresses, that can be used in criminal investigations. We provide operational guidelines to law enforcement who seek records from Facebook or Instagram.
  • Between 2020 and 2023, our teams disrupted 37 abusive networks and removed nearly 200,000 accounts associated with those networks.
  • In Q3 2023, we removed 16.9 million pieces of child sexual exploitation content on Facebook and 1.6 million pieces on Instagram. 
  • In Q3 2023, of the child sexual exploitation content we actioned, we detected 99% on Facebook and 96% on Instagram before it was reported by our users. 
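
To make the signal-based approach above concrete, here is a minimal, hypothetical sketch of scoring an account against weighted behavioral signals and restricting it past a threshold. The specific signals, weights, and cutoff are invented for illustration; the testimony says only that the real system combines more than 60 signals with machine learning.

```python
# Hypothetical sketch of signal-based scoring for potentially suspicious
# adult accounts. Signals, weights, and the threshold are illustrative.

ILLUSTRATIVE_SIGNAL_WEIGHTS = {
    "blocked_by_teen": 3.0,                  # a teen blocked this account
    "reported_by_teen": 4.0,                 # a teen reported this account
    "mass_follow_requests_to_teens": 2.0,
    "repeatedly_searches_teen_accounts": 1.5,
}

RESTRICT_THRESHOLD = 5.0  # illustrative cutoff


def risk_score(observed_signals: dict[str, int]) -> float:
    """Combine observed signal counts into a single weighted score."""
    return sum(
        ILLUSTRATIVE_SIGNAL_WEIGHTS.get(name, 0.0) * count
        for name, count in observed_signals.items()
    )


def should_restrict(observed_signals: dict[str, int]) -> bool:
    """Restrict the account's ability to find or interact with teens."""
    return risk_score(observed_signals) >= RESTRICT_THRESHOLD


# Example: an account blocked by one teen and reported by another
# crosses the threshold and would be restricted.
signals = {"blocked_by_teen": 1, "reported_by_teen": 1}
print(risk_score(signals))        # 7.0
print(should_restrict(signals))   # True
```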

III.  Our Commitment

We want everyone who uses our services to have safe, positive, and age-appropriate experiences, and we approach all our work on child safety and teen mental health with this in mind. We build comprehensive controls into our services, we work with parents, experts, and teens to get their input, and we engage with Congress about what else needs to be done. 

We’re committed to protecting young people from abuse on our services, but this is an ongoing challenge. As we improve defenses in one area, criminals shift their tactics, and we have to come up with new responses. We’ll continue working with parents, experts, industry peers, and Congress to try to improve child safety, not just on our services, but across the internet as a whole. 

That goes for our work on youth well-being and mental health, too. We’ll continue to study this ourselves, monitor external studies, and open up our data for academic researchers, and we’ll keep working on additional tools and resources that give parents and teens more control over their experiences online. I look forward to discussing these important issues with you today.
