Is your testimony truthful and honest and complete? Let me ask you this: Joe Biden last year said that Xi Jinping was a dictator. Do you agree with Joe Biden? Is Xi Jinping a dictator?
He was a 16-year-old. After a breakup in 2022, he went on your platform and searched for things like inspirational quotes and positive affirmations. Instead, he was served up numerous videos glamorizing suicide, until he killed himself by gun.
What about the name Chase Nasca? Does that ring a bell?
Would you mind giving me more details, please?
He was a 16-year-old who saw more than a thousand videos on your platform about violence and suicide, until he took his own life by stepping in front of a train.
Christina Kefara was a paid advisor to ByteDance, your Communist-influenced parent company. She was then hired by the FTC to advise on how to sue Mr. Zuckerberg's company.
Senator, ByteDance is a global company and not a Chinese Communist Party company.
Public reports indicate that your lobbyists visited the White House more than 40 times in 2022. How many times did your company's lobbyists visit the White House last year?
I don't know that.
Are you aware that the Biden campaign and the Democratic National Committee are on your platform? They have TikTok accounts?
Senator, we encourage people to come on and create. Which, by the way, they won't let their staffers do on their personal phones; they give them separate phones that they use only for TikTok. We encourage everyone to join, including yourself, Senator.
So all these companies are being sued by the FTC. You're not.
The FTC has a former paid advisor to your parent company talking about how they can sue Mr. Zuckerberg's company. Joe Biden's reelection campaign and the Democratic National Committee are on your platform.
Let me ask you, have you or anyone else at TikTok communicated with or coordinated with the Biden administration, the Biden campaign, or the Democratic National Committee to influence the flow of information on your platform?
We work with anyone, any creators who want to use our platform. It's all the same process that we have.
So what we have here is a company that is a tool of the Chinese Communist Party, that is poisoning the minds of America's children, in some cases driving them to suicide, and that at best the Biden administration is taking a pass on, and at worst may be in collaboration with.
So we're going to take a break now. There is a second roll call vote that members can take advantage of if they wish. The break will last about 10 minutes. Please do your best to return.
[Recess. Inaudible crosstalk.]
The Judiciary Committee will resume. We have nine senators who have not asked questions yet, in seven-minute rounds, and we'll turn first to Senator Padilla.
Thank you, Mr. Chair. Colleagues, as we reconvene, I'm proud once again to share that I am one of the few senators with younger children. And I lead with that because, as we are having this conversation today, it's not lost on me that between my children, who are all now in the teen and preteen categories, and their friends, I see this issue very up close and personal. And in that spirit, I want to take a second to just acknowledge and thank all the parents who are in the audience today, many of whom have shared their stories with our offices. I credit them for finding strength through their suffering, through their struggle, and channeling that into the advocacy that is making a difference. I thank all of you.
Now I appreciate, again, personally, the challenges that parents and caretakers, school personnel, and others face in helping our young people navigate this world of social media and technology in general. The services our children are growing up with provide them unrivaled access to information, beyond what previous generations have experienced. That includes learning opportunities, socialization, and much, much more. But we also clearly have a lot of work to do to better protect our children from the predators and predatory behavior that these technologies have enabled. And yes, Mr. Zuckerberg, that includes exacerbating the mental health crisis in America. Nearly all teens, we know, have access to smartphones and the internet and use the internet daily. And while guardians do have primary responsibility for caring for our children, the old adage says it takes a village. And so society as a whole, including leaders in the tech industry, must prioritize the health and safety of our children.
Now, I'll dive into my questions and be specific, platform by platform, witness by witness, on the topic of some of the parental tools you have each made reference to. Mr. Citron, how many minors are on Discord, and how many of them have caretakers that have adopted your Family Center tool? And if you don't have the numbers, just say that quickly and provide them to our office; we can follow up with you on that. How have you ensured that young people and their guardians are aware of the tools that you offer?
We make it very clear to teens on our platform what tools are available. That sounds very vague. What specifically do you do? What may be clear to you is not clear to the general public. So what do you do, in your opinion, to make it very clear? So our Teen Safety Assist, which is a feature that helps teens keep themselves safe, in addition to blocking and blurring images that may be sent to them, is on by default for teen accounts, and it cannot be turned off. We also market to our teen users directly on our platform. When we launched our Family Center, we created a promotional video and put it directly on our product, so when every teen opened the app, in fact when every user opened the app, they got an alert: hey, Discord has this, and they want you to use it. Thank you; I look forward to the data that we're requesting. Mr. Zuckerberg.
Across all of Meta's services, from Instagram, Facebook, Messenger, and Horizon, how many minors use your applications, and of those minors, how many have a caretaker that has adopted the parental supervision tools that you offer? Sorry, I can follow up with the specific stats on that. Okay, it would be very helpful not just for us to know, but for you to know as the leader of your company. And, same question, how are you ensuring that young people and their guardians are aware of the tools that you offer? We run pretty extensive ad campaigns, both on our platforms and outside. We work with creators and organizations like the Girl Scouts to make sure that there's broad awareness of the tools.
Okay, Mr. Spiegel, how many minors use Snapchat, and of those minors, how many have caretakers that are registered with your Family Center? Senator, I believe in the United States there are approximately 20 million teenage users of Snapchat. I believe approximately 200,000 parents use Family Center, and about 400,000 teens have linked their account to their parents using Family Center. So 200,000 and 400,000 sound like big numbers, but that's a small percentage of the minors using Snapchat. What are your intentions to ensure that young people and their guardians are aware of the tools you offer? Senator, we create a banner for Family Center on the user's profile, so that accounts we believe may be at the age that they could be parents can see the entry point into Family Center easily.
Mr. Chew, how many minors are on TikTok, and how many of them have a caregiver that uses your family tools? Senator, I need to get back to you on the specific numbers, but we were one of the first platforms to give what we call Family Pairing to parents. You go to settings, you turn on the QR code, your teenager's QR code and yours, and you scan it. And what it allows you to do is set screen time limits, filter out some keywords, and turn on the more restricted mode. And we're always talking to parents. I met a group of parents, teenagers, and high school teachers last week to talk about what more we can provide in the Family Pairing mode.
Ms. Yaccarino, how many minors use X, and are you planning to implement safety measures or guidance for caretakers like your peer companies have? Thank you, Senator. Less than 1% of all U.S. users are between the ages of 13 and 17. Less than 1% of how many? Of 90 million U.S. users. Okay, so still hundreds of thousands. Continue. Yes, and every single one is very important. Being a 14-month-old company, we have re-prioritized child protection and safety measures, and we have just begun to discuss how we can enhance those with parental controls.
Let me continue with a follow-up question for Mr. Citron. In addition to keeping parents informed about the nature of various internet services, there's a lot more we always need to do. For today's purposes, while many companies offer a broad range of user empowerment tools, it's helpful to understand whether young people even find these tools helpful. So, I appreciate you sharing your Teen Safety Assist tools and how you're advertising them, but have you conducted any assessments of how these features are impacting minors' use of your platform? Our intention is to give teens tools and capabilities that they can use to keep themselves safe, and also so our teams can help keep teens safe. We launched Teen Safety Assist just last year, and I do not have a study off the top of my head, but we'd be happy to follow up with you on that.
Okay, my time is up. I'll have follow-up questions for each of you, either in the second round or through statements for the record, on some assessment of the tools that you've proposed. Thank you, Mr. Chair. Thank you, Senator Padilla. Senator Kennedy. Thank you all for being here. Mr. Spiegel, I see you hiding down there. What does yada yada yada mean? I'm not familiar with the term, Senator. Very uncool. Can we agree that what you do, not what you say, what you do is what you believe, and everything else is just cottage cheese? Yes, Senator. You agree with that? Speak up. Don't be shy. I've listened to you today. I've heard a lot of yada yada yada. And I've heard you talk about the reforms you've made, and I appreciate them. And I've heard you talk about the reforms you're going to make. But I don't think you're going to solve the problem. I think Congress is going to have to help you. I think the reforms you're talking about, to some extent, are going to be like putting paint on rotten wood. And I'm not sure you're going to support this legislation. I'm not. The fact is that you and some of your internet colleagues who are not here are no longer companies; you're countries. You're very, very powerful. And you and some of your colleagues who are not here have blocked everything we have tried to do in terms of reasonable regulation, everything from privacy to child exploitation. In fact, we have a new definition of recession: we know we're in a recession when Google has to lay off 25 members of Congress. That's what we're down to. We're also down to this fact: your platforms are hurting children. I'm not saying they're not doing some good things, but they're hurting children. And I know how to count votes. If this bill comes to the floor of the United States Senate, it will pass.
What we're going to have to do, and I say this with all the respect I can muster, is convince my good friend Senator Schumer to go to Amazon, buy a spine online, and bring this bill to the Senate floor. And the House will then pass it. Now, that's one person's opinion. I may be wrong, but I doubt it.
Mr. Zuckerberg, let me ask you a couple of questions. I may wax a little philosophical here. I have to hand it to you: you have convinced over 2 billion people to give up all of their personal information, every bit of it, in exchange for getting to see what their high school friends had for dinner Saturday night. That's pretty much your business model, isn't it? It's not how I would characterize it. We give people the ability to connect with the people they care about and to engage with the topics that they care about.
And you take this information, this abundance of personal information, and you develop algorithms to punch people's hot buttons, and you steer to them information that punches their hot buttons again and again and again, to keep them coming back and to keep them staying longer. And as a result, your users see only one side of an issue. And so, to some extent, your platform has become a killing field for the truth, hasn't it?
Senator, I disagree with that characterization. You know, we build ranking and recommendations because people have a lot of friends and a lot of interests and they want to make sure that they see the content that's relevant to them. We're trying to make a product that's useful to people and make our services as helpful as possible for people to connect with the people they care about and the interests they care about.
But you don't show them both sides. You don't give them balanced information. You just keep punching their hot buttons, punching their hot buttons. You don't show them balanced information so people can discern the truth for themselves. And you rev them up so much that, so often, your platform and others become just cesspools of snark where nobody learns anything, don't they?
Senator, I disagree with that. I think people can engage in the things that they're interested in and learn quite a bit about those. We have done a handful of different experiments in the past around news, trying to show content from a diverse set of perspectives. I think that there's more that needs to be explored there, but I don't think that we can solve that by ourselves.
Do you think, and I'm sorry to cut you off, Mr. Zuckerberg, but I'm going to run out of time. Do you think your users really understand what they're giving to you, all their personal information, and how you process it and how you monetize it? Do you think people really understand?
Senator, I think people understand the basic terms. I actually think that a lot of people [inaudible]. Does your user agreement still suck?
Senator, I'm not quite sure what you're referring to, but I think people get the basic deal of using these services. It's a free service. You're using it to connect with the people you care about. If you share something with people, other people will be able to see your information. If you're putting something out there to be shared publicly or with a private set of people, you're inherently putting it out there. So I think people get that basic part of how this works.
Mr. Zuckerberg, you're in the foothills of creepy. You track people who aren't even Facebook users. You track your own people, your own users who are your product, even when they're not on Facebook. I mean, I'm going to land this plane pretty quickly, Mr. Chairman. I mean, it's creepy. And I understand you make a lot of money doing it, but I just wonder if our technology is greater than our humanity. I mean, let me ask you this final question. Instagram is harmful to young people, isn't it?
Senator, I disagree with that. That's not what the research shows on balance. That doesn't mean that individual people don't have issues, and that there aren't things that we need to do to help provide the right tools for people. But across all the research that we've done internally, in the survey that the senator previously cited, there are 12 or 15 different categories of harm where we asked teens if they felt that Instagram made it worse or better. And across all of them, except for the one that Senator Hawley cited, more people said that using Instagram. You've got to land this plane, Mr. Zuckerberg. Was neutral or positive. We just have to agree to disagree. If you believe, and I'm not saying it's intentional, but if you believe that Instagram is not hurting millions of our young people, particularly young teens, particularly young women, you shouldn't be driving. It is.
Thanks. Senator Butler. Thank you, Mr. Chair, and thank you to our panelists who've come to have an important conversation with us. Most importantly, I want to appreciate the families who have shown up to continue to be remarkable champions of your children and your loved ones, and in particular two California families that I was able to just talk to on the break, the families of Sammy Chapman from Los Angeles and Daniel Perta from Santa Clarita. They are here today and are doing some incredible work, not just to protect the memory and legacy of their boys; the work that they're doing is going to protect my nine-year-old. And that is indeed why we're here.
There are a couple questions that I want to ask some individuals. Let me start with a question for each of you. Mr. Citron, have you ever sat with a family and talked about their experience and what they need from your product? Yes or no? Yes, I have spoken with parents about how we can build tools to help them.
Mr. Spiegel, have you sat with families and young people to talk about your products and what they need from your product? Yes, Senator.
Mr. Chew? Yes, I just did it two weeks ago, for example. I don't want to know what you did for the hearing prep, Mr. Chew. I just want to know, do you hear anything in terms of designing the product that you are creating?
Mr. Zuckerberg, have you sat with parents and young people to talk about how you design product for your consumers? Yes, over the years I've had a lot of conversations with parents. You know, that's interesting, Mr. Zuckerberg, because we talked about this last night and you gave me a very different answer. I asked you this very question. Well, I told you that I didn't know what specific processes our company had before answering. No, Mr. Zuckerberg, you said to me that you had not. I must have misspoken. I want to give you the room to misspeak, Mr. Zuckerberg, but I asked you this very question. I asked all of you this question, and you told me a very different answer when we spoke. But I won't belabor it.
A number of you have talked about the, I'm sorry, X. Ms. Yaccarino, have you talked to parents directly, young people, about designing your product? As the new leader of X, the answer is yes. I've spoken to them about the behavioral patterns, because less than 1% of our users are in that age group, but yes, I have spoken to them. Thank you, ma'am.
Mr. Spiegel, there are a number of parents whose children have been able to access illegal drugs on your platform. What do you say to those parents? Senator, we are devastated that we cannot. To the parents. What do you say to those parents, Mr. Spiegel?
I'm so sorry that we have not been able to prevent these tragedies. We work very hard to block all search terms related to drugs from our platform. We proactively look for and detect drug-related content. We remove it from our platform, preserve it as evidence, and then refer it to law enforcement for action. We've worked together with nonprofits and with families on education campaigns, because the scale of the fentanyl epidemic is extraordinary. Over 100,000 people lost their lives last year, and we believe people need to know that one pill can kill. That campaign was viewed more than 260 million times on Snapchat.
Mr. Spiegel, there are two fathers in this room who lost their sons. They were 16 years old. Their children were able to get those pills from Snapchat. I know that there are statistics, and I know that there are good efforts. None of those efforts are keeping our kids from getting access to those drugs on your platform. As California companies, all of you, I've talked with you about what it means to be a good neighbor and what California families and American families should be expecting from you. You owe them more than just a set of statistics. And I look forward to all of you showing up on all pieces of this legislation to keep our children safe.
Mr. Zuckerberg, I want to come back to you. I talked with you about being a parent to a young child who doesn't have a phone and is not on social media at all. And one of the things that I am deeply concerned with, as a parent to a young Black girl, is the utilization of filters on your platform that would suggest to young girls utilizing your platform that they are not good enough as they are.
I want to ask more specifically, and refer to some unredacted court documents, which reveal that your own researchers concluded that these face filters that mimic plastic surgery negatively impact youth mental health indeed. Why should we believe that you are going to do more to protect young women and young girls when you give them the tools to affirm the self-hate that is spewed across your platforms? Why is it that we should believe that you are committed to doing anything more to keep our children safe? Senator, there's a lot to unpack there. There are a lot of tools for people to express themselves in different ways. People use face filters and different tools to make media and photos and videos that are fun or interesting across a lot of the different products that are created. Plastic surgery pins are good tools to express creativity?
Senator, I'm not speaking to that. Skin lightening tools are tools to express creativity? This is the direct thing that I'm asking about. I'm not defending any specific one of those. I think that the ability to filter and edit images is generally a useful tool for expression. Specifically, I'm not familiar with the study that you're referring to, but we did make it so that we're not recommending this type of content to teens. I made no reference to a study; the reference was to court documents that revealed your knowledge of the impact of these types of filters on young people generally, young girls in particular. Senator, I disagree with that characterization. I haven't seen any documents. Okay, Mr. Zuckerberg, my time is up. I hope that you hear what is being offered to you and are prepared to step up and do better. I know this Senate committee is going to do our work to hold you to greater account. Thank you, Mr. Chair.
Senator Tillis. Thank you, Mr. Chair. Thank you all for being here. I don't feel like I'm going to have an opportunity to ask a lot of questions, so I'm going to reserve the right to submit some for the record. But I have heard, we've had hearings like this before. I've been in the Senate for nine years. I've heard hearings like this before. I've heard horrible stories about people who have died, committed suicide, been embarrassed. Every year, we have an annual flogging every year. And what materially has occurred over the last nine years?
Do any of you all, just a yes or no question, participate in an industry consortium trying to make this fundamentally safe across platforms? Yes or no, Mr. Zuckerberg? Yes, there's a variety of organizations that we work with. Which organizations? Does anyone here not participate in an industry consortium? I actually think it would be immoral for you all to consider it a strategic advantage to keep proprietary something that would secure all these platforms. Do you all agree with that? Anybody who would say, you want ours because ours is the safest, and these others haven't figured out the secret sauce, you as an industry should realize this is an existential threat to you all if we don't get it right, right? I mean, you've got to secure your platforms. You've got to deal with this. Do you not have an inherent mandate to do this? Because it would seem to me that if you don't, you're going to cease to exist. I mean, we could regulate you out of business if we wanted to. And the reason I'm saying this, it may sound like a criticism, but it's not a criticism. I think we have to understand that there should be an inherent motivation for you to get this right. Or Congress will make a decision that could potentially put you out of business.
Here's the reason I have a concern with that, though. I just went on the internet while I was listening intently to all the other members speaking, and I found a dozen different platforms outside the United States, ten of which are in China, two of which are in Russia. Their daily average active membership numbers are in the billions.
Well, people say you can't get on China's version of TikTok. It took me one quick search on my favorite search engine to find out exactly how I could get an account on that platform today. And the other thing that we have to keep in mind: I come from technology.
I could figure out, ladies and gentlemen, how to influence your kid without them ever being on a social media platform. I can randomly send texts and get a bite, and then find out an email address and get compromising information. It is horrible to hear some of these stories that have been shared, and I've had these stories occur in my hometown in North Carolina.
But if we only come here and make a point today and don't start focusing on making a difference, which requires people to stop shouting and start listening and start passing language here, the bad actors are just going to be off our shores.
I have another question for you all. How many people, roughly, if you don't know the exact numbers, do you have looking 24 hours a day at these horrible images and filtering them out? Just go real quick with an answer down the line.
It's most of the roughly 40,000 people who work on safety.
And again? We have 2,300 people all over the world.
Okay. We have 40,000 trust and safety professionals around the world. We have approximately 2,000 people dedicated to trust and safety and content moderation. Our platform is much smaller than these folks'; we have hundreds of people looking at content, about 15% of our workforce. And I'm not even mentioning that these people have a horrible job. Many of them have to get counseling for all the things they see. We have evil people out there.
And we're not going to fix this by shouting at and talking past each other. We're going to fix this with every one of y'all being at the table, and hopefully coming closer to what I heard one person say, supporting a lot of the good bills, like one that I hope Senator Blackburn mentions when she gets a chance to talk.
But guys, if you're not at the table and securing these platforms, you're going to be on it. And the reason why I'm not okay with that is that if we ultimately destroy your ability to create value and drive you out of business, the evil people will find another way to get to these children.
And I do have to admit, I don't think my mom's watching this one, but there is good; we can't look past the good that is occurring. I've got a mom who lives in Nashville, Tennessee. I talked to her yesterday, and we talked about a Facebook post that she made a couple of days ago. That connects my 92-year-old mother with her grandchildren and great-grandchildren. That lets a kid who may feel awkward in school get into a group of people and relate to them. Let's not throw out the good because we haven't altogether focused on rooting out the bad.
Now, I guarantee you, I could go through some of your governance documents and find a reason to flag every single one of you because you didn't place the emphasis on it that I think you should. But at the end of the day, I find it hard to believe that any of you people started this business, some of you in your college dorm rooms, for the purposes of creating the evil that is being perpetrated on your platforms. But I hope that every single waking hour, you're doing everything you can to reduce it. You're not going to be able to eliminate it.
And I hope that there are some enterprising young tech people out there today that are going to go to parents and say, ladies and gentlemen, your children have a deadly weapon. They have a potentially deadly weapon, whether it's a phone or a tablet. You have to secure it. You can't assume that they're going to be honest and say that they're 16 when they're 12. We all have to recognize that we have a responsibility to play, and you guys are at the tip of the spear.
So I hope that we can get to a point where we are moving these bills. If you've got a problem with them, state your problem, and let's fix it. No is not an answer. And know that I want the United States to be the beacon for innovation, to be the beacon for safety, and to prevent people from using other options that have existed since the internet has existed to exploit people. And count me in as somebody who will try and help out. Thank you, Mr. Chair. Thank you, Senator Tillis. Next is Senator Ossoff.
Thank you, Mr. Chairman, and thank you to our witnesses today. Mr. Zuckerberg, I want to begin by just asking a simple question, which is, do you want kids to use your platform more or less?
Well, we don't want people under the age of 13 using our platform. Do you want teenagers 13 and up to use your platform more or less? Well, we would like to build a product that is useful and that people want to use. My time is going to be limited, so just, do you want them to use it more or less? Teenagers 13 to 17 years old, do you want them using Meta products more or less? I'd like them to be useful enough that they want to use them more.
You want them to use it more? I think herein we have one of the fundamental challenges. In fact, you have a fiduciary obligation, do you not, to try to get kids to use your platform more? It depends on how you define that. We obviously are a business. I'm sorry, Mr. Zuckerberg, our time is limited. It's self-evident that you have a fiduciary obligation to get your users, including users under 18, to use and engage with your platform more rather than less, correct? Over the long term, but in the near term we often take a lot of steps, including a change we made to show fewer videos on the platform, which reduced the amount of time spent by more than 50 million hours.
Okay, but if your shareholders ask you, Mark, and I'll call you Mr. Zuckerberg here, but your shareholders might be on a first-name basis with you: Mark, are you trying to get kids to use Meta products more or less? You'd say more, right? Well, I would say that over the long term we're trying to create the most value. Yeah. Let's look at the 10-K you file with the SEC. A few things I want to note. Here are some quotes, and this is a filing that you signed, correct? Yes. Yeah. Our financial performance has been and will continue to be significantly determined by our success in adding, retaining, and engaging active users.
Here's another quote. If our users decrease their level of engagement with our products, our revenue, financial results, and business may be significantly harmed. Here's another quote. We believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to, or as a substitute for, ours. It continues: in the event that users increasingly engage with other products and services, we may experience a decline in use and engagement in key demographics or more broadly, in which case our business would likely be harmed.
You have an obligation, as the chief executive, to encourage your team to get kids to use your platform more. Senator, I think this is. Is that not self-evident? You have a fiduciary obligation to your shareholders to get kids to use your platform more.
I think that the thing that's not intuitive is the direction is to make the products more useful, so that way people want to use them more. We don't give the teams running the Instagram feed or the Facebook feed a goal to increase the amount of time that people spend. Yeah, but you don't dispute, and your 10-K makes clear, you want your users engaging more and using the platform more.
And I think this gets to the root of the challenge because it's the overwhelming view of the public, certainly in my home state of Georgia. And we've had some discussions about the underlying science that this platform is harmful for children, and not just your platform, by the way, social media in general. You are familiar with the 2023 report from the Surgeon General about the impact of social media on kids' mental health, which cited evidence that kids who spend more than three hours a day on social media have double the risk of poor mental health outcomes, including depression and anxiety. Are you familiar with that Surgeon General report and the underlying study? I read the report, yes.
Do you dispute it? No, but I think it's important to characterize it correctly. I think what he was flagging in the report is that there seems to be a correlation. And obviously the mental health issue is very important, so it's something that needs to be studied further. Yeah, we know. The thing is, everyone knows there's a correlation. Everyone knows that kids who spend a lot of time, too much time, on your platforms are at risk. And it's not just the mental health issues. I mean, let me ask you a question. Is your platform safe for kids? I believe it is.
But there's a difference between correlation and causation. Because we're not going to be able to get anywhere. We want to work in a productive, open, honest, and collaborative way with the private sector to pass legislation that will protect Americans, that will protect American children above all, and that will allow businesses to thrive in this country. If we don't start with an open, honest, candid, realistic assessment of the issues, we can't do that. The first point is you want kids to use the platform more.
In fact, you have an obligation to. But if you're not willing to acknowledge that it's a dangerous place for children, the Internet is a dangerous place for children.
Not just your platform, isn't it? Isn't the Internet a dangerous place for children? I think it can be. Yeah, there are both great things that people can do, and there are harms that we need to work together to address.
Yeah, it's a dangerous place for children. There are families here who have lost their children. There are families across the country whose children have engaged in self-harm, who have experienced low self-esteem, who have been sold deadly pills on the Internet. The Internet is a dangerous place for children, and your platforms are dangerous places for children. Do you agree? I think that there are harms that we need to work together to address.
I'm not going to, I think overall there is. Why not? Why not? Why not just acknowledge it? Why do we have to be so careful about the question? I disagree with the characterization that you have. Which characterization? That the Internet is a dangerous place for children? I think you're trying to characterize our products as inherently dangerous, and I think that for sure.
Inherent or not, your products are places where children can experience harm. They can experience harm to their mental health. They can be sold drugs. They can be preyed upon by predators. They're dangerous places. And yet, you have an obligation to promote the use of these platforms by children.
All I'm trying to suggest to you, Mr. Zuckerberg, my time is running short, is that in order for you to succeed, you and your colleagues here, we have to acknowledge these basic truths. We have to be able to come before the American people, the American public, the people in my state of Georgia, and acknowledge the Internet is dangerous, including your platforms. There are predators lurking. There are drugs being sold. There are harms to mental health that are taking a huge toll on kids' quality of life. And yet, you have this incentive, not just you, Mr. Zuckerberg. All of you have an incentive to boost and maximize use, utilization, and engagement. And that is where public policy has to step in to make sure that these platforms are safe for kids, so kids are not dying, so kids are not overdosing, so kids are not cutting themselves or killing themselves because they're spending all day scrolling instead of playing outside. And I appreciate all of you for your testimony. We will continue to engage as we develop this legislation. Thank you.
The Senator from Tennessee. Thank you, Mr. Chairman.
Thank you to each of you for coming. And I know some of you had to be subpoenaed to get here. But we do appreciate that you all are here.
Mr. Chew, I want to come to you first. We've heard that you're looking at putting a headquarters in Nashville, and likewise in Silicon Valley and Seattle. And what you're going to find, probably, is that the welcome mat is not going to be rolled out for you in Nashville like it would be in California. There are a lot of people in Tennessee that are very concerned about the way TikTok is basically building dossiers on our kids, the way they are building those on their virtual you, and also that that information is held in China, in Beijing, as you responded to Senator Blumenthal and me last year in reference to that question. And we also know that a major music label yesterday said they were pulling all of their content off your site because of your issues on payment, on artificial intelligence, and because of the negative impact on our kids' mental health. So we will see how that progresses.
Mr. Zuckerberg, I want to come to you. Senator Blumenthal and I, of course, have had some internal documents and emails that have come our way. One of the things that really concerned me is that you referred to your young users in terms of their lifetime value of being roughly $270 per teenager. And each of you should be looking at these kids. The T-shirts they're wearing today say I'm worth more than $270. We've got some standing up in those T-shirts. Now, some of the children from our state, some of the children, the parents that we have worked with. Just to think, whether it is Becca Schmidt, David Malak, Sarah Flatt, Ann Lee Shout. Would you say that a life is only worth $270? What could possibly lead you? I mean, I listened to that. I know you're a dad. I'm a mom. I'm a grandmom. And how could you possibly even have that thought? It is astounding to me. And I think this is one of the reasons that 42 states are now suing you because of features that they consider to be addictive that you are pushing forward. And in the emails that we've got from 2021, that go from August to November, there is the staff plan that is being discussed. Antigone Davis, Nick Clegg, Sheryl Sandberg, Chris Cox, Alex Schultz, and Adam Mosseri are all on this chain of emails on the well-being plan. And then we get to one: Nick did email Mark to emphasize his support for the package, but it sounds like it lost out to various other pressures and priorities. See, this is what bothers us. Children are not your priority. Children are your product. Children, you see as a way to make money. And protecting children in this virtual space, you made a conscious decision, even though Nick Clegg and others were going through the process of saying, this is what we do. These documents are really illuminating. And it just shows me that growing this business, expanding your revenue, what you were going to put on those quarterly filings, that was the priority.
And the other thing is very clear: the children were not.
I want to talk with you about the pedophile ring, because that came up earlier, and the Wall Street Journal reported on that. And one of the things that we found out was, after that became evident, you didn't take that content down. And you didn't take it down because it didn't violate your community standards. Do you know how often a child is bought or sold for sex in this country? Every two minutes. Every two minutes, a child is bought or sold for sex. That's not my stat. That is a TBI stat. Now, finally, this content was taken down after a congressional staffer went to Meta's global head of safety. So would you please explain to me and to all these parents why explicit predatory content does not violate your platform's terms of service or your community standards?
Sure, so let me try to address all of the things that you just said. It does violate our standards. We work very hard to take it down. We've reported, I think it's more than 26 million examples of this kind of content. Didn't take it down until a congressional staffer brought it up. It may be that in this case we made a mistake and missed something. I think you make a lot of mistakes. So let's move on.
I want to talk with you about your Instagram creators program and about the push, we found out through these documents, that you actually are pushing forward because you want to bring kids in early. You see these younger teenagers as a valuable but untapped audience, quoting from the emails, and suggest teens act as household influencers to bring their younger siblings onto your platform, onto Instagram. Now, how can you ensure that Instagram creators, your product, your program, does not facilitate illegal activities when you fail to remove content pertaining to the sale of minors? And it is happening once every two minutes in this country.
Senator, our tools for identifying that kind of content are industry leading. That doesn't mean we're perfect. There are definitely issues that we have. But we continue to work on this. Yes, there is a lot that is slipping through. It appears that you're trying to be the premier sex trafficking site. Of course not, Senator. That's ridiculous. No, it is not ridiculous. Of course we don't want this. We don't want this content on our platforms. Why don't you take it down? We do take it down. We do more work to take it down than anyone. No, you're not. You are not.
And the problem is we've been working on this. Senator Welch is over there. We've been working on this stuff for a decade. You have an army of lawyers and lobbyists that have fought us on this every step of the way. You work with NetChoice, the Cato Institute, the Taxpayers Protection Alliance, and Chamber of Progress to actually fight our bipartisan legislation to keep kids safe online. So are you going to stop funding these groups? Are you going to stop lobbying against this and come to the table and work with us? Yes or no? Senator, we have a. Yes or no? Of course we'll work with you on the legislation. Okay. The door is open. We've got all these bills. You need to come to the table. Each and every one of you need to come to the table. And you need to work with us. Kids are dying.
Senator Welch? I'm going to thank my colleague, Senator Blackburn, for a decade of work on this. I actually have some optimism. There is a consensus today that didn't exist, say, ten years ago, that there is a profound threat to children, to mental health, to safety. There's not a dispute. That was in debate before. That's a starting point. Secondly, we're identifying concrete things that can be done in four different areas. One is industry standards. Two is legislation. Three are the courts. And then four is a proposal that Senator Bennett, Senator Graham, myself and Senator Warren have to establish an agency, a governmental agency whose responsibility would be to engage in this on a systematic, regular basis with proper resources. And I just want to go through those.
I appreciate the industry standard decisions and steps that you've taken in your companies. But it's not enough. And that's what I think you're hearing from my colleagues. Like, for instance, where there are layoffs in the trust and safety programs. That's alarming, because it looks like there is a reduction in emphasis on protection. Like, you just added, you know, 100 employees in Texas in this category. And how many did you have before? The company has just come through a significant restructuring. So we've increased the number of trust and safety employees and agents all over the world by at least 10% so far in the last 14 months, and we will continue to do so, specifically in Austin, Texas.
All right, Mr. Zuckerberg, my understanding is there have been layoffs in that area as well. They've added jobs at X, but have there been reductions in that at Meta? There have been across the board, not really focused on that area. I think our investment is relatively consistent over the last couple of years. We invested almost $5 billion in this work last year, and I think this year will be on the same order of magnitude.
All right, and another question that's come up is when to the horror of a user of any of your platforms, somebody has an image on there that's very compromising, often of a sexual nature. Is there any reason in the world why a person who wants to take that down can't have a very simple same day response to have it taken down?
I'll start with Twitter. I'm sorry, Senator. I was taking notes. Could you repeat the question?
Well, there's a lot of examples of a young person finding out about an image that is of them and really compromises them and actually can create suicidal thoughts, and they want to call up or they want to send an email and say, take it down.
I mean, why is it not possible for that to be responded to immediately?
We all strive to take down any type of violative content or disturbing content immediately. At X we have increased our capabilities with a two-step reporting process.
If I'm a parent or I'm a kid and I want this down, shouldn't there be methods in place where it comes down?
You see what the image is? Yes. An ecosystem-wide standard would improve and actually enhance the experience for users at all our platforms.
There actually is an organization that I think a number of the companies up here are a part of called Take It Down. It's some technology that we and a few others built that basically. You all are in favor of that. This is going to give some peace of mind to the people. It really, really matters.
I don't have that much time. So we've talked about the legislation and Senator Whitehouse had asked you to get back with your position on Section 230, which I'll go to in a minute. But I would welcome each of you responding as to your company's position on the bills that are under consideration in this hearing. All right? I'm just asking you to do that.
Third, the courts, this big question of Section 230. And today, I'm pretty inspired by the presence of the parents who have turned their extraordinary grief into action, in the hope that other parents may not have to suffer what is, for them and for everyone, a devastating loss.
Senator Whitehouse asked you all to get back very concretely about Section 230 and your position on that. But it's an astonishing benefit that your industry has, that no other industry has. They just don't have to worry about being held accountable in court if they're negligent.
So you've got some explaining to do, and I'm just reinforcing Senator Whitehouse's request that you get back specifically about that. And then finally, I want to ask about this notion. It's this idea of a federal agency who's resourced and whose job is to be dealing with public interest matters that are really affected by big tech.
It's extraordinary what has happened in our economy with technology, and your companies represent innovation and success. But just as when the railroads were ascendant and were in charge and ripping off farmers because of practices they were able to get away with.
Just as when Wall Street was flying high, but there were no blue sky laws regulating it, we now have a whole new world in the economy.
And Mr. Zuckerberg, I remember you testifying in the Energy and Commerce Committee, and I asked you your position on the concept of a federal regulatory agency. My recollection is that you were positive about that. Is that still the case?
I think it could be a reasonable solution. There are obviously pros and cons to doing that versus through the normal, the current structure of having different regulatory agencies focused on specific issues. But because a lot of the things trade off against each other, one of the topics that we talked about today is encryption, and that's obviously really important for privacy and security. Can we just go down the line?
Senator, I think the industry initiative to keep those conversations going would be something X would be very, very proactive about. If you think about our support of the REPORT Act, the SHIELD Act, the STOP CSAM Act, and our support of the Project Safe Childhood Act today, I think our intentions to participate here are clear. Senator, we support national privacy legislation, for example, so that sounds like a good idea. We just need to understand what it means.
Senator, we'll continue to work with your team, and we'd certainly be open to exploring the right regulatory body for big technology. But the idea of a regulatory body is something that you can see has merit.
We're very open to working with you and our peers and anybody on helping make the Internet a safer place. I think you mentioned this is not a one-platform problem, so we do look to collaborate with other companies and with nonprofits in the government.
Thank you all. Mr. Chairman, I yield back. Thank you, Senator Welch.
We're going to conclude this hearing, and thank you all for coming today. You probably have your scorecard out there. You've met at least 20 members of this committee and have your own impressions of their questioning and approach and the like. But the one thing I want to make clear as Chairman of this committee for the last three years is this was an extraordinary vote on an extraordinary issue a year ago. We passed five bills unanimously in this committee. You heard all the senators. Every spot on the political spectrum was covered. Every single senator voted unanimously in favor of the five pieces of legislation we've discussed today. It ought to tell everyone who follows Capitol Hill in Washington a pretty stark message. We get it. And we live it. As parents and grandparents, we know what our daughters and sons and others are going through. They cannot cope. They cannot handle this issue on their own. They're counting on us as much as they're counting on the industry to do the responsible thing.
And some will leave with impressions of our witnesses and the companies they represent, and that is your right as an American citizen. But you want to also leave with the determination to keep the spotlight on us to do something, not just to hold a hearing, bring out a good strong crowd of supporters for change, but to get something done. No excuses. No excuses. We've got to bring this to a vote. What I found in my time in the House and the Senate is that's the moment of reckoning. Speeches, press releases, and the like notwithstanding, the moment of reckoning is when we call a vote on these measures. It's time to do that. I don't believe there's ever been a moment in America's wonderful history where a business or industry has stepped up and said regulate us, put some legal limits on us. And businesses exist by and large to be profitable. And I think that we've got to get behind that and ask: profitability at what cost? Senator Kennedy, our Republican colleague, said, is our technology greater than our humanity? I think that is a fundamental question that he asked. And I would add to it: are our politics greater than our technology? We're going to find out.
I want to thank a few people before we close up here. I've got several staffers who worked so hard on this. Alexander Galber. Thank you very much. Alexander. Jeff Hanson. Scott Jordan.
The last point I'll make, Mr. Zuckerberg, is just a little advice to you. I think your opening statement on mental health needs to be explained, because I don't think it makes any sense. There isn't a parent in this room who's had a child go through an emotional experience like this that wouldn't tell you and me: they changed right in front of my eyes. They changed. They holed themselves up in their room. They no longer reached out to their friends. They lost all interest in school. These are mental health consequences that I think come with the abuse of this right, the access they have to this kind of technology.