TikTok, Snapchat and YouTube, all social media sites popular with teens and young adults, faced a barrage of questions and accusations Tuesday from lawmakers who want the companies to do more to protect children online.
Executives from all three companies committed to sharing internal research on how their products affect kids -- an issue that has come to the forefront in the past several weeks as tens of thousands of pages of Facebook's internal documents have been revealed by a whistleblower.
It was the first time TikTok and Snap, the parent company of Snapchat, had testified before the legislative body, despite their popularity and Congress' increasing focus on tech industry practices. By contrast, Facebook representatives have testified 30 times over the past four years, and Twitter executives have testified 18 times total.
Tuesday's hearing, convened by Sen. Richard Blumenthal, D-Conn., in front of the Senate Commerce Committee's consumer protection panel, drilled into how children's data is protected online, how features such as autoplay and "likes" affect teenagers' experiences, and what the companies are doing to rid their sites of harmful behavior, including bullying and drug sales.
It is unclear exactly what data the companies agreed to disclose and whether they'll disclose new research they conduct.
After the Wall Street Journal reported on research that Facebook had conducted into how Instagram affects teens, the company released heavily redacted and annotated slide decks on the findings. Lawmakers have called on the company to release its full trove of research on the matter, and some have suggested subpoenaing the company to get it.
Blumenthal warned the testifying companies that simply being different from Facebook is not a defense.
"That bar is in the gutter," he said. "What we want is not a race to the bottom but, really, a race to the top."
Blumenthal accused the companies of drawing young people further and further into their products: "Everything that you do is to add users, especially kids, and keep them on your apps for longer."
Facebook has been under fire for the way its sites, particularly Instagram, harm teenagers' mental health after a whistleblower revealed a trove of internal documents. Some of the documents showed that teen girls reported Instagram made their body image issues worse.
Last month, Facebook executive Antigone Davis testified and faced accusations from senators that the company buried research about how its products may harm children. Facebook has defended its track record, and Davis said at the hearing that the company's research in fact showed that teen girls struggling with mental health issues largely reported that they found Instagram to be more helpful than not.
When Facebook whistleblower Frances Haugen testified before the subcommittee this month, lawmakers said her disclosures could mark a turning point in efforts to regulate the tech giants. "I think the time has come for action, and I think you are the catalyst for that action," Sen. Amy Klobuchar, D-Minn., told Haugen.
"There has been a deafening ... drumbeat of continuing disclosures about Facebook. They have deepened America's concern and outrage and have led to increasing calls for accountability, and there will be accountability," Blumenthal said. "This time is different."
Lawmakers have increasingly begun to discuss legislation meant to better protect children. A group of House lawmakers has proposed a bill that would open the platforms up to litigation if their algorithms amplify content tied to severe harm. And Blumenthal suggested Tuesday that U.S. officials could adopt a children's design code similar to one that recently took effect in Britain that applies new rules to how companies use children's data.
Any new rules would have to make it through a gridlocked Congress. But child protection proposals don't necessarily face the partisan divides that may stymie other attempts to regulate the tech giants.
"It's one of the few areas where Congress can actually do something, and there is bipartisan consensus," said Nu Wexler, a former communications staff member for tech companies and lawmakers. "For legislators, in some ways child safety is the path of least resistance."
Snap and TikTok have faced far less scrutiny from the government, including for how they affect children, despite having huge numbers of users. TikTok says it has more than 1 billion monthly users, though it does not break down their ages. Snapchat has 500 million monthly active users and says more than 80% of its U.S. users are older than 18.
Even YouTube, where billions of videos are watched every day, has been overlooked at times by government tech investigations. Experts say this hearing is a good start at examining companies other than the biggest few.
"Facebook is just not the only game in town," said Harvard Law School lecturer Evelyn Douek, who studies the regulation of online speech. "If we're going to talk about teen users, we should talk about the platforms that teens actually use, which is TikTok, Snapchat and YouTube."
The executives defended their approach Tuesday, arguing that they continually build features to better protect young users.
While the executives all broadly expressed support for legislation to boost protections for kids, including on privacy, senators expressed frustration that they wouldn't commit to supporting specific proposals.
Sen. Ed Markey, D-Mass., a top advocate for children's online safety who has introduced a bill to expand safeguards under federal children's privacy laws, hammered some of the companies for not taking a firm stance on the measure.
After Snap executive Jennifer Stout declined to support his measure, Markey said: "This is just what drives us crazy. 'We want to talk, we want to talk, we want to talk.' This bill's been out there for years, and you still don't have a view on it. Do you support it or not?"
Stout replied, "I think there are things we would like to work with you on, senator."
TikTok executive Michael Beckerman said the company would be "happy to support" the bill if lawmakers improved how it handles verifying children's ages online.
Blumenthal echoed Markey's frustration when the executives declined to come out in support of his bill to make it easier to sue companies over child exploitative material on their sites.
"This is the talk that we've seen again and again and again and again: 'We support the goals.' But that's meaningless if you don't support the legislation," he said.
The three companies have faced some public backlash for the way they treat children. YouTube parent Google in 2019 agreed to pay $170 million to settle allegations that it illegally collected data about children younger than 13 who watched toy videos and television shows on its site.
Snapchat and TikTok have both faced pressure to stop illegal drug sales and connections on their sites, particularly as overdose deaths have soared. Parent groups have called on the sites to do more to stop drug trafficking as children die of fentanyl poisoning.
Klobuchar questioned Stout on the company's actions to rid the app of drug dealers, something Stout said was a priority for the company.
Still, Klobuchar suggested changing the law to hold companies liable could speed up the process.
"So maybe that will make you work even faster, so we don't lose another kid," she said.
Several senators also brought up teens' mental health, especially as it relates to eating disorders. The companies all said that any material encouraging eating disorders violates their content policies, and that they work to keep it off their sites and instead point users to expert sources.
"We, again, prohibit the type of content that glorifies or promotes these issues, such as eating disorders," YouTube executive Leslie Miller said.
Miller refused to be pinned down on company research in a series of exchanges with senators. When Blumenthal asked whether the companies would allow independent researchers access to algorithms, data sets and data privacy practices, Miller responded, "It would depend on the details, but we're always looking to partner with experts in these important fields."
Blumenthal shot back that YouTube's answer "indicates certainly a strong hesitancy if not resistance to providing access."
Lawmakers also spent significant time grilling TikTok on its ownership -- its parent company is Chinese firm ByteDance -- after Sens. Marsha Blackburn, R-Tenn., Ted Cruz, R-Texas, and John Thune, R-S.D., brought up concerns about data privacy.
Beckerman said TikTok's information about U.S. users is stored in this country, echoing what the company has said in the past.
Critics have long argued that the company would be obligated to turn Americans' data over to the Chinese government if asked.
"Access controls for our data is done by our U.S. teams," said Beckerman. "And as independent researchers, independent experts have pointed out, the data that TikTok has on the app is not of a national security importance and is of low sensitivity."
Beckerman also said in a statement before the hearing, "We know that trust must be earned through action, and we continue to build age-appropriate experiences for teens throughout their development and empower families with parental controls."
TikTok disables direct messages for accounts whose owners are younger than 16 and sets direct messages off by default for 16- and 17-year-olds.
Snap has emphasized its safety features, including showing users' locations on a map feature only to friends they have added.
Stout also sought to differentiate the platform from some of its competitors. She said social media "evolved to feature an endless feed of unvetted content, exposing individuals to a flood of viral and misleading information. Snapchat is different. Snapchat was built as an antidote to social media."
Beckerman also said TikTok was different from other platforms that focus more on direct communication between users. "It's about uplifting, entertaining content," he said. "People love it."
Information for this article was contributed by Rachel Lerman and Cristiano Lima of The Washington Post; and by David McCabe, Kate Conger and Daisuke Wakabayashi of The New York Times.