Feb 5, 2024

The social media hearings in Congress.

Mark Zuckerberg, CEO of Meta, apologizes to families who have been harmed due to unsafe social media. Image: Tom Williams/CQ-Roll Call, Inc via Getty Images

Plus, does the "deep state" really exist?

I’m Isaac Saul, and this is Tangle: an independent, nonpartisan, subscriber-supported politics newsletter that summarizes the best arguments from across the political spectrum on the news of the day — then “my take.”

Are you new here? Get free emails to your inbox daily. Would you rather listen? You can find our podcast here.


Today's read: 13 minutes.

💻
What should Congress be doing about the dangers of social media? Plus, is the "deep state" a real thing?

Reader feedback.

I'm going to make a more concerted effort to share reader feedback in each newsletter. Typically, I'll try to share criticisms of our coverage or "my take" to offer an alternate opinion. But here is another example of two opposing reactions to the same newsletter, Friday’s proposal for comprehensive immigration reform:

  • "Your view seems to have shifted left and we have enough slant to the left now. Hard to believe you can’t see it! Unlike the left, however, I am not trying to deprive you of your view. I just don’t want to hear it."
  • "I am now aware that a majority of your subscribers are conservatives, so naturally you have to placate them to some degree. While I appreciate your efforts, it's clear you have a lean-right perspective of some necessity."

Quick hits.

  1. President Biden won the South Carolina Democratic primary with 96% of the vote, outperforming some polls in the state by as much as 20%. (The results)
  2. Senate negotiators released the text of their Ukraine and Israel funding bill, which also functions as border security legislation. The $118 billion bill includes $20 billion for the border, increases security, limits how asylum can be used to enter the U.S., and provides funding for more judges and asylum officers. (The bill)
  3. The federal trial brought by Special Counsel Jack Smith against former President Donald Trump for alleged election interference, initially scheduled to begin March 4, has been delayed indefinitely while Trump argues in a federal appeals court that he is immune from prosecution. (The delay) Separately, more than $50 million of former President Trump's political fundraising was reportedly spent on legal fees last year. (The fees)
  4. The U.S. economy added 353,000 non-farm jobs in January, far exceeding expectations. Separately, the Federal Reserve opted not to cut interest rates, leaving them between 5.25% and 5.5%. (The report)
  5. U.S. forces struck 85 targets affiliated with Iran-backed militants in Iraq and Syria. Separately, U.S. and United Kingdom forces continued to strike Houthi outposts in Yemen. (The strikes)

Today's topic.

The social media hearings in Congress. On Wednesday, the chief executives of Meta, TikTok, Snap, Discord, and X (formerly known as Twitter) testified in a Senate Judiciary Committee hearing about the dangers of social media and online child exploitation. The hearing began with testimony from victims who said they or their children had been exploited on these platforms. Parents who had lost children to suicide held up photos of their children throughout the hearing.

“They’re responsible for many of the dangers our children face online,” Senate Majority Whip Dick Durbin (D-IL) said in opening remarks about the executives. “Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”

Perhaps the most notable moment came during testimony from Meta CEO Mark Zuckerberg, who was being questioned by Sen. Josh Hawley (R-MO). At one point, Hawley asked Zuckerberg if he had personally compensated any victims or their families for what they have been through.

“I don’t think so,” Zuckerberg said.

“There’s families of victims here,” Hawley said. “Would you like to apologize to them?”

Zuckerberg then stood, turned away from the senators, and apologized directly to the parents in the gallery.

“I’m sorry for everything you have all been through," he said. "No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer.”

Despite bipartisan consensus about the dangers of these platforms, members of Congress have yet to rally behind any specific legislative solutions. In 2022, the Kids Online Safety Act was proposed on a bipartisan basis but has yet to gain traction. The bill would require social media platforms, video game websites, and messaging apps to take "reasonable measures" to prevent harm like bullying, harassment, sexual exploitation, self-harm, predatory marketing, and eating disorders. It would require every service to automatically have the highest safety and privacy settings for any users under the age of 18.

TikTok CEO Shou Chew also faced questions about the app's ties to China. Chew, who is Singaporean, repeatedly reminded senators that he is not Chinese and that TikTok is not available in mainland China (a very similar app called Douyin is). Chew also repeatedly denied any link between the Chinese Communist Party and TikTok, and insisted that content critical of China was permitted and regularly circulated on the platform.

Today, we're going to take a look at some arguments from the right and left about the hearings, then my take.


What the right is saying.

  • Many on the right think the time has come for Congress to intervene and regulate the business practices of social media companies.
  • Some criticize senators as being more interested in creating a spectacle than addressing the issue at hand. 
  • Others call for strict new regulations, like banning social media use for anyone under 18.

The New York Post editorial board said “social media CEOs won’t stop their products from harming kids until they’re forced to.”

“TikTok’s parent ByteDance pulled in $24.5 billion in revenue in just the first quarter of 2023, while Meta made more than $28 billion. So the ‘child safety’ spending is barely a drop in the bucket, even just compared to what they spend on tweaking their algorithms to boost the addictive factors,” the board wrote. “Most sites may have a minimum age of 13 to join, but verification is practically nonexistent, so younger children easily gain access all the time by simply ‘certifying’ that they’re old enough. That’s only ‘safety’ for the company — from lawsuits. As for the impact: Social-media sites are bottomless cesspools of content that drive anxiety, depression and body-image issues in America’s youth.”

“If any other product was causing such harm to children, it’d get yanked off the market or at least regulated heavily. Big Tobacco played dumb over the impact of its products for decades, even while maximizing addictive properties, too. How are TikTok and Meta any better? Actually, they’re worse: They deliver their dopamine-rushes for free. We’re not often fans of government regulation, but social-media giants have shown again and again they can’t be trusted. They’ll choose the path of the most profit, no matter the impact on mental health or society. A hearing is not enough. Congress and the administration need to get serious about actual penalties for bad behavior.”

In The American Spectator, Aubrey Gulick wrote “yelling at CEOs doesn’t protect kids.”

The hearing “turned into a perfect opportunity for the politicians involved to denounce tech CEO’s — to ‘hold them accountable’ — without actually doing anything about the problem at hand,” Gulick said. “The heart of the issue is that parents are the ones handing their children smart phones, not Big Tech CEOs. The first line of defense when it comes to protecting kids from online exploitation by predators and from harmful social media posts is parents.

“Social media platforms should be the second line of defense when it comes to protecting kids. Meta, X, TikTok, Snapchat, and Discord should absolutely take responsibility for and work to prevent sexual predators on their platforms. Regulation encouraging them to do so could even be good. Unfortunately, protecting kids on social media is not as easy as yelling at Zuckerberg and his ilk, as fun as that might be.”

In National Review, Rich Lowry argued “kids shouldn’t be on social media at all.”

“Congress should press the brakes on the revolution that has given Mark Zuckerberg and other tech titans an outsized role in raising our kids and require that users of social media be age 18 or older. Surely, it’s not too much to ask that Zuckerberg and Co. make their fortunes exclusively off adults,” Lowry said. “Congress has already imposed an age limit, just in the wrong place. The Children’s Online Privacy Protection Act prevents the companies from collecting personal information from children under age 13, effectively prohibiting them from social media. But 13 draws the line much too young.”

“Let’s say the research and everyone’s intuition is wrong, and social media aren’t driving worse outcomes for kids. What’s the harm in staying off social media until they’re older? That kids will miss out on the latest absurd and perhaps dangerous TikTok trend? That they won’t get to envy people posting photos on Instagram to make themselves look more interesting and beautiful than they really are? That they will talk to their families and friends more and engage in more activities in the real world?” Lowry asked. “Once every teen isn’t on social media, it becomes easier to stop teens from using social media.”


What the left is saying.

  • The left is troubled by the effects of social media on kids and calls for regulation to address the issue.
  • Some argue the reforms these companies are enacting on their own fall well short. 
  • Others doubt that any meaningful change will come out of the hearing.

The St. Louis Post-Dispatch editorial board said “protecting kids online should be both technologically and politically possible.”

“The hearing showed how much bipartisan agreement there is on the particular urgency of combating online child sexual exploitation, revenge porn, social media harassment and other scourges that have made childhood a more treacherous landscape than it was before the digital age,” the board wrote. “Unlike much of the overblown hysteria about, for example, the supposed censorship of conservative opinion on the internet, child sexual exploitation and related threats are real. And both political parties are increasingly insisting that the tech platforms have an obligation to deal with them.”

“To the argument that fully filtering out even just dangers to kids would be an impossibly huge order for the platforms, we would counter by noting the amazing things social media companies can do today. Advanced algorithms, artificial intelligence and other mind-blowing developments indicate there’s virtually no technological goal these tech titans can’t achieve when properly motivated. And now, as before there even was an internet, nothing motivates entrepreneurs like a threat to their bottom line.”

In CNN, Kara Alaimo wrote “Zuckerberg’s extraordinary apology should only be the beginning.”

“In the leadup to the testimony, tech companies announced new initiatives to protect kids… But it’s not enough. Lawmakers and tech companies need to do much more to protect our kids,” Alaimo said. “While they may claim they’re showing kids less harmful posts, it’s still unclear how they’re identifying what’s harmful. Tech companies should work with experts in adolescent health to develop and share standards for content that can be shown to kids so that potentially harmful posts about things like body image and mental health don’t appear on their feeds at all, and so content creators know what the rules are and can try to follow them.”

“Tech executives promised to protect kids in their testimony to senators. But they didn’t promise to do what’s actually needed to safeguard kids’ physical and mental well-being. To protect kids on their apps, they need to create and enforce better standards for content shown to kids, along with more human moderators, mental health resources, lessons for kids and disclosures when content has been manipulated. And lawmakers need to pass legislation to crack down on online sexual exploitation. These kinds of solutions would give parents something to actually like.”

In The Washington Post, Adam Lashinsky described the result of the hearing — and of others before it — as “nothing.”

“It all stacked up as another reminder that government cannot ride herd on industry anymore. At least cigarette smokers have long been warned by the U.S. surgeon general that smoking is dangerous. Last year, the surgeon general issued an advisory linking social media with mental health concerns. Do the hardware and software that have all sorts of repercussions in our lives (and that of our children) come with any warnings or alerts?” Lashinsky asked.

“Not to date. And that’s usually the only takeaway whenever Silicon Valley comes to Washington: nothing. Congress, and President Biden too, have been vocal about the problems with social media while doing approximately nothing about them. Repeated efforts to weaken liability shields that benefit big tech companies as well as measures aimed specifically at protecting children have failed to pass Congress despite bipartisan support.”


My take.

Reminder: "My take" is a section where I give myself space to share my own personal opinion. If you have feedback, criticism, or compliments, don't unsubscribe. Write in by replying to this email, or leave a comment.

  • Social media usage can undoubtedly be harmful for teenagers, though the danger varies depending on the platform and usage.
  • This is another area where I don’t support heavy-handed government intervention.
  • A good awareness campaign, paired with small adjustments to existing legislation, would go a long way.

I don't think there is any real doubt about the harmful effects of social media on teenagers.

Meta's own internal research has shown the harmful effects its platforms have on teen girls. Instagram in particular has made girls feel bad about their bodies, increased their anxiety and depression, and often led to suicidal thoughts. These kinds of experiences come just from using the apps — they are separate from the other threats on these platforms, like predatory users, bullying, or sexual exploitation.

Their studies have also concluded that Instagram specifically heightens unhealthy "social comparison," unlike TikTok, which is centered around performance, and Snapchat, which promotes talking and inside jokes (but also makes teens vulnerable to bullying).

These kinds of differences are instructive in that they should remind legislators that no two apps are the same, and the risks from their use can be very different. TikTok, for instance, can be a bastion of misinformation, while Instagram is a place where a teenager might develop an eating disorder (of course neither app has a monopoly on those issues, but the research has fleshed out real differences between them).

When I've written about mass shootings, I've often talked about the blame pyramid, starting at the top then working down: The shooter who makes the decision to inflict mass harm; the people around them who see the warnings but do nothing; the failures of law enforcement to address a threat when it is reported; the laws in place that fail to give proper tools for preventing mass violence.

I think a similar blame pyramid can be constructed here. In this case, most people agree that minors who end up the victims of exploitation or develop anxiety and depression are the least culpable, as they are just kids trying to find their way (minors who use these apps as tools of bullying certainly exist somewhere on the pyramid). Instead, more blame can go to the incredibly influential social media personalities who create dangerous content; the parents, who are fundamentally responsible for the things their kids consume; the platforms, which need to make algorithmic changes and implement policies to keep minor users safe; and the legislators, who have to think about ways to address this with a soft touch.

As for the actions that should follow these hearings, I’ll be direct: I don't think Congress should pursue a heavy-handed governmental approach here. I've written before about my support for Section 230 and the necessity of an open and free internet. Some of Congress's solutions, like the Kids Online Safety Act, might be getting bipartisan support — but I think they constitute major overreaches and put dangerously broad limits on internet companies. That's why digital rights groups like the Electronic Frontier Foundation (EFF) have come out strongly against them.

Instead, there are two major changes I think Congress can and should push for: First is a disclosure requirement for targeted advertising, which would force platforms to inform a minor why they are being shown a particular advertisement (and, frankly, should also apply to adults). This is a change supported by the EFF that has the powerful effect of pulling back the curtain on how these platforms work. Second is to enact small changes to Section 230 that give plaintiffs a pathway to sue major companies if they can prove a direct connection between a criminal act and the platform's promotion of criminal behavior.

Congress does have a role here, but it should not be a major one. If the government is going to invest in anything, it should spend money on continuing to raise awareness (for parents and teens) about the threats posed by these apps. Only 34% of teenagers believe these platforms have a mostly negative impact on their peers, and just 9% feel the platforms are mostly harmful for themselves. That’s in part because these apps often make teens feel more connected to their friends and families — which is why they’re popular in the first place. Educating teenagers about the risks while emphasizing the ways in which these platforms can be used for good isn’t a bad use of resources, and we are already seeing cultural movements around less screen time take off.

The truth is, the best way to wrangle these companies is to force them to be more transparent and to make it harder for them to profit off of teenagers in ways that make their platforms more dangerous. However, the moment the federal government starts to enforce its view of what is or isn't harmful is the moment we start to lose the necessary freedoms of the internet.

Disagree? That's okay. My opinion is just one of many. Write in and let us know why, and we'll consider publishing your feedback.


Your questions, answered.

Q: We have heard a lot about the 'deep state'. Is there any evidence to point to this actually existing? With the limited mental capacity of Joe Biden, I can imagine that someone or some other group is truly in power and calling the shots.

— Sandra from Trophy Club, Texas

Tangle: It depends on what you mean by “the deep state.” If you’re asking if there is actually a shadow government operating behind the scenes, conspiring among the elites to pull all the puppet strings to its desire (and that maybe they are all also Satan-worshiping pedophiles), then no. And if you think the government is bloated and incompetent, but also somehow capable of executing and concealing massive conspiracies or organizing false flag events, then you should dump one of those viewpoints. If you’re stuck between which one, remember Hanlon’s Razor: Never attribute to malice that which is adequately explained by stupidity.

That doesn’t mean a “deep state” of some kind doesn’t exist. Without the sinister overtones, you could just call it “the bureaucracy.” The federal government comprises 15 executive departments and dozens of independent agencies, together employing roughly 2 million civil servants. Overseeing, managing, and leading that massive operation requires a lot of effort and diplomacy, and some government employees are inevitably going to resist the person in charge. Then there’s the judiciary and legislature, which were designed to check the power of the executive branch (and one another). Add in state and local governments, then heads of cable news networks with their own agendas and media personalities selling their own worldview, and maybe you get something close to what you are imagining.

Do I think these people in power (i.e. corporate elites) meet up and share ideas and sometimes work in concert to achieve certain goals? Of course. Do I think they all agree and have a unified vision for the future and are very good at accomplishing these goals? Definitely not.  

The government is huge, and there isn’t one group or one person secretly in charge. What’s more accurate is that there are dozens and dozens of power players inside and outside of government who are all vying for position and scoring little wins for their agendas in a giant system composed of thousands of judges, wealthy donors, state senators, corporate executives, mayors, media influencers, and heads of agencies, all seeing things their own way. Not being able to snap all those people to attention isn’t the result of a secret conspiracy; it’s just a facet of a large, complex democracy. And it’s a feature, not a bug.

That’s the real deep state.

Want to have a question answered in the newsletter? You can reply to this email (it goes straight to my inbox) or fill out this form.


Under the radar.

Joshua Schulte, a 35-year-old former CIA software engineer, was sentenced to 40 years in prison after being found guilty of espionage, computer hacking, contempt of court, making false statements to the FBI, and possessing child pornography. Schulte was convicted of being behind the classified "Vault 7" leak, which was disclosed by Wikileaks in 2017 and revealed how the CIA hacked smartphones in spying operations overseas. The leaks also showed the CIA's efforts to turn internet-connected TVs into recording devices. Schulte helped create the hacking tools while at the CIA. He was later convicted of downloading more than 10,000 files of child pornography onto his computer. USA Today has the story.


Numbers.

  • 95%. The percentage of U.S. teens (age 13-17) who use social media, according to a 2023 U.S. Surgeon General’s advisory. 
  • 93%. The percentage of U.S. teens who say they use YouTube, according to a 2023 survey by Pew Research.
  • 63%. The percentage of U.S. teens who say they use TikTok. 
  • 59%. The percentage of U.S. teens who say they use Instagram.
  • 33%. The percentage of U.S. teens who say they use Facebook.
  • 1 in 5. The approximate share of U.S. teens who say they use YouTube or TikTok “almost constantly.”
  • 79%. The percentage of U.S. parents with at least one child under 18 who would support legislation that would require parental approval for children under 16 to download apps, according to a 2023 survey by Morning Consult.

The extras.

  • One year ago today we had just published a Friday edition responding to reader criticism.
  • The most clicked link in Thursday’s newsletter was the ATM fish bandit.
  • Strike back: 817 readers responded to our survey asking what their preferred response would be to the attack that killed U.S. troops in Jordan, with 56% saying the U.S. should strike Iranian proxies. 10% said the U.S. should strike Iran directly, 15% said the U.S. should not make any strikes, and 8% said the U.S. should withdraw its troops from the region. 11% were unsure or had no opinion. “Strikes should be made to the proxies responsible for the attacks to US troops, a non-response would enable others to commit the same kind of strikes thinking there would be no consequences for doing so,” one respondent said.
  • Nothing to do with politics: Punxsutawney Phil has predicted an early spring.
  • Take the poll. Who would you say is the most responsible for mental health issues among minors using social media apps? Let us know!

Have a nice day. 

We recently shared a story in our Sunday newsletter about drug issues in the Kensington neighborhood of Philadelphia, which prompted a response from a Tangle reader. Scott Shackleton’s grandparents immigrated to the United States and settled in Kensington. And last year, Scott got back in touch with his roots to help, partnering with a Philadelphia organization called Simple Homes to purchase an abandoned home for $1, raise $30k to renovate it, then donate the home back to the community. “It is now owned by a woman raising her niece and nephew after their mother died of an overdose. She manages the food pantry for Simple Way and feeds over 90 families,” Scott said. Simple Homes has their story.


Don't forget...

📣 Share Tangle on Twitter here, Facebook here, or LinkedIn here.

🎥 Follow us on Instagram here or subscribe to our YouTube channel here

💵 If you like our newsletter, drop some love in our tip jar.

🎉 Want to reach 90,000+ people? Fill out this form to advertise with us.

📫 Forward this to a friend and tell them to subscribe (hint: it's here).

🛍 Love clothes, stickers and mugs? Go to our merch store!

Subscribe to Tangle

Join 100,000+ people getting Tangle directly to their inbox!

Isaac Saul
I'm a politics reporter who grew up in Bucks County, PA — one of the most politically divided counties in America. I'm trying to fix the way we consume political news.