Snapchat is sued over its alleged use by child sex predators

She was 12 when he started demanding nude photos, saying she was pretty, that he was her friend. She believed, because they had connected on Snapchat, that her photos and videos would disappear.

Now, at 16, she is leading a class-action lawsuit against an app that has become a mainstay of American teen life — claiming its designers have done almost nothing to prevent the sexual exploitation of girls like her.

Her case against Snapchat reveals a haunting story of shame and abuse inside a video-messaging app that has for years flown under lawmakers’ radar, even as it has surpassed 300 million active users and built a reputation as a safe space for young people to trade their most intimate images and thoughts.

But it also raises difficult questions about privacy and safety, and it throws a harsh spotlight on the tech industry’s biggest companies, arguing that the systems they depend on to root out sexually abusive images of children are fatally flawed.

“There isn’t a kid in the world who doesn’t have this app,” the girl’s mother told The Washington Post, “and yet an adult can be in correspondence with them, manipulating them, over the course of many years, and the company does nothing. How does that happen?”

In the lawsuit, filed Monday in a California federal court, the girl — requesting anonymity as a victim of sexual abuse and referred to only as L.W. — and her mother accuse Snapchat of negligently failing to design a platform that could protect its users from “egregious harm.”

The man — an active-duty Marine who was convicted last year of charges related to child pornography and sexual abuse in a military court — saved her Snapchat photos and videos and shared them with others around the Web, a criminal investigation found.

Snapchat’s parent company, Snap, has defended its app’s core features of self-deleting messages and instant video chats as helping young people speak openly about important parts of their lives.

In a statement to The Post, the company said it employs “the latest technologies” and develops its own software “to help us find and remove content that exploits or abuses minors.”

“While we cannot comment on active litigation, this is tragic, and we are glad the perpetrator has been caught and convicted,” Snap spokeswoman Rachel Racusen said. “Nothing is more important to us than the safety of our community.”

Founded in 2011, the Santa Monica, Calif., company told investors last month that it now has 100 million daily active users in North America, more than double Twitter’s following in the United States, and that it is used by 90 percent of U.S. residents aged 13 to 24 — a group it designated the “Snapchat Generation.”

For every user in North America, the company said, it received about $31 in advertising revenue last year. Now worth nearly $50 billion, the public company has expanded its offerings to include augmented-reality camera glasses and auto-flying selfie drones.

But the lawsuit likens Snapchat to a defective product, saying it has focused more on innovations to capture children’s attention than on effective tools to keep them safe.

The app relies on “an inherently reactive approach that waits until a child is harmed and places the burden on the child to voluntarily report their own abuse,” the girl’s lawyers wrote. “These tools and policies are more effective in making these companies wealthier than [in] protecting the children and teens who use them.”

Apple and Google are also listed as defendants in the case because of their role in hosting an app, Chitter, that the man had used to distribute the girl’s images. Both companies said they removed the app Wednesday from their stores following questions from The Post.

Apple spokesman Fred Sainz said in a statement that the app had repeatedly broken Apple’s rules around “proper moderation of all user-generated content.” Google spokesman José Castañeda said the company is “deeply committed to fighting online child sexual exploitation” and has invested in techniques to find and remove abusive content. Chitter’s developers did not respond to requests for comment.

The suit seeks at least $5 million in damages and assurances that Snap will invest more in protection. But it could send ripple effects through not just Silicon Valley but Washington by calling out how federal lawmakers’ failure to pass tech regulation has left the industry to police itself.

“We cannot expect the same companies that benefit from children being harmed to go and protect them,” Juyoun Han, the girl’s attorney, said in a statement. “That’s what the law is for.”

Brian Levine, a professor at the University of Massachusetts at Amherst who studies children’s online safety and digital forensics and is not involved in the litigation, said the legal challenge adds to the evidence that the country’s lack of tech regulation has left young people at risk.

“How is it that all of the carmakers and all of the other industries have regulations for child safety, and one of the most important industries in America has next to nothing?” Levine said.

“Exploitation results in lifelong victimization for these kids,” and it’s being fostered on online platforms developed by “what are essentially the biggest toymakers in the world, Apple and Google,” he added. “They’re making money off these apps and operating like absentee landlords. … After some point, don’t they bear some responsibility?”

While most social networks focus on a central feed, Snapchat revolves around a user’s inbox of private “snaps” — the photos and videos they exchange with friends, each of which self-destructs after being viewed.

The simple concept of vanishing messages has been celebrated as a kind of anti-Facebook, creating a low-stakes refuge where anyone can express themselves as freely as they want without worrying how others might react.

Snapchat, in its early years, was often derided as a “sexting app,” and for some users the label still fits. But its popularity has also solidified it as a more broadly accepted part of digital adolescence — a place for joking, flirting, organizing and working through the joys and awkwardness of teenage life.

In the first three months of this year, Snapchat was the seventh-most-downloaded app in the world, installed twice as often as Amazon, Netflix, Twitter or YouTube, estimates from the analytics firm Sensor Tower show. Jennifer Stout, Snap’s vice president of global public policy, told a Senate panel last year that Snapchat was an “antidote” to mainstream social media and its “endless feed of unvetted content.”

Snapchat photos, videos and messages are designed to automatically vanish once the recipient sees them or after 24 hours. But Snapchat’s carefree culture has raised fears that it has made it too easy for young people to share images they may one day regret.

Snapchat allows recipients to save some photos or videos within the app, and it notifies the sender if a recipient tries to capture a photo or video marked for self-deletion. But third-party workarounds are rampant, allowing recipients to capture them undetected.

Parent groups also worry the app is drawing in adults looking to prey on a younger audience. Snap has said it accounts for “the unique sensitivities and considerations of minors” when developing the app, which now bans users younger than 18 from posting publicly in places such as Snap Maps and limits how often children and teens are served up as “Quick Add” friend suggestions in other users’ accounts. The app encourages people to talk with friends they know from real life and only allows someone to communicate with a recipient who has marked them as a friend.

The company said it takes fears of child exploitation seriously. In the second half of 2021, Snap deleted roughly 5 million pieces of content and nearly 2 million accounts for breaking its rules around sexually explicit content, according to a transparency report released last month. About 200,000 of those accounts were axed after sharing photos or videos of child sexual abuse.

But Snap representatives have argued they’re limited in their abilities when a user meets someone elsewhere and brings that connection to Snapchat. They’ve also cautioned against more aggressively scanning personal messages, saying it could devastate users’ sense of privacy and trust.

Some of its safeguards, however, are fairly minimal. Snap says users must be 13 or older, but the app, like many other platforms, doesn’t use an age-verification system, so any child who knows how to type a fake birthday can create an account. Snap said it works to identify and delete the accounts of users younger than 13 — and the Children’s Online Privacy Protection Act, or COPPA, bans companies from tracking or targeting users under that age.

Snap says its servers delete most photos, videos and messages once both sides have viewed them, and all unopened snaps after 30 days. Snap said it preserves some account information, including reported content, and shares it with law enforcement when legally requested. But it also tells police that much of its content is “permanently deleted and unavailable,” limiting what it can turn over as part of a search warrant or investigation.

In 2014, the company agreed to settle charges from the Federal Trade Commission alleging Snapchat had deceived users about the “disappearing nature” of their photos and videos, and collected geolocation and contact data from their phones without their knowledge or consent.

Snapchat, the FTC said, had also failed to implement basic safeguards, such as verifying people’s phone numbers. Some users had ended up sending “personal snaps to complete strangers” who had registered with phone numbers that weren’t actually theirs.

A Snapchat representative said at the time that “while we were focused on building, some things didn’t get the attention they could have.” The FTC required the company to submit to monitoring by an “independent privacy professional” until 2034.

Like many major tech companies, Snapchat uses automated systems to patrol for sexually exploitative content: PhotoDNA, built in 2009, to scan still images, and CSAI Match, developed by YouTube engineers in 2014, to analyze videos.

The systems work by looking for matches against a database of previously reported sexual-abuse material run by the government-funded National Center for Missing and Exploited Children (NCMEC).

But neither system is built to identify abuse in newly captured photos or videos, even though newly created images are now the primary way Snapchat and other messaging apps are used.
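In broad terms, both systems work like a lookup: each image or video frame is reduced to a digital fingerprint, which is then compared against a list of fingerprints of previously reported material. The sketch below is a simplified illustration of that idea only. It is not PhotoDNA or CSAI Match code; it substitutes an ordinary cryptographic hash for the robust perceptual hashes those systems actually use, and every name in it is hypothetical.

```python
# Simplified illustration of hash-list matching, the general approach behind
# systems like PhotoDNA and CSAI Match. This is NOT their real code: a plain
# SHA-256 hash stands in for a perceptual hash, and every name is hypothetical.
import hashlib

# Hypothetical stand-in for the database of fingerprints of previously
# reported material maintained through NCMEC.
KNOWN_ABUSE_FINGERPRINTS: set[str] = {
    # placeholder fingerprint values would go here
}

def fingerprint(media_bytes: bytes) -> str:
    """Reduce an image or video frame to a fingerprint (real systems use perceptual hashes)."""
    return hashlib.sha256(media_bytes).hexdigest()

def matches_known_material(media_bytes: bytes) -> bool:
    """Flag content only if its fingerprint matches previously reported material."""
    return fingerprint(media_bytes) in KNOWN_ABUSE_FINGERPRINTS
```

Because the check is a lookup against known fingerprints, an image that no one has reported before matches nothing, which is why these systems cannot catch newly created material.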

When the girl began sending and receiving explicit content in 2018, Snap didn’t scan videos at all. The company started using CSAI Match only in 2020.

In 2019, a team of researchers at Google, the NCMEC and the anti-abuse nonprofit Thorn argued that even systems like those had reached a “breaking point.” The “exponential growth and the frequency of unique images,” they argued, required a “reimagining” of child-sexual-abuse-imagery defenses away from the blacklist-based systems tech companies had relied on for years.

They urged the companies to use recent advances in facial-detection, image-classification and age-prediction software to automatically flag scenes where a child appears at risk of abuse and alert human investigators for further review.
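The researchers did not publish code, but the triage they describe amounts to combining several model scores and escalating risky content to people rather than relying on a hash list. The fragment below is a hypothetical sketch of that decision step only; the score names and thresholds are assumptions, and the detection models themselves are left out.

```python
# Hypothetical sketch of the classifier-based triage the researchers proposed.
# The models that would produce these scores (age prediction, image
# classification) are not shown; all names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class MediaScores:
    estimated_age: float   # youngest apparent age in the scene, per an age-prediction model
    explicitness: float    # probability the scene is sexually explicit, 0.0 to 1.0

def flag_for_human_review(scores: MediaScores,
                          age_cutoff: float = 18.0,
                          explicit_cutoff: float = 0.8) -> bool:
    """Escalate to human investigators when a likely minor appears in likely explicit content."""
    return scores.estimated_age < age_cutoff and scores.explicitness >= explicit_cutoff
```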

“Absent new protections, society will be unable to adequately protect victims of child sexual abuse,” the researchers wrote.

Three years later, such systems remain unused. Some similar efforts have also been halted amid criticism that they could improperly pry into people’s private conversations or raise the risk of a false match.

In September, Apple indefinitely postponed a proposed system to detect possible sexual-abuse images stored online, following a firestorm of criticism that the technology could be misused for surveillance or censorship.

But the company has since released a separate child-safety feature designed to blur out nude photos sent or received in its Messages app. The feature shows underage users a warning that the image is sensitive and lets them choose to view it, block the sender or message a parent or guardian for help.

Privacy advocates have cautioned that more-rigorous online policing could end up penalizing kids for being kids. They’ve also worried that such concerns could further fuel a moral panic, in which some conservative activists have called for the firings of LGBTQ teachers who discuss gender or sexual orientation with their students, falsely equating it to child abuse.

But the case adds to a growing wave of lawsuits challenging tech companies to take more responsibility for their users’ safety — and arguing that past precedents should no longer apply.

The companies have traditionally argued in court that one law, Section 230 of the Communications Decency Act, should shield them from legal liability related to the content their users post. But lawyers have increasingly argued that the protection should not insulate companies from liability for design choices that promote harmful use.

In one case filed in 2019, the parents of two boys killed when their car smashed into a tree at 113 mph while recording a Snapchat video sued the company, saying its “negligent design” decision to allow users to imprint real-time speedometers on their videos had encouraged reckless driving.

A California judge dismissed the suit, citing Section 230, but a federal appeals court revived the case last year, saying it centered on the “predictable consequences of designing Snapchat in such a way that it allegedly encouraged dangerous behavior.” Snap has since removed the “Speed Filter.” The case is ongoing.

In a separate lawsuit, the mother of an 11-year-old Connecticut girl sued Snap and Instagram parent company Meta this year, alleging she had been routinely pressured by men on the apps to send sexually explicit photos of herself — some of which were later shared around her school. The girl killed herself last summer, the mother said, due in part to her depression and shame from the episode.

Congress has voiced some interest in passing more-robust regulation, with a bipartisan group of senators writing a letter to Snap and dozens of other tech companies in 2019 asking what proactive steps they had taken to detect and stop online abuse.

But the few proposed tech bills have faced immense criticism, with no guarantee of becoming law. The most notable — the Earn It Act, which was introduced in 2020 and passed a Senate committee vote in February — would open tech companies to more lawsuits over child-sexual-abuse imagery, but technology and civil rights advocates have criticized it as potentially weakening online privacy for everyone.

Some tech experts note that predators can contact children on any communications medium and that there is no simple way to make every app completely safe. Snap’s defenders say applying some traditional safeguards — such as the nudity filters used to screen out pornography around the Web — to personal messages between consenting friends would raise its own privacy concerns.

But some still question why Snap and other tech companies have struggled to design new tools for detecting abuse.

Hany Farid, an image-forensics expert at the University of California at Berkeley who helped develop PhotoDNA, said safety and privacy have for years taken a “back seat to engagement and profits.”

The fact that PhotoDNA, now more than a decade old, remains the industry standard “tells you something about the investment in these technologies,” he said. “The companies are so lethargic in terms of enforcement and thinking about these risks … at the same time, they’re marketing their products to younger and younger kids.”

Farid, who has worked as a paid adviser to Snap on online safety, said that he believes the company could do more but that the problem of child exploitation is industry-wide.

“We don’t treat the harms from technology the same way we treat the harms of romaine lettuce,” he said. “One person dies, and we pull every single head of romaine lettuce out of every store,” yet the children’s exploitation problem is decades old. “Why do we not have spectacular technologies to protect kids online?”

‘I thought this would be a secret’

The girl said the man messaged her randomly one day on Instagram in 2018, just before her 13th birthday. He fawned over her, she said, at a time when she was feeling self-conscious. Then he asked for her Snapchat account.

“Every girl has insecurities,” said the girl, who lives in California. “With me, he fed on those insecurities to boost me up, which built a connection between us. Then he used that connection to pull strings.” The Post does not identify victims of sexual abuse without their permission.

He started asking for photos of her in her underwear, then pressured her to send videos of herself nude, then more explicit videos to match the ones he sent of himself. When she refused, he berated her until she complied, the lawsuit states. He always demanded more.

She blocked him several times, but he messaged her through Instagram or via fake Snapchat accounts until she started talking to him again, the lawyers wrote. Hundreds of photos and videos were exchanged over a three-year span.

She felt ashamed, but she was afraid to tell her parents, the girl told The Post. She also worried what he might do if she stopped. She thought reporting him through Snapchat would do nothing, or that it could lead to her name getting out, the photos following her for the rest of her life.

“I thought this would be a secret,” she said. “That I would just keep this to myself forever.” (Snap officials said users can anonymously report concerning messages or behaviors, and that the company’s “trust and safety” teams respond to most reports within two hours.)

Last spring, she told The Post, she saw some boys at school laughing at nude photos of young girls and realized it could have been her. She built up her confidence over the next week. Then she sat with her mother in her bedroom and told her what had happened.

Her mother told The Post that she had tried to follow the girl’s public social media accounts and saw no red flags. She had known her daughter used Snapchat, like all of her friends, but the app is designed to give no indication of who someone is talking to or what they’ve sent. In the app, when she looked at her daughter’s profile, all she could see was her cartoon avatar.

The lawyers cite Snapchat’s privacy policy to show that the app collects troves of data about its users, including their location and who they communicate with — enough, they argue, that Snap should be able to prevent more users from being “exposed to unsafe and unprotected situations.”

Stout, the Snap executive, told the Senate Commerce, Science and Transportation Committee’s consumer protection panel in October that the company was building tools to “give parents more oversight without sacrificing privacy,” including letting them see their children’s friends list and who they’re talking to. A company spokesman told The Post those features are slated for release this summer.

Thinking back to those years, the mother said she’s devastated. The Snapchat app, she believes, should have known everything, including that her daughter was a young girl. Why did it not flag that her account was sending and receiving so many explicit photos and videos? Why was no one alerted that an older man was constantly messaging her using overtly sexual phrases, telling her things like “lick it up”?

After the family called the police, the man was charged with sexual abuse of a child involving indecent exposure as well as the production, distribution and possession of child pornography.

At the time, the man had been a U.S. Marine Corps lance corporal stationed at a military base, according to court-martial records obtained by The Post.

As part of the Marine Corps’ criminal investigation, the man was found to have coerced other underage girls into sending sexually explicit videos that he then traded with other accounts on Chitter. The lawsuit cites a number of Apple App Store reviews from users saying the app was rife with “creeps” and “pedophiles” sharing sexual photos of children.

The man told investigators he used Snapchat because he knew the “chats will go away.” In October, he was dishonorably discharged and sentenced to seven years in prison, the court-martial records show.

The girl said she has suffered from guilt, anxiety and depression after years of quietly enduring the exploitation and has attempted suicide. The pain “is killing me faster than life is killing me,” she said in the suit.

Her mother said that the last year has been devastating, and that she worries about teens like her daughter — the funny girl with the messy room, who loves to dance, who wants to study psychology so she can understand how people think.

“The criminal gets punished, but the platform doesn’t. It doesn’t make sense,” the mother said. “They’re making billions of dollars on the backs of their victims, and the burden is all on us.”
