The AG Putting Big Tech on Trial

Issa Bee is a 13-year-old girl from rural New Mexico. When her Facebook account was created, in August 2023, she listed a 2002 birthday to avoid the restrictions attached to a child’s account. But anybody could tell Issa was a young girl. She posted about the cafeteria and school bus, about loving Harry Styles, about sports tryouts and finishing seventh grade and losing her last baby tooth. Even Facebook’s algorithm seemed to recognize her age; most of the Reels it served her were from other teenage girls. 

Within a few days of signing up for Facebook, Issa amassed 5,000 friends and more than 6,700 followers. Nearly all were adult men, including many who followed her from abroad—Nigeria, Ghana, the Dominican Republic. These men reacted to Issa’s posts with comments like “absolutely gorgeous baby girl,” or emojis with heart eyes. In Issa’s Facebook Messenger inbox, men asked to connect with her on other platforms. Some asked her to meet in person. “My interest has been piqued,” one adult user wrote. “I’m looking for a sugar baby to spoil.”

Issa’s photo was shared on Facebook by a user who frequently shares photos of young girls in suggestive poses. By this point, Issa was receiving an average of three or four unsolicited photos of exposed penises in her Messenger inbox per week. Even after she reported those incidents to Facebook, the accounts that sent the images remained active (or were taken offline only briefly) and continued to post photos of genitalia, according to a legal complaint filed by the state of New Mexico. The Reels that Facebook served Issa included graphic sexual images of young girls, according to state officials, followed by advertisements for a law firm representing trafficking survivors. Given her account’s popularity, Facebook prompted Issa to set up a professional account to “let your fans support you by sending stars and gifts.” If her audience grew further, according to her “professional dashboard” on Facebook, Issa would be able to “unlock more ways to make money.” 

There’s one more thing to know about Issa Bee: she isn’t real. She’s an invention of the New Mexico Department of Justice, a so-called “sock puppet” account created in 2023 as part of an ongoing investigation into alleged child sexual abuse on Facebook and Instagram, which are both owned by Meta. As a result of that investigation, New Mexico Attorney General Raúl Torrez is suing Meta, alleging that it “has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey.”

Torrez, a 49-year-old career prosecutor with a youthful face beneath a salt-and-pepper beard, is an unlikely antagonist for Big Tech. His sparsely furnished office in New Mexico is far from the campuses of Silicon Valley or the lawmakers and regulators in Washington. Aside from a brief stint at a startup in his 20s, he has no connection to the tech industry. But he is waging a fierce crusade against the harms he says social-media companies are inflicting on minors. His office’s investigation into Meta, he says, reveals how tech giants neglected customer safety. “Warnings were disregarded, over and over and over again,” he tells me during a recent interview in his Albuquerque office. “It’s a series of decisions that demonstrate a pattern of conduct that favors profit over safety.”


New Mexico’s suit argues Meta “knowingly exposes children to the twin dangers of sexual exploitation and mental health harm,” violating the state’s Unfair Practices Act and creating a public nuisance. The suit alleges that Meta’s recommendation algorithm has created a “marketplace” to “connect pedophiles, predators, and others engaged in the commerce of sex,” allowing them “to hunt for, groom, sell, and buy sex with children and sexual images of children at an unprecedented scale.” The case is slated for trial in February, when it would be one of the first major cases against social-media companies to be heard by a jury. 

Meta disputes New Mexico’s claims. “We want teens to have safe, age-appropriate experiences online, and have built a range of tools to support them and their parents,” a company spokesperson said in a statement to TIME. “Last year, we launched Teen Accounts, which fundamentally reimagined the teen experience on Instagram, placing them in automatic protections to limit who could contact them, the content they see, and how much time they spend on the app. We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective quotes and cherry-picked documents.” 

Attorney General of New Mexico Raúl Torrez speaks during a rally to hold tech and social media companies accountable for taking steps to protect kids and teens online in Washington, DC. on Jan. 31, 2024. Jemal Countes—Accountable Tech/Getty Images

The case against Meta isn’t Torrez’s only challenge to Big Tech. Last year he sued Snap, the parent company of Snapchat, alleging the platform is “a breeding ground for predators to collect sexually explicit images of children.” According to New Mexico’s complaint in that matter, “Snap’s design–especially its focus on ephemeral content–is uniquely suited to facilitate illegal and illicit conduct and conversations. Snap’s algorithm serves up children to adult predators, and Snap Map lets them find them in the real world. Snap knows all of this.” 

A judge denied Snap’s motion to dismiss the case in April, and the case is currently in the discovery phase. “We are committed to creating a safe and enjoyable environment for our community, and we have incorporated privacy and safety features into our platform from the very beginning,” a Snap spokesperson said in a statement to TIME. “Unfortunately, the reality is this—there is no single safety feature or policy that can eliminate every potential threat online or in the world around us. This is why we continually adapt our strategies to fight against bad actors, and we remain committed to collaborating with law enforcement. However, rather than collaborating with Snap alongside law enforcement in New Mexico, the New Mexico Attorney General chose to pursue litigation based on clear misrepresentations.”  

It’s notoriously difficult to sue social-media companies. Section 230 of the Communications Decency Act grants platforms broad immunity from liability for content hosted on their sites. Yet in recent years, litigators and prosecutors have been suing tech companies over the design of their products, not the content they host, in an approach known as “product liability theory.” The idea is that even if these companies can’t be sued over third-party content, they are liable for alleged negligence in their product design, as well as for alleged misrepresentations to the public.  

Torrez did not invent the idea of using product liability law against social-media companies, but he has become one of its most aggressive champions. And while AGs around the country have joined together to sue companies like Meta and Snap, Torrez stands out for waging big solo battles against Big Tech. He has “taken the lead” in “recognizing the clear and present danger of sex abuse of minors that is posed by social-media platforms,” says Matthew Bergman, founder of the Social Media Victims Law Center, who helped pioneer the legal strategy of suing platforms over product liability. “These platforms are pimping children,” Bergman continues. “And he’s treating them no differently than you would a run-of-the-mill child abuser.”

On a recent morning in late September, Torrez is sitting in his Albuquerque office, squeezing a purple-and-pink stress ball. Looming over a table is a massive Barack Obama Hope poster, which Torrez nabbed from the campaign’s Albuquerque offices in 2008. As we chat, he offers a thought experiment to explain his approach to his job: what if what was happening on social media was happening in the physical world? How would the top law enforcement official in the state respond?

“If we knew that in that storefront”—he gestures to the building out the window—“they were selling products that connected prepubescent girls with adult men, very quickly law enforcement would be inside,” he tells me. “And yet, somehow, because it exists in the virtual world and on the Internet, we have created a whole different legal architecture. We treat the conduct differently. We distance the corporate executives from it in a way that we never would for a shopkeeper.”

Law enforcement is the Torrez family business; his father was a federal prosecutor in New Mexico for decades. After graduating from Harvard and earning a law degree from Stanford, Torrez returned to Albuquerque to be an assistant district attorney. His first major prosecution, in 2006, involved a “shaken baby” case. When Torrez told his father about the assignment, his dad’s first question was whether he had met the boy who had allegedly been injured. There was no legal reason for a prosecutor to do so—the toddler was too young to testify, and disabled from his injuries. But his father insisted, Torrez recalls, so that he could “understand the human cost” of the crime. When he met him, the victim “looked like a little boy that had been in my family,” he says. “It was a real turning point for me personally.” He went on to win a conviction of the boy’s father for child abuse. 

Torrez started volunteering to take on more child-abuse cases. After becoming the state’s assistant Attorney General, he served as an Internet Crimes Against Children prosecutor, spending three years working on sexual abuse and child-pornography cases. As part of those investigations, his team combed through thousands of terabytes of abhorrent pictures. Torrez had to review the images and chats personally. “I have seen things that never really go away,” he says. “It opened my eyes to a scale of potential harm for kids that I think most people are largely oblivious to.” 

California Rep. Lou Correa (D), activist Raul Aguirre, and Raúl Torrez speak during a Latino Men for Harris cafecito at REA Media Group in Tucson, Ariz., on Oct. 12, 2024. Rebecca Noble—Bloomberg/Getty Images

When Torrez, a Democrat, was elected New Mexico Attorney General in 2022, the technology had changed but the abuse hadn’t. “The dangers that were distributed in these dark corners of the Internet were now available with the click of a button, on devices that literally live in the pockets of almost everyone you know,” he says. “So when I came to this role and I realized that these companies were facilitating some of that behavior? It enraged me.” 

In 2023, his office began investigating Meta. Over the course of the probe, his team uncovered what it describes as rampant child sexual-abuse material on Meta’s platforms. Predators sometimes advertise child sexual-abuse content under the code “cheese pizza,” which has the same initials as child pornography, prosecutors say. One Instagram account documented in the complaint features a photo of a young girl with a cheese pizza, with posts advertising “small girls, small boys… 3 years to 12 years.” Similar accounts offered “graphic images of children along with adult genitalia and of sexual intercourse involving children,” according to the complaint. Of all the child sexual-abuse content Torrez’s office reported to Meta over the course of the ongoing investigation, roughly half was still available online days before the state filed its complaint in February 2024.


Facebook employees were aware of the scope of the issue as early as 2018, according to Torrez’s suit. New Mexico’s complaint alleges that Facebook employees presented executives with analysis suggesting predators were being introduced to children through People You May Know, an algorithmic feature that allowed adults to search for a particular type of child—like gymnasts, or kids in their area—in what the prosecutors describe as a “virtual victim identification service.” According to the complaint, executives resisted the Community Integrity Team’s recommendation to adjust the feature’s design to avoid recommending minors to unconnected adults. The complaint also alleges that Guy Rosen, then a Vice President of Integrity at Facebook and now the Chief Information Security Officer at Meta, wrote in a 2018 email that the company does not scan messages for violations like grooming or solicitation and warned that without action, Facebook and Instagram “are basically massive ‘victim discovery services.’”

In July 2020, according to the complaint, Meta employees circulated a document entitled “Child Safety– State of Play,” which contained a list of “immediate product vulnerabilities,” including “unconnected adults being able to find and message minors” and “implicit sexualization of minors” and “sex trafficking and sexual solicitation networks on Instagram.” According to the complaint, these employees wrote that the company’s efforts to “block reach of unconnected adults to IG minors” had “not been prioritized and more resources are needed.”

New Mexico’s complaint also cites an internal chat between Meta employees from that same month in which one unnamed employee asked another: “what specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?” 

“Somewhere between zero and negligible,” the other employee replied, according to the complaint. “Child safety is an explicit non-goal this half.” By 2021, according to the complaint, an internal company presentation suggested that more than 100,000 children per day received online sexual harassment such as photos of genitalia. 

Meta did not offer specific rebuttals to every claim in the suit when TIME requested comment, and declined to provide comments from Instagram head Adam Mosseri or from Rosen. A Meta spokesperson said the company reports apparent child sexual-abuse material to the National Center for Missing & Exploited Children (NCMEC) tip line when it becomes aware of such content, and is a founding member of Lantern, a new program that enables tech companies to share signals about predatory accounts and behavior. In 2023, the company also supported NCMEC’s creation of Take It Down, a new platform designed to help teens stop the spread of intimate images online.


The Meta spokesperson said the company has a robust policy against child nudity, and shows “safety notices” when teens are chatting with accounts that have shown potentially suspicious behavior. In June 2025 alone, the company says, teens blocked 1 million accounts and reported another 1 million after seeing a safety notice. The spokesperson said Meta now restricts adults from starting chats with teens who aren’t already connected to them, and uses technology to identify adult accounts with potentially suspicious behavior and prevent them from finding accounts that primarily feature children.

In 2024, Meta unveiled Teen Accounts, which the company says prevents teens from messaging with adult strangers, among many other protections for young users. Two new reports have suggested these protections are flawed and that adult strangers are still able to reach kids on the platform. Meta disputes the reports’ methodologies.

Several floors below Torrez’s office, New Mexico DOJ investigators sit in a room they call “the dungeon.” It’s a windowless space full of computer screens and buzzing servers, lined with photos of child molesters they’ve arrested and convicted. On the floor under one of the desks lies Nyx, a 2-year-old yellow lab who is trained to detect electronic-storage devices when the team goes to make an arrest. If investigators think suspects are hiding data on a USB, a micro-card, or on a hidden hard drive, Nyx can sniff it out. Last year, these investigators arrested three child predators in “Operation Metaphile,” after a monthslong undercover investigation of sexual solicitation on Meta’s platforms. All three were later convicted.

While Torrez’s investigators comb through mountains of online pornography to target individual predators, the AG is focused on broadening his office’s efforts to target the platforms he believes enable their crimes. “All right, guys, what do you got?” he says as he plops down at a large table, around which he’s gathered policy analysts from his office for a series of meetings about alleged illicit activity on social media—including firearm and narcotic sales—for the next phase of his work.

Torrez asks them for data specific to New Mexico. He wants to get experts on the phone from state universities. “I want to know where the crime guns are coming from, with as much clarity as possible: this percent is stolen, this percent is a straw purchase, and this percent is bought online,” he tells a special agent on his team. “I want to know exactly how many guns that result in criminal offenses on the street are bought on these platforms.” 

Torrez’s office can only enforce laws, not make them. Which is why he’s also in close contact with state legislators about how to strengthen New Mexico statutes against some of these crimes. “I want to give the legislators a menu of options,” he says. “And then be able to give them a sense of the land mines they’re going to run into.” 

The harms kids experience on social media, he says, are part of a long history of corporate negligence that can only be fixed by persistent litigation. He compares this work to fighting for workers’ rights, or suing polluters, or the tobacco litigation of the 1990s. “The more successful they are, the more power they get,” he says. “And then at a certain point, someone has to step in and say: No more.” 

It’s all hands on deck, Torrez says, because technology is moving faster than law enforcement or regulators. As prosecutors focus on social-media harms from the last decade, tech companies are moving forward with new advancements in artificial intelligence. “The technology’s outpacing the law,” he says. He’s just trying to keep up. 
