

Lawsuit Accuses a16z of Turning Roblox Into a School Shooter's Playground


The family of a dead teen girl said she'd still be alive if Roblox did a better job moderating its platform.

The mother of a teenager who died by suicide is suing Roblox, accusing the company of worrying more about its investors than the children in its audience. The complaint, filed this month, claims Kleiner Perkins and Andreessen Horowitz, who’ve collectively invested hundreds of millions of dollars into the gaming company, fostered a platform that monetizes children at the cost of their safety.
Attorneys for Jaimee Seitz filed the lawsuit in the Eastern District of Kentucky. Seitz is the mother of Audree Heine, a teen girl who died by suicide just after her 13th birthday in 2024. When detectives investigated Heine’s death, they found she had a vast online social life that centered around groups on Discord and Roblox that idolized school shooters like Dylan Klebold. Since Heine’s death, Seitz has been outspoken about the unique dangers of Roblox.

Heine’s family claims she would never have died had Roblox done a better job of moderating its platform. “Audree was pushed to suicide by an online community dedicated to glorifying violence and emulating notorious mass shooters, a community that can thrive and prey upon young children like Audree only because of Defendants’ egregiously tortious conduct,” the complaint said.

Seitz’s lawyers filed the 89-page lawsuit on October 20, and in it attempted to make the case that Roblox’s problems all stem from a single cause: corporate greed. “The reason that Roblox is overrun with harmful content and predators is simple: Roblox prioritizes user growth, revenue, and eventual profits over child safety,” it said. “For years, Roblox has knowingly prioritized these numbers over the safety of children through the actions it has taken and decisions it has made to increase and monetize users regardless of the consequences.”

According to the lawsuit, Roblox’s earning potential attracted big investors which encouraged it to abandon safety for quick cash. “Roblox’s business model allowed the company to attract significant venture capital funding from big-name investors like Kleiner Perkins and Andreessen Horowitz, putting enormous pressure on the company to prioritize growing and monetizing its users.”

Andreessen Horowitz, known as a16z, is a venture capital firm whose previous investments include Civitai, a company that made money from nonconsensual AI porn; an “uncensored” AI project that offered users advice on how to commit suicide; and a startup that’s selling access to thousands of “synthetic influencers” for use in manipulating public opinion.

In 2020, a16z led a round of funding that raised $150 million for Roblox. “Roblox is one of those rare platform companies with massive traction and an organic, high-growth business model that will advance the company, and push the industry forward for many years to come,” David George, a general partner at the investment firm, said in a press release at the time.

The lawsuit claims Roblox knows that kids are easy marks for low effort monetization efforts common in online video games. “Recognizing that children have more free time, underdeveloped cognitive functioning, and diminished impulse control, Roblox has exploited their vulnerability to lure them to its app,” it said.

The lawsuit notes that for years Roblox did not require age verification, did not restrict communication between children and adults, and did not require an adult to set up an account for a child. Roblox rolled out age verification and age-based communication systems in July, a feature that uses AI to scan the faces of its users to check their age.

These kinds of basic safety features, however, have taken years to implement. According to the lawsuit, there’s a reason Roblox has been slow on safety. “In pursuit of growth, Roblox deprioritized safety measures even further so that it could report strong numbers to Wall Street,” it said. “For instance, Roblox executives rejected employee proposals for parental approval requirements that would protect children on the platform. Employees also reported feeling explicit pressure to avoid any changes that could reduce platform engagement, even when those changes would protect children from harmful interactions on the platform.”

Roblox is now the subject of multiple investigative reports that have exposed the safety problems on its platform. It’s also the subject of multiple lawsuits; Seitz’s is the 12th such case filed by Anapol Weiss, the law firm representing her.

According to Seitz’s interviews with the press and the lawsuit, her daughter got caught up in a subculture on Roblox and Discord called The True Crime Community (TCC). “Through Roblox, Audree was exposed to emotional manipulation and social pressure by other users, including TCC members, who claimed to revere the Columbine shooters, depicted them as misunderstood outcasts who took revenge on their bullies, and encouraged violence against oneself and others,” the lawsuit said.

404 Media searched through Roblox’s game servers after the lawsuit was filed and found multiple instances of games named for the Columbine massacre. One server used pictures from Parkland, Florida and another was advertised using the CCTV picture of Dylan Klebold and Eric Harris from the Columbine shooting.


a16z-Backed Startup Sells Thousands of ‘Synthetic Influencers’ to Manipulate Social Media as a Service


A new startup backed by one of the biggest venture capital firms in Silicon Valley, Andreessen Horowitz (a16z), is building a service that allows clients to “orchestrate actions on thousands of social accounts through both bulk content creation and deployment.” Essentially, the startup, called Doublespeed, is pitching an astroturfing AI-powered bot service, which is in clear violation of policies for all major social media platforms.

“Our deployment layer mimics natural user interaction on physical devices to get our content to appear human to the algorithims [sic],” the company’s site says. Doublespeed did not respond to a request for comment, so we don’t know exactly how its service works, but the company appears to be pitching a service designed to circumvent many of the methods social media platforms use to detect inauthentic behavior. It uses AI to generate social media accounts and posts, with a human doing 5 percent of “touch up” work at the end of the process.

On a podcast earlier this month, Doublespeed cofounder Zuhair Lakhani said that the company uses a “phone farm” to run AI-generated accounts on TikTok. So-called “click farms” often use hundreds of mobile phones to fake online engagement or reviews for similar reasons. Lakhani said one Doublespeed client generated 4.7 million views in less than four weeks with just 15 of its AI-generated accounts.

“Our system analyzes what works to make the content smarter over time. The best performing content becomes the training data for what comes next,” Doublespeed’s site says. Doublespeed also says its service can create slightly different variations of the same video, saying “1 video, 100 ways.”

“Winners get cloned, not repeated. Take proven content and spawn variation. Different hooks, formats, lengths. Each unique enough to avoid suppression,” the site says.
One of Doublespeed's AI influencers
Doublespeed allows clients to use its dashboard for between $1,500 and $7,500 a month, with more expensive plans allowing them to generate more posts. At the $7,500 price, users can generate 3,000 posts a month.

The dashboard I was able to access for free shows users can generate videos and “carousels,” slideshows of images that are commonly posted to Instagram and TikTok. The “Carousel” tab appears to show sample posts for different themes. One, called “Girl Selfcare,” shows images of women traveling and eating at restaurants. Another, called “Christian Truths/Advice,” shows images of women who don’t show their faces alongside text that says things like “before you vent to your friend, have you spoken to the Holy Spirit? AHHHHHHHHH”

On the company’s official Discord, one Doublespeed staff member explained that the accounts the company deploys are “warmed up” on both iOS and Android, meaning the accounts have been at least slightly used, in order to make it seem like they are not bots or brand new accounts. Doublespeed cofounder Zuhair Lakhani also said on the Discord that users can target their posts to specific cities and that the service currently only targets TikTok but that it has internal demos for Instagram and Reddit. Lakhani said Doublespeed doesn’t support “political efforts.”

A Reddit spokesperson told me that Doublespeed’s service would violate its terms of service. TikTok, Meta, and X did not respond to a request for comment.

Lakhani said Doublespeed has raised $1 million from a16z as part of its “Speedrun” accelerator, “a fast-paced, 12-week startup program that guides founders through every critical stage of their growth.”

Marc Andreessen, after whom half of Andreessen Horowitz is named, also sits on Meta’s board of directors. Meta did not immediately respond to our question about one of its board members backing a company that blatantly aims to violate its policy on “authentic identity representation.”

What Doublespeed is offering is not that different from some of the AI generation tools Jason has covered that produce much of the AI slop already flooding social media. It’s also similar to, but a more blatant version of, an app I covered last year that aimed to use social media manipulation to “shape reality.” The difference here is that it has backing from one of the biggest VC firms in the world.

