Kentucky Sues Roblox Over Child Safety Concerns
- Kentucky’s attorney general filed a lawsuit against the gaming platform, claiming inadequate protections for young users amid growing scrutiny of the company’s safety measures.
Platform Under Fire for Safety Measures
Kentucky Attorney General Russell Coleman filed a lawsuit Monday in state court against Roblox, alleging the gaming platform has inadequate child safety protections. The legal action claims the platform, which reports 111 million daily active users, lacks effective age verification systems and content filters. At a Tuesday news conference, Coleman characterized the site as having become a “playground for predators.” The lawsuit is the latest in a series of legal challenges facing the company over child safety concerns.
The complaint alleges that children on the platform encounter inappropriate content, including violent or sexual situations. Parents have reported instances of strangers contacting their children through third-party chat applications that appear integrated with the game. Kentucky’s suit claims these security gaps violate the state’s Consumer Protection Act. The attorney general’s office is seeking penalties of up to $2,000 per violation and court-ordered compliance measures.
Growing Legal Challenges Across States
This lawsuit follows similar legal action filed in Louisiana in August 2025. A separate suit filed in Iowa alleges that a 13-year-old girl met an adult predator through the platform and was subsequently kidnapped and trafficked across state lines. The mounting legal pressure reflects broader concerns about online safety for minors on gaming platforms. Coleman indicated his office remains open to settlement negotiations with Roblox rather than seeking to shut down the platform entirely.
Courtney Norris, a Kentucky mother of three, spoke at the press conference about her experience with the platform. She initially viewed Roblox as a safe option for her children, describing it as a “fenced-in backyard” for kids’ gaming. Her perspective changed after she discovered what she called the “Wild West” nature of the platform’s environment. Norris said the platform’s design makes parental supervision “nearly impossible” despite its child-friendly appearance.
Company Defends Safety Protocols
Roblox disputed the allegations in a Tuesday statement, highlighting its existing safety infrastructure. The company stated it employs advanced AI models and maintains a moderation team of thousands working around the clock. According to Roblox, the platform implemented 100 new safety features in 2024 alone, including facial age estimation technology. The company emphasized that users under 13 cannot send direct messages outside of games unless parents modify default settings.
The platform’s current safety measures include text filters designed to block both inappropriate language and attempts to redirect young users to external sites. According to company policies, Roblox prohibits user-to-user image sharing and sexual conversations. Its filtering mechanisms also prevent the sharing of personal information such as phone numbers or addresses. Despite these measures, the company acknowledged that “no system is perfect” and said safety improvements remain ongoing.
Coleman emphasized that the goal isn’t to eliminate the platform but to ensure adequate protections for young users. Roblox expressed willingness to engage with Kentucky officials to demonstrate its safety efforts. The company maintains it shares the goal of protecting children online and welcomes discussions with the attorney general’s office. Both parties indicated potential openness to resolving the matter outside of prolonged litigation.
