Roblox CEO Dave Baszucki appeared on The New York Times’ Hard Fork podcast to discuss the platform’s new age verification feature, but he grew visibly frustrated by repeated questions about child safety.
Baszucki explained that the feature requires users to submit a face scan before they can access Roblox messaging, part of the company’s broader push for safer online interactions and stronger child protection in gaming.
When asked about a report claiming Roblox prioritized growth over safety, Baszucki responded with a hint of irritation: “Fun. Let’s keep going down this.”
Co-host Kevin Roose noted that improving AI models could enhance child safety. Baszucki replied, “Good, so you’re aligning with what we did. High-five.”
He added, “I came here because I love your podcast and came to talk about everything. If our PR people said, ‘Let’s talk about age-gating for an hour,’ I’m up for it, but I thought I came here to talk about everything.”
The exchange highlights the growing tension around safety features on online gaming platforms, where Roblox’s AI-driven moderation, age verification, and expanded child-safety measures continue to draw attention in tech and gaming news.
FAQs
What is Roblox’s new age verification feature?
Users must submit a face scan before they can access messaging features. The scan is meant to verify user age and improve child safety.
Why was the Roblox CEO frustrated during the interview?
Baszucki grew tired of repeated questions focused solely on child safety rather than the broader topics he expected to discuss.
How does Roblox plan to improve AI for child safety?
The platform is enhancing AI moderation to detect harmful content and risky behavior.
Does the new feature affect all Roblox users?
The age verification feature applies only to users who want to access messaging and related interactive functions.
Why is child safety a trending topic in gaming?
With millions of young users, platforms like Roblox are under pressure to prevent abuse, scams, and inappropriate content.