Roblox's $35.78 million settlement with three states highlights how inadequate moderation on gaming platforms leaves openings for predators to exploit children through direct messages and other private communications. Platform-level reforms matter, but the settlement underscores the need for real-time protection that monitors children's DMs across every platform, rather than relying on reactive, per-platform content policies. Guardii's AI addresses this gap by detecting predatory patterns in direct messages before they reach children, providing the proactive, 24/7 monitoring that platform moderation alone cannot deliver.