As Meta pushes the boundaries of creativity with its latest app, ‘Edits’, regulators across Europe and the UK are simultaneously tightening their scrutiny of how social media platforms protect young users. The tension between innovation and responsibility is more visible than ever, and it signals a defining moment for the future of digital platforms.
‘Edits’, Meta’s new standalone app, is a clear play to strengthen its position in the creator economy. The app allows users to craft and edit short videos with AI-powered effects, animation tools, green screens, and advanced sound overlays, designed specifically to streamline content creation for Instagram Reels. It’s a move that not only feeds the growing appetite for visual storytelling but also underscores Meta’s ongoing battle with TikTok.
To better understand how ‘Edits’ performs compared to existing tools, we tested both Edits and CapCut by trying to create the same video in each app. While their designs are similar, Edits felt much easier and faster to use because it’s less crowded with features. Edits also allowed us to easily add real songs, had no watermark, and focused purely on quick, clean video creation. CapCut, on the other hand, offered more powerful AI tools and effects, but many features were locked behind a subscription and the app felt more complicated overall. In short: Edits is better suited for fast, simple videos, while CapCut is better for more advanced editing projects.
Yet the timing of ‘Edits’ could not be more delicate.
Just this week, Ofcom, the UK’s media regulator, unveiled its first set of regulations under the Online Safety Act, demanding that social media platforms “use robust age-checking measures to prevent children from accessing pornography, self-harm, or eating disorder content” (The Guardian, April 24, 2025). Failure to comply could lead to eye-watering fines of up to £18 million or 10% of global revenue, whichever is greater. These new obligations are part of a broader wave of policy shifts aimed at placing more responsibility on tech giants for the safety of their youngest users.
Adding fuel to the debate, Meta has simultaneously announced upgrades to its AI-powered age verification systems on Instagram. In a clear nod to mounting regulatory expectations, Meta stated it would start using artificial intelligence to better detect when users misrepresent their age, automatically applying stricter “teen account” settings when underage usage is suspected. As reported by The Verge, the platform is introducing “a system that can detect when a user may have lied about their birthday and apply protections accordingly” (The Verge, April 24, 2025).
This strategic pivot highlights an important truth: social media innovation can no longer be detached from regulatory compliance. While tools like ‘Edits’ allow brands, influencers, and everyday users to express themselves more creatively than ever before, companies like Meta are under mounting pressure to ensure that younger audiences are not just entertained, but also protected.