EU Delays AI Act, Puts Developers on Hold Until 2027
The European Parliament voted on March 26, 2026, to delay key parts of the EU AI Act and backed proposals to ban nudify apps, as reported by Robert Hart for The Verge.
The measures, approved by a large majority, push back compliance deadlines for developers of high-risk AI systems until December 2027. Companies that develop AI systems covered by sector-specific safety rules, such as toys or medical devices, will have longer to comply. The proposed deadline is August 2028. Rules requiring providers to watermark AI-generated content will also be delayed until November 2026. All of these measures had originally been set to take effect this August.
The vote extends uncertainty for businesses operating in Europe, which have already faced delays after the EU missed its own deadlines to publish key guidance and changed elements of the law. It is also unclear whether the proposed changes can be implemented before the original August deadline: Parliament cannot unilaterally change European law and must now negotiate the final text with the European Council, which is made up of ministers from all 27 member states.
Why the Delay Matters
The EU AI Act was originally hailed as the world’s first comprehensive AI regulation, setting a global precedent. The delays now mean that high-risk AI applications, including predictive policing tools and AI-driven medical diagnostics, will continue to operate without stringent oversight, leaving regulators without enforcement powers and consumers without the protections that were slated to begin this year.
Startups and scale-ups are among the most affected. Unlike Big Tech, smaller firms often rely on clear timelines to secure funding and plan product rollouts. The extended deadlines could have a chilling effect on innovation as investors reassess the European market’s regulatory risk profile.
The Nudify App Ban: A New Frontier in AI Regulation
Members also backed proposals to include a ban on nudify apps in the revised AI Act, though details on what this might look like are scarce. The ban would not apply to AI systems with effective safety measures that prevent users from creating such images. The decision follows widespread outrage in the EU over the flood of Grok’s sexualized deepfakes on X earlier this year.
The ban targets apps that use AI to digitally undress images of people without their consent, a practice that has surged in recent years. These apps often operate under the guise of photo-editing tools but can be used to create non-consensual pornography. The EU’s move is part of a broader push to regulate AI systems that can harm individuals, particularly women and minors.
However, the lack of clarity on how the ban will be implemented raises concerns among developers and civil liberties groups. The proposal does not specify what constitutes “effective safety measures,” leaving room for interpretation. This ambiguity could lead to inconsistent enforcement across member states, undermining the effectiveness of the ban.
Global Context: A Patchwork of Responses
The EU is not alone in grappling with the rise of nudify apps. In the U.S., several states have introduced legislation to criminalize deepfake pornography. There is no federal law. The UK’s Online Safety Act includes provisions against