Topic: News - European Union Law
Source: Lexology
Lexology provides a detailed update on the latest obligations imposed by the EU AI Act, particularly focusing on “high-risk” AI systems. The article, authored by specialists from Hogan Lovells, outlines the legal responsibilities for developers and providers of these systems, including the need for conformity assessments, technical documentation, and post-market monitoring. The EU AI Act—set to be enforced progressively from 2025—defines high-risk applications based on their potential to affect fundamental rights, such as those used in critical infrastructure, employment, law enforcement, or education.
Companies must implement risk management systems, ensure data quality, and uphold transparency standards. The regulation also requires a clear assignment of roles among the different actors in the AI supply chain, introducing both civil and administrative liabilities. The piece emphasizes that the Act has extraterritorial reach, applying to non-EU companies that place AI systems on the EU market, which makes early compliance planning crucial.