The European Commission has said that the Code of Practice intended to help tech companies comply with the EU's AI Act could apply by the end of 2025. Tech companies, including America's Meta and Google and Europe's ASML and Mistral, have called for a delay in implementing the EU AI Act, citing the absence of the Code of Practice. According to Reuters, sections of the AI Act are set to take effect in less than a month.
EU AI firms want enforcement of the EU AI Act delayed for two years, arguing that implementation of the law will increase their compliance costs. Companies that build AI models have singled out the law's tough requirements as a particular concern, claiming that the Act could stifle innovation in the European Union, where tech firms' compliance teams are much smaller than those of their US counterparts.
Tech companies are also unsure how to comply with the law in the absence of the Code of Practice, a document intended to help AI developers adhere to the Act.
“To address the uncertainty this situation is creating, we urge the Commission to propose a two-year ‘clock-stop’ on the AI Act before key obligations enter into force,” a group of 45 EU companies said in an open letter shared on July 2.
Publication of the Code of Practice for the EU's AI regulations had originally been scheduled for May 2. With that timeline missed, the European Commission now plans to publish the guiding document in the coming days.
The European Commission has hit back at tech firms calling for delayed implementation of EU’s AI Act.
“Our commitment to the goals of the AI Act, such as establishing harmonised risk-based rules across the EU and ensuring the safety of AI systems in the European market, remains unchanged,” a Commission spokesperson said.
A European lobby group has called the push for delay a big tech strategy to weaken the rules.
“Delay. Pause. Deregulate. That is big tech’s lobby playbook to fatally weaken rules that should protect us from biased and unfair AI systems,” Corporate Europe Observatory Campaigner Bram Vranken said.
The European Commission said it expects tech companies to sign the code in the coming month and that the code is expected to take effect by the end of the year.
“On the AI Act’s GPAI rules, the European AI Board is discussing the timing to implement the Code of Practice, with the end of 2025 being considered,” the Commission spokesperson said.
Although the Commission says that signing the code will be voluntary, tech companies that choose not to sign will not enjoy the legal certainty that signatories will. Some big tech firms have already indicated that they will not sign the code.
The EU AI Act was passed following intense discussions between countries in the region. Its provisions were set to take effect over several years, with implementation of some, such as those relating to general-purpose AI (GPAI), set to commence on August 2 this year.
GPAI includes foundation models such as those developed by OpenAI, Mistral, and Google. Under the GPAI obligations taking effect in 2025, these models would be subject to transparency requirements, including complying with EU copyright law, producing technical documentation, and publishing summaries of the content used to train the models.
The Act also requires AI firms to test their models for robustness, toxicity, and bias before releasing them to the public. AI models categorized as high-impact GPAI or as posing systemic risk will be required to conduct model evaluations, risk assessments, and mitigation measures.
They will also be expected to carry out adversarial testing and to report serious incidents to the European Commission. Some EU political leaders have described the bloc's tech rules as confusing and urged the EU to pause their implementation.