Tech Policy Press - The AI Act and the Existential Risk Facing Journalism


CJL Director Dr. Courtney Radsch discusses the EU’s recently published AI Act and its effort to establish legal and ethical standards for AI.

The European Union’s Artificial Intelligence Act (AI Act), passed this month, is a pioneering effort to fill a regulatory void with ethical, safety, and rights-based standards for the adoption of AI. But welcome as it is, it is insufficient to address either the immediate or the existential risks these technologies pose — and the harms they are already causing.

The AI Act sets legally enforceable limits on the types of permissible AI systems, imposes risk assessment and transparency requirements, and establishes penalties for noncompliance. As such, it marks an important step forward from the inadequate voluntary ethical and safety standards we’ve seen so far in various jurisdictions, such as the US.

But the Act fails to address the harms already caused by the rapid proliferation of these technologies — from the theft of intellectual property (IP) and the rise of synthetic media-making tools to the impacts of unaccountable algorithmic decision-making and facial recognition systems — meaning these harms may be irreversible by the time the law comes into effect.

The Act has a tiered timeline for implementation and enforcement, meaning it won’t be fully applicable until two years after it enters into force. Corporations will have three years to meet all of the obligations for high-risk systems.

This extended timeline could prove fatal to the Act's goals, especially when it comes to protecting democracy. The European Union and at least 64 countries, together representing nearly half of the world’s population, are holding elections this year amid a proliferation of AI-propelled mis- and disinformation, mass closures of news outlets, and layoffs of journalists around the world. The consequences will already be felt by the time the Act takes full effect.

Read the full article here.