Open Markets Institute


The Center for Journalism and Liberty at Open Markets Welcomes the Organization for Security and Co-operation in Europe (OSCE) Focus on Media Freedom and Monopoly

WASHINGTON – The Center for Journalism and Liberty at Open Markets welcomes the launch of a new report on Big Tech and media freedom from the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media (RFoM), which connects press freedom with the monopolization of information systems and seeks structural solutions. CJL urges all affected stakeholders to provide input to the OSCE’s open consultation on media and Big Tech by the November 30th deadline.

Authored by CJL director Dr. Courtney C. Radsch, the Outcome Report of the Big Tech and media freedom expert workshop at the International Press Institute (IPI)’s 2024 World Congress includes actionable policy recommendations for the OSCE’s 57 member states.

These recommendations center on the most effective ways to address Big Tech’s market dominance and control over the global information environment, and on policy steps that States large and small can take to rebalance market power and thereby improve the availability and accessibility of reliable, diverse, and public interest information online.

The project is particularly urgent as Big Tech rolls out disruptive new generative AI products that are trained on, use, and rely on journalism content without permission or compensation, further eroding the business models that sustain journalism and undermining media freedom at a rapid rate.

The report’s launch came as the RFoM and the Forum on Information and Democracy issued a call for contributions on media and Big Tech to further develop policy guidance for member states as part of the renewed Spotlight on AI and Freedom of Expression (SAIFE) project.

“[C]orporate gatekeepers control much of the modern public sphere by amplifying, banning, and manipulating the flow of information and ideas through their algorithmic intermediation and by frequently changing terms of service, and they can unilaterally shape the visibility and viability of journalism around the world,” the Outcome Report reads. “Ensuring healthy information spaces is imperative for democracy, peace and security, and leveraging technology to enhance the availability and accessibility of quality media and public interest information online through improved governance is crucial, especially as AI is rapidly adopted and integrated into the economy and information systems.”

Below are additional findings from the report on the scope of the challenge, along with some of the policy solutions that workshop participants recommended the new consultation take up:

  • Expanding the frameworks and tools used by States and civil society is essential to uphold media freedom and human rights, given that these rights are increasingly shaped by the way markets, data, and transnational technology are governed. This is particularly true for States with greater power to regulate Big Tech, such as the U.S., the UK, and large European states, since the effects of their regulation are felt beyond national borders.

  • States should crack down on exploitative platform business models that risk further destabilizing the free press, using fair competition tools such as unbundling and non-discrimination requirements, alongside frameworks that rebalance power dynamics between media and tech platforms, such as fair compensation frameworks and media bargaining codes.

  • Investigative media, which participants said should be an important partner for Big Tech given the wealth of high-quality information it provides, faces particular challenges given the expense of this type of journalism and the difficulty of recouping an outlet’s investment. Establishing a framework for compensation for this kind of content (as well as others) is therefore particularly urgent.

  • Time and time again, publishers and journalists describe how a handful of tech intermediaries enjoy unaccountable power to shape the visibility and viability of media, with impacts that are not in line with human rights standards and obligations regarding the free flow of information.

Dr. Radsch, who authored the new report, has written extensively on the need to ensure that human content creators are fairly compensated for the work used to train AI, including in her latest piece, “AI Needs Us More Than We Need It,” in the November/December 2024 issue of The Washington Monthly. She argued that the survival of artificial intelligence hinges on high-quality, human-generated content and data, which means that journalists, artists, content creators, and analysts have more leverage to secure fair compensation for their work than they might realize.

Earlier this year, Dr. Radsch published an expert brief for Open Markets, “A Framework for Establishing Journalism’s Value in Artificial Intelligence Systems,” examining how frameworks for assigning value to the news content used by large language models can be used to fairly compensate publishers and creators. Such frameworks and related principles have been widely accepted by policymakers, journalists, publishers, and other experts as a way to secure fair compensation from the tech platforms that derive tremendous value from news content.

Dr. Radsch has also published an analysis for Brookings, where she is a nonresident fellow, and another in Nieman Reports. She has briefed policymakers, publishers, and other interested parties in the U.S., the UK, Canada, Brazil, South Africa, and Europe.

###