SOURCE: THE CONVERSATION

by Senior Lecturer in Visual Communication & Digital Media, RMIT University, and Research Fellow, Technology, Communication and Policy Lab, RMIT University

OpenAI, the maker of ChatGPT, and News Corp, the international media conglomerate, have signed a deal that will let OpenAI use and learn from News Corp’s content.

In practical terms, this means when a user asks ChatGPT a question, the results might be informed by previous reporting in News Corp outlets, including Australian mastheads such as The Australian and The Daily Telegraph. It’s unclear whether the agreement covers only editorial content or opinion pieces as well.

OpenAI has licensed News Corp content because generative artificial intelligence (AI) is a ravenous beast: it needs data to learn from and generate useful outputs in return. Its ability to do this depends on the size and quality of its training data.

But could the media be signing its own death warrant by sharing its journalism? Or do we all benefit from the wider availability of reliable information?

ChatGPT, OpenAI’s major service, has learned from consuming books, articles and publicly available web content. This includes online news articles from across the internet.

However, there are unresolved questions over who owns the content. The New York Times, for example, is suing OpenAI over alleged copyright infringement. By inking deals with media companies, the makers of generative AI services like ChatGPT can steer clear of such legal questions by paying to learn from their content.

The quality and provenance of the training data also matter and can lead to biases in what generative AI produces. So it is notable that while some news media organizations are trying to stop their content from being used, others, including the Associated Press, are signing deals.

ChatGPT is a complex technical system. The fact that some outlets opt in to licensing deals and others don’t doesn’t mean the technology will suddenly sound more like The Australian than The New York Times.

However, at a broader level, where ChatGPT gets its news content from may affect how it responds to questions about current events.

What sort of news content gets included from each publication may also affect how ChatGPT answers queries. Opinion articles are often more sensationalist than straight news, for example, and do not always accurately reflect current issues.

It also remains to be seen how deals like these will affect the human labor of journalists and editors.

On one hand, since generative AI needs more and better content to provide better answers, journalists and content creators will be needed to ensure there is a continuing supply of training data for AI to learn from.

On the other hand, it’s not clear how many journalists organizations like News Corp think are necessary to do that job, as further cuts at the company are expected next week.

At the same time, generative AI’s tendency to “hallucinate,” or make things up, is well known. The role of editors in fact-checking content, and of critical thinking among those consuming it, is paramount.

In all this, small and medium-sized players in the media landscape seem once more to be pushed to the side, as the big players battle for lucrative content deals while smaller organizations fight for scraps or are left hungry.

These deals also raise questions about the role of the ABC and SBS in a changing media environment. Australians pay for public service media through their taxes, but OpenAI is not rushing to do deals with these organizations.

However, companies like OpenAI are gradually accepting the principle that producing quality news costs money and that they need to secure licenses to use content. If they want to be consistent, there is a strong case that such companies should not just include public service media content in their models, but also recompense these organizations in the process, much as Google and Meta struck deals with the ABC under the News Media Bargaining Code.

Where you get your news matters. More people may use AI services for news in the future, but right now they are an underwhelming source of reliable information. Signing content-sharing agreements with companies like News Corp may help improve the quality of answers and increase the relevance of ChatGPT outputs for Australian users.

News Corp also doesn’t have journalists in every community, so supporting independent media in your local area can help you get quality information and stop news deserts from spreading.

At the end of the day, generative AI doesn’t always get it right (and often gets it wrong), so treat its outputs with a healthy level of caution and compare results with those from reputable sources before using AI-generated content to make decisions.