The use of artificial intelligence in the publishing industry
Artificial intelligence (AI) has had, and continues to have, a significant impact on the publishing industry.
According to Nic Newman’s 2023 report for the Reuters Institute, media companies are quietly integrating AI into their products as a means of delivering a more personalised experience: for 28% of respondents this is already a regular part of their business, and a further 39% are experimenting in this area.
However, implementing AI is not without controversy, with ongoing debate about artistic integrity, the protection of intellectual property rights, and the potential replacement of human employees.
AI technology can be used across the industry for various tasks, including:
- Sales / Marketing: to provide insight into target audiences, trends, and demand; to generate naturally worded content; and to create snippets tailored to the target audience on a specific platform such as Instagram or Twitter, enabling the delivery of a personalised experience (an illustrative sketch follows this list).
- Content acquisition: to assist in market research, recommend authors or creators, and repurpose existing content for different platforms.
- Content creation: to support proofreading, plagiarism checking, editing, formatting, and grammar checking.
- Translations: to improve accuracy and speed in translations, reducing human error.
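To make the sales and marketing use case above more concrete, the sketch below shows how a publisher might ask a general-purpose language model to draft a platform-tailored promotional snippet. It is a minimal, hypothetical illustration only, assuming the openai Python client and an OPENAI_API_KEY environment variable; the model name, prompt wording, and book details are invented for this example and are not drawn from any publisher mentioned in this article.

```python
# Minimal sketch (assumption): drafting a platform-tailored marketing snippet
# with a general-purpose language model via the openai Python client.
# The model name, prompt, and book details below are illustrative only.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def draft_snippet(title: str, audience: str, platform: str) -> str:
    """Ask the model for a short promotional snippet tailored to one platform."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "You write concise, naturally worded book marketing copy."},
            {"role": "user",
             "content": (f"Write a two-sentence {platform} post promoting the novel "
                         f"'{title}' to {audience}. Keep it under 280 characters.")},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Hypothetical title and audience, used purely for demonstration.
    print(draft_snippet("The Example Novel", "readers of literary fiction", "Twitter"))
```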
Examples of use come from several major publishers. Bloomberg News uses a system called Cyborg to assist reporters with articles on company earnings reports each quarter, while Forbes uses a content management system (CMS) called Bertie, which can produce rough drafts and suggest appropriate headlines and images. The Washington Post’s Heliograf and ModBot even secured first place in the Global BIGGIES Awards, which recognise media companies’ best practices in data and artificial intelligence products, projects, and strategies. Heliograf uses automated storytelling to deliver hyperlocal coverage of topics such as high school football games and can publish, on average, around 850 additional stories per year.
The challenges associated with the use of AI technology in publishing:
In terms of artistic integrity, there are concerns about the market becoming inundated with AI-generated art, about the removal of human emotion from art, and about the potential appropriation of human artists’ work. For example, Bloomsbury faced criticism for using AI-generated art on the cover of a novel by Sarah J. Maas, with accusations that it had done so to avoid paying a human illustrator properly.
The use of AI technology also raises intellectual property (IP) concerns, as the generated results may resemble, and be based upon, existing content in which third parties have rights, leading to accusations of plagiarism or copyright infringement. For instance, when machine learning is applied to text and data mining (TDM), a process that derives information from machine-read material, it can result in unauthorised use of copyrighted material.
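For illustration, TDM at its most basic can be as simple as programmatically deriving term frequencies from a folder of machine-readable documents. The sketch below is a minimal, hypothetical example using only the Python standard library; it is not the method used by any organisation mentioned in this article, and real TDM pipelines operate at far greater scale and sophistication.

```python
# Minimal sketch (assumption): a toy text and data mining (TDM) step that
# derives term frequencies from machine-read documents using only the
# Python standard library. Real TDM pipelines are far larger in scale.
import re
from collections import Counter
from pathlib import Path

def mine_term_frequencies(folder: str, top_n: int = 10) -> list[tuple[str, int]]:
    """Read every .txt file in `folder` and return the most common terms."""
    counts: Counter = Counter()
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        counts.update(re.findall(r"[a-z]{4,}", text))  # crude tokenisation
    return counts.most_common(top_n)

if __name__ == "__main__":
    # "corpus" is a hypothetical folder of machine-readable documents.
    for term, count in mine_term_frequencies("corpus"):
        print(f"{term}: {count}")
```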
The Bradford Literature Festival also faced backlash for using an AI-generated image in its promotional materials, sparking debate about the infringement of creators’ work. Several authors expressed their concerns, including Lizzie Huxley-Jones, who tweeted, “Respectfully, I think this is a huge disservice to creators. AI harms illustrators and writers too. I think as bastions of the creative industry, you should be investing in artists’ unique work, not further investing in and legitimising a practice that seeks to retire us.” Emma Reynolds, an author and illustrator, also criticised the use of AI, stating, “There’s currently no ethical way to engage in AI as it scrapes millions of images without people’s consent, which is hours of unpaid labour. It’s infringement and it sends a bad message.” The festival subsequently issued an apology, withdrew the AI-generated images from use, and acknowledged the need for clearer guidance.
Another potential problem arises from the extensive datasets on which AI algorithms are trained: stereotypical images and words embedded in that data mean the outputs can amplify existing biases. The algorithms’ interpretations of gender and race can reinforce existing stereotypes.
While AI tools such as ChatGPT can generate content quickly, they often produce inaccurate material, which could threaten the trustworthiness of journalism. The accuracy of an AI-driven model’s output depends on the quality of the training data it was given, and chatbots may not understand the context of user requests. The Guardian reported receiving an email that referred to an article supposedly written by one of its journalists in the past. Despite attempts to locate the article in the archives, staff were surprised to discover that the reporter had never written it: ChatGPT had invented the article, and it appeared convincing even to the journalist in question.
Overcoming these challenges:
Despite these challenges, leveraging AI technology can still offer significant advantages, allowing publishers and journalists to save time and resources in creating high-quality content. A report into the impact of AI on the publishing industry by Gould Finch and Frankfurter Buchmesse predicts that AI will not replace writers but will strengthen publishers’ core business through the implementation of new processes in marketing, analytics, production, and administration. The report also found that investing in AI would not result in fewer jobs for humans, citing evidence from organisations ranging from the Washington Post to smaller publishing houses, which have seen positive effects not only on readership statistics and sales but also on job stability for journalists and writers.
It has been said that the content AI creates is the content that humans hate writing. Kevin Roose, who used to recap the corporate earnings reports now handled by Cyborg, described that work as “a miserable early-morning task that consisted of pulling numbers off a press release, copying them into a pre-written outline, affixing a headline, and publishing as quickly as possible so that traders would know whether to buy or sell. The stories were inevitably excruciatingly dull, and frankly, a robot probably could have done better with them”. Likewise, although the Washington Post would most likely not prioritise local high school football coverage or assign a journalist specifically to that task, that does not diminish the fact that there is an audience that values such reporting, which is where Heliograf benefits the local community.
By automating certain editorial tasks, particularly those perceived as tedious, AI enables publishers to allocate more time and resources to more important work. Lisa Gibbs, director of news partnerships for the Associated Press (AP), emphasised this, stating, “The work of journalism is inherently creative, driven by curiosity, storytelling, investigative reporting, holding governments accountable, critical thinking, and sound judgment. It is in these areas that we want our journalists to invest their energy.”
Legal regulation
In 2023, several legal disputes arose concerning generative AI companies, including Stability AI (the developer of Stable Diffusion), Midjourney, and OpenAI. These companies have been accused of copyright infringement for using third-party material without permission to train their AI models.
Currently, the Copyright, Designs and Patents Act 1988 provides only limited exceptions for TDM of copyright works: it is permitted for non-commercial purposes and where access has been agreed through licences, subscriptions, or terms and conditions.
However, in March 2023 the UK government published its AI White Paper, setting out its intentions for the regulation of AI systems. The paper’s main message was the government’s stated aim of removing barriers that hinder AI development and innovation. In relation to intellectual property, the government aligned itself with Sir Patrick Vallance’s Pro-Innovation Regulation of Technologies Review (PIRT), which emphasised the importance of providing clear guidance and regulatory clarity for AI companies.
According to Sir Patrick Vallance, “If the government’s aim is to promote an innovative AI industry in the UK, it should enable mining of available data, text, and images (the input) and utilise existing protections of copyright and IP law on the output of AI. There is an urgent need to prioritise practical solutions to the barriers faced by AI firms in accessing copyright and database materials,” and “to increase confidence and accessibility of protection to copyright holders of their content as permitted by law, we recommend that the government requires the IPO [Intellectual Property Office] to provide clearer guidance to AI firms as to their legal responsibilities, to coordinate intelligence on systematic copyright infringement by AI, and to encourage the development of AI tools to help enforce IP rights.”
The government responded to the PIRT recommendations by proposing that the Intellectual Property Office (IPO) develop a code of practice by summer 2023. This code would guide AI companies on accessing copyright-protected works as input for their technology. It would also ensure that appropriate protections, such as labelling, are in place for the generated output to support the rights of copyright holders.
In response to the White Paper, the Society of Authors (SoA) and members of the Creators Rights Alliance are working on their own paper. Their current stance is that AI serves as a valuable tool for human progress, and humans must be rewarded for their contributions. They advocate for the need for regulation, clarity, and labelling of AI-generated works. Additionally, they urge publishers and other creative businesses to state clearly their approach to using machine-generated or machine-assisted works and to protect creators’ livelihoods in an industry that is nothing without human experience.
While investing in AI technology can enhance efficiency, it is not without challenges. The legal position surrounding AI and intellectual property is complex. To discuss how AI-driven applications may impact upon your business, and to determine how you should approach meeting your legal obligations, get in touch with a member of our intellectual property and technology team.