There has been no 'summer lull' in the debate over how AI will alter, improve, or destroy humanity; in particular, the challenge that AI poses to original creative content looms large.

In the US, the Screen Actors Guild and the Writers Guild of America are striking over the absence of assurances from studios as to how AI content generation will be used and controlled in the entertainment industry.

In the UK, the issue of whether an AI artwork generator could potentially infringe copyright in existing original creative content is currently being litigated in the High Court.

What is the threat posed by AI to copyright?

To generate new content, AI software first needs to 'learn', and to do this it requires access to vast quantities of original works (data, images, audio, text and video). For example, to generate a 'Parisian romantic scene', AI software must first learn by searching, sorting and analysing millions of existing artworks.
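By way of illustration only, the short Python sketch below shows a simplified data-ingestion step of the kind typically used to assemble a training dataset. The URLs and file names are hypothetical, but the key point is that the original works are copied and stored before the software processes them; it is this copying that gives rise to the licensing question discussed below.

```python
# Illustrative sketch only: a simplified ingestion step of the kind used to
# build an image-training dataset. The URLs and file paths are hypothetical.

import pathlib
import urllib.request

# Hypothetical list of source images gathered from the web
image_urls = [
    "https://example.com/artwork-001.jpg",
    "https://example.com/artwork-002.jpg",
]

dataset_dir = pathlib.Path("training_data")
dataset_dir.mkdir(exist_ok=True)

for i, url in enumerate(image_urls):
    # Each request downloads a full copy of the original work...
    destination = dataset_dir / f"image_{i:03d}.jpg"
    urllib.request.urlretrieve(url, destination)  # the copy is stored to disk
    # ...and that stored copy is then loaded and processed repeatedly
    # during model training.
```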

In the UK, original artistic, literary, dramatic, or musical works, sound recordings, films and broadcasts are protected by copyright under the Copyright, Designs and Patents Act 1988. Copyright arises automatically on creation of a work, and its author (subject to certain exceptions) will be the first owner of copyright in that work.

Copyright gives its owner the right to control the use and exploitation of their work, which includes the right to prevent the unauthorised copying of the work. The threat posed by AI to copyright is that the learning process undertaken by the AI systems typically involves the copying, storing and processing of original works, and without a licence (permission) from the copyright owner to do this, there is a real risk of AI systems infringing copyright in those original works.

Supporters of AI development, however, argue that the way in which AI systems learn does not infringe copyright, as it is no different from when a human artist learns by studying existing artworks and uses that knowledge to create a new 'original' piece of their own. It is also argued that the need to advance technological development in AI trumps the need to protect intellectual property rights in existing content.

What do the courts say about AI and copyright?

Earlier this year, Getty Images issued a claim in the High Court against Stability AI, alleging copyright infringement through the unauthorised use of millions of Getty's images to train Stability's 'Stable Diffusion' AI image-generation software.

The Getty Images case is still in its infancy and its outcome may not be known for some time; however, it is unlikely to be the only litigation on this issue. For example, it has been reported that a major UK newspaper is preparing for a legal battle with Google over the alleged unauthorised use of thousands of online news stories to train Bard, Google's ChatGPT rival.

Whilst the current High Court litigation follows several similar cases brought in the US, this type of litigation (for now at least) may have to remain the preserve of large media giants such as Getty for several reasons:

  • The cost of bringing legal action to court for individual creators and artists can be prohibitive, and the damages (financial compensation) figures are often relatively low.
  • There are often inherent difficulties in starting and running 'class' or 'group' legal actions.
  • Such cases may be very difficult, or impossible, to prove. Sometimes only imperceptibly small elements of original works are copied from a vast data set, and where only small elements are used there may be no infringement at all, as the law requires proof that a 'substantial' part of an original protected work has been copied.

What does the Government say about AI and copyright?

The UK Government has a stated intention of being a worldwide leader in the adoption and development of AI technology; however, it has not, to date, proposed any resolution to the threat that AI poses to copyright.

In fact, it has already faced resistance in its attempts to grant new rights to AI. For example, in 2022, a Government proposal to expand the current exceptions to copyright law, so as to allow AI to use text and data for any purpose, was rejected by the House of Lords Communications and Digital Committee. The Committee was concerned about the implications for the UK's creative industries, which contribute considerably to the UK economy: "Developing AI is important, but it should not be pursued at all costs."

Following that rejection, a report was produced earlier this year, as part of a 'Pro-Innovation Regulation of Technologies Review', making recommendations on generative AI and its relationship to intellectual property rights (such as copyright). The Government is now implementing those recommendations by producing a code of practice on copyright and AI, intended to provide guidance on how AI firms can legitimately access copyright-protected works when training their AI software:

"The code of practice aims to make licences for data mining more available. It will help to overcome barriers that AI firms and users currently face and ensure there are protections for rights holders. This ensures that the UK copyright framework promotes and rewards investment in creativity."

What next?

The outcome of the Getty Images case may not be known for many months, but it is hoped that the Government's code of practice will provide some much-needed guidance and clarity.

Until then, AI developers would be well advised to obtain a valid licence before making commercial use of original copyright-protected content, or risk legal uncertainty and potential infringement litigation.

Originally published 11 August 2023.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.