
AI guidelines for creators

The emergence of AI tools presents both opportunities and risks to authors’ and illustrators’ professional practice, so we’ve prepared some guidelines to help you try to safeguard your work and protect your interests. We will review this guidance as new challenges and opportunities emerge.


Generative artificial intelligence is rapidly evolving and is already having an impact on the book industry: AI-generated spam books under high-profile authors’ names are being sold online, some publishers are using AI to generate jacket images, and publishing service providers and distributors are introducing terms in their contracts which allow for use of work in training AI.

Please note that the advice we are providing below is general guidance only. If you are seeking specific advice relating to your circumstances, please contact us via our Member Advice Service, or for legal advice, via Authors Legal.

Overview

Generative AI tools such as ChatGPT and Midjourney use machine learning to generate written work or images based upon a user’s inputs, at the click of a button. These tools rely on training datasets to generate text and images – datasets that include books, journals, essays, images and articles, all ingested from the internet without permission from, or compensation to, creators.

General guidelines

We have made it clear to Government that we consider the large-scale scraping and exploitation of works without permission to be copyright infringement, and have requested that transparency over training datasets be made mandatory, similar to the approach taken in the EU. However, regulatory intervention will take time – in the interim, what can you do to protect your work?

The key piece of advice we can offer is to be vigilant. If you are signing a contract or agreeing to terms of service, be aware of what you are agreeing to. If you’ve already signed with a publisher, find out about their intentions when it comes to using AI for the production of your book to ensure you’re on the same page. If you plan to use generative AI, be aware that there is an ethical landscape to navigate and you should be satisfied that you understand the risks involved. Generative AI tools are built on the back of unremunerated appropriation of copyright works and, given the international outcry from creatives around the world, you should be aware that your use of these tools may harm your author or illustrator brand. 

Contracts

Given the uncertain provenance of generative AI and the legal and ethical concerns raised, authors and publishers ought to agree on whether and how generative AI tools might be used in the production of their books and whether authors’ copyright works may ever be used for AI training. This issue should be addressed in future contracts but also clarified in existing contracts. In rapidly evolving, uncertain times, and in light of two publishers (Wiley and Taylor & Francis) now entering into partnership agreements with AI companies, it is critical to seek transparency.

You might wish to ask the following questions of your publisher:

  • What are their intentions in respect of the use of your work to train AI models?
  • Does your publisher consider your current publishing agreement already grants them the rights to sublicense your work for AI training? Where in the contract is this captured?
  • Do your existing contractual revenue share entitlements offer you fair payment for these new uses? If not, why not?
  • Importantly, can you opt out of any use of your work for AI training? If not, why not?
  • In what way is your publisher already using AI? Is the AI proprietary?
  • Which publishing activities, if any, are you comfortable with your publisher performing using generative AI? This may well vary depending on the nature of your work and whether it is scholarly or trade. For example, you might feel comfortable with their use of generative AI for generating drafts of marketing copy and social media posts for your book, but not for the audiobook narration, cover design, or any translations of your work.


One practical way to address the current uncertainty is to ask for any such use to be subject to your prior consent. We provide the following clauses as suggested drafting to assist these negotiations and to protect your position.

Model clauses

  1.  No Generative AI training or licensing without consent 
    This Agreement does not confer upon the Publisher any rights or interests other than those specified. Despite anything in this Agreement to the contrary, the Publisher is prohibited, without the Author’s prior written consent, from reproducing, publishing, communicating, or otherwise using the Work or any part of the Work to develop, train, or direct Generative Artificial Intelligence technology or models (“Generative AI”), including but not limited to the mining or scraping of text, images or data from the Work, whether undertaken by the Publisher or by third parties authorised, licensed or directed by the Publisher.


  2. No substantial generative AI use without consent 

    Neither party will, without the other party’s prior written consent, use, or authorise others such as sub-licensees to use, Generative AI to wholly or substantially fulfil that party’s material obligations pursuant to this Agreement or any sub-licence authorised by this Agreement.

    In particular, the Publisher must seek the Author’s prior written consent to the use of Generative AI in relation to their Work for [list and amend as appropriate for your deal]:

    a) audiobook narrations
    b) translations
    c) illustrations or artwork
    d) cover design
    e) animation
    f) substantial visual elements
    g) revisions and new editions
    h) substantive editing 

While it may not be possible to have this clause added in its entirety to your agreement, what’s vital is that you have a conversation with your publisher about their intentions so you can make an informed decision about signing any publishing agreement with them.

Please note, we’ll be updating the model clauses above as the artificial intelligence landscape evolves. For advice on specific drafting, or if you are asked to sign a contract with an AI clause that you don’t understand, please seek guidance from Authors Legal.

You must also understand that only humans can be authors of a copyright work. If your work is entirely generated by AI, it is not currently protected under Australian copyright law. This may have implications for submissions to publishers, grants, competitions, and awards. In submission processes to publishers, it is incumbent upon creators to be transparent about their use of AI to maintain a sense of trust both within the creative community and with their readers. More and more publishers are including in their author warranties an express contractual promise from the author that the manuscript has not been generated by AI.

Self-publishing authors

For self-publishing authors, the same general principle applies: understand the terms you are agreeing to in your contracts with publishing service providers, aggregators, and distributors of your work.

Additionally, you may want to include a ‘no AI training’ notice on the copyright or imprint page of your books. The US Authors Guild has provided an example of such a notice:

No AI Training: Without in any way limiting the author’s [and publisher’s] exclusive rights under copyright, any use of this publication to “train” generative artificial intelligence (AI) technologies to generate text is expressly prohibited. The author reserves all rights to license uses of this work for generative AI training and development of machine learning language models.

While this is not a foolproof measure, it is a clear statement of your rights. 

Web content

If you publish content on your website or blog that you’d like to protect, you may be able to prevent unauthorised scraping of your content by restricting web crawlers. To do so, you will need to add or update your site’s robots.txt file – find out how to do this via Google’s documentation.
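As an illustration, a robots.txt file placed at the root of your site might look something like the sketch below. The crawler names shown are examples of user agents that their operators have publicly documented in connection with AI or dataset collection (OpenAI’s GPTBot, Common Crawl’s CCBot, and Google’s Google-Extended training opt-out); check each provider’s current documentation for up-to-date names before relying on them.

```
# Ask OpenAI's training crawler not to access any page
User-agent: GPTBot
Disallow: /

# Ask Common Crawl's crawler (its datasets are widely used for AI training) not to access any page
User-agent: CCBot
Disallow: /

# Opt out of Google's AI training uses (separate from Google Search indexing)
User-agent: Google-Extended
Disallow: /

# Leave ordinary crawlers, such as search engines, unaffected
User-agent: *
Allow: /
```

Bear in mind that, like the ‘no AI training’ notice above, robots.txt is advisory rather than enforceable: well-behaved crawlers honour it, but it cannot technically block a scraper that chooses to ignore it.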