Generative artificial intelligence: Use at University

Publishing and the use of artificial intelligence

The emergence of artificial intelligence writing tools, such as ChatGPT, has prompted discussion about whether such tools can be listed as authors on scientific publications (Stokel-Walker, 2023). According to the authorship guidelines of the National Health and Medical Research Council (2019), an author must make 'a significant intellectual or scholarly contribution to research and its output', 'agree to be listed as an author' (p. 1), and be accountable for the accuracy and integrity of the research.

Artificial intelligence tools are not yet truly 'intelligent' or sentient in the way we commonly understand these terms, notwithstanding reports of a Google engineer who believed the large language model LaMDA had passed the Turing Test and was therefore sentient (Oremus, 2022). Because these tools cannot be held accountable for research integrity, many publishers now state in their policies or guidance that AI tools should not be listed as authors. The Committee on Publication Ethics (COPE) has likewise released a position statement that AI tools cannot be authors, as they cannot take responsibility for the work and are unable to declare conflicts of interest or manage copyright and licensing (2023).

However, as with any other software or online tool, their use should be acknowledged. Some style guides, including Chicago, APA 7, and MLA, are beginning to advise on how this acknowledgment should be formatted.

Before submitting for publication, check the publisher's site for its policy and/or guidance on AI.