A growing number of academic publishers, conferences, and other stakeholders in scholarly communications are issuing guidelines on the use of generative AI tools in authorship, research, and manuscript submission. Funding agencies are weighing in on the use of generative AI in grant applications and peer review. And CANGARU, an initiative of international scholars, is working on a set of universal guidelines to inform best practices on AI use, as an alternative or complement to publishers' and other existing policies.
Below is a regularly updated list of these resources (last updated: August 19, 2024).
'ChatGPT, Generative Artificial Intelligence and Natural Large Language Models for Accountable Reporting and Use Guidelines' (CANGARU) is an initiative by a group of international researchers and other stakeholders in scholarly communications, aimed at developing a universal set of guidelines on the responsible use of generative AI and related technologies in academic research and publishing.
Resources released by the initiative:
Reporting about the initiative:
Vasquez, K. (2024). Researchers plan to release guidelines for the use of AI in publishing. Chemical & Engineering News, 102(2). https://cen.acs.org/policy/publishing/Researchers-plan-release-guidelines-use-of-AI-in-publishing/102/i2
American Chemical Society
Best Practices for Using AI When Writing Scientific Manuscripts (February 27, 2023) -- Please note: Not the official ACS guidelines
Open for Discussion: Chemistry or ChemistrAI? (October 1, 2023)
AIP Publishing
On the Use of AI Language Models in Scholarly Communications at AIP Publishing. (February 10, 2023).
Cambridge University Press
AI Contributions to Research Content. (March 2023)
Includes the following note: "...individual journals may have more specific requirements or guidelines for upholding this policy."
Elsevier
Publishing Ethics for Editors – The Use of AI and AI-assisted Technologies in Scientific Writing. (March 2023).
FAQs – The Use of AI and AI-assisted Writing Technologies in Scientific Writing. (March 2023).
Nature
Tools such as ChatGPT threaten transparent science; here are our ground rules for their use. (January 24, 2023).
SAGE
Using AI in peer review and publishing (2023)
Science
Thorp, H. H. (January 26, 2023). ChatGPT is fun, but not an author.
Sciences Po
ChatGPT: Sciences Po Implements Rules and Opens up Discussion About AI in Higher Education. (February 9, 2023).
Springer
Editorial Policy on Artificial Intelligence (AI)
Taylor & Francis
Taylor & Francis Clarifies the Responsible use of AI Tools in Academic Content Creation. (February 17, 2023).
International Conference on Machine Learning
COPE: Committee on Publication Ethics
Authorship and AI tools. (February 13, 2023).
Editors / Editorial Boards / Editorial Associations -- Committees
Group of editors of several bioethics and humanities journals
Statement on the Responsible Use of Generative AI Technologies in Scholarly Journal Publishing (October 1, 2023)
International Committee of Medical Journal Editors (ICMJE)
Recommendations for the Conduct, Reporting, Editing, and Publication of Scholarly Work in Medical Journals (updated May 23, 2023 to provide guidance on how work conducted with the assistance of AI technology should be acknowledged)
World Association of Medical Editors (WAME)
Recommendations on Chatbots and Generative Artificial Intelligence in Relation to Scholarly Publications (May 31, 2023)
Single Journals
Proceedings of the National Academy of Sciences of the United States of America (PNAS)
The PNAS Journals Outline Their Policies for ChatGPT and Generative AI (February 21, 2023)
arXiv
arXiv policy for authors’ use of generative AI language tools (January 31, 2023; last edited February 7, 2023).
Funding Agencies
Australian Research Council
Policy on Use of Generative Artificial Intelligence in the ARC’s grants programs (July 7, 2023)
→ Advice on AI usage for grant applicants, chapter 3.1, page 2.
Cancer Research UK (CRUK)
Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)
Research Funders Policy Group (UK)
Funders joint statement: use of generative AI tools in funding applications and assessment (September 19, 2023)
Signatories to the statement include:
Science funding agencies say no to using AI for peer review. Science. https://www.science.org/content/article/science-funding-agencies-say-no-using-ai-peer-review
National Institutes of Health - NIH (U.S.)
The Use of Generative Artificial Intelligence Technologies is Prohibited for the NIH Peer Review Process (June 23, 2023)
Australian Research Council
Policy on Use of Generative Artificial Intelligence in the ARC’s grants programs (July 7, 2023)
→ Advice on AI usage in peer review, chapter 3.2, page 3.
SAGE
Schmidt, C. (2024). Authorship, Ahoy! Mapping the uncharted waters of AI policies in scholarly communication. https://doi.org/10.5281/ZENODO.11237114