Scientists are failing to disclose their use of AI despite journal mandates, finds study
An analysis of more than 5.2 million papers in 5000 different journals has revealed a dramatic rise in the use of artificial intelligence (AI) tools in academic writing across all scientific disciplines, especially physics.
However, the analysis has revealed a large gap between the number of researchers who use AI and those who admit to doing so, even though most scientific journals have policies requiring disclosure of AI use.
Conducted by data scientist Yi Bu from Peking University and colleagues, the analysis examines papers listed in the OpenAlex dataset and published between 2021 and 2025.
To assess the impact of editorial guidelines introduced in response to the growing use of generative AI tools such as ChatGPT, they examined journal AI-writing policies, reviewed author disclosures, and used AI to determine whether papers were written with the help of technology.
The AI detection analysis reveals that the use of AI writing tools has increased dramatically across all scientific disciplines since 2023. It also finds that 70% of journals have adopted AI policies, which primarily require authors to disclose the use of AI-writing tools.
IOP Publishing, which publishes Physics World, for example, has a journals policy that supports authors who use AI in a “responsible and appropriate” manner, while encouraging them to be “transparent about their use of any generative AI tools in either the research or the drafting of the manuscript”.
A new framework
In the new study, however, a full-text analysis of 75 000 papers published since 2023 reveals that only 76 articles (about 0.1% of the total) explicitly disclosed the use of AI writing tools.
In addition, the study finds no significant difference in the use of AI between journals that have disclosure policies and those that do not, which suggests that disclosure requirements are being ignored – what the authors call a “transparency gap”.
The study also finds that researchers from non-English-speaking countries are more likely to rely on AI writing tools than native English speakers. Increases in the use of AI writing tools are particularly rapid in journals with high levels of open-access publishing.
The authors now call for a re-evaluation of ethical frameworks to foster the responsible integration of AI in science. They state that prohibition and disclosure requirements alone are insufficient to regulate AI use, since their results show that researchers are not complying with existing policies.
The authors argue that, instead of “opposition and resistance,” “proactive engagement and institutional innovation” are needed to ensure that AI technology truly enhances the value of science.
Michael Allen is a science writer based in the UK
FROM PHYSICSWORLD.COM 07-03-2026
