In recent years, thousands of university students in the UK have been caught misusing ChatGPT and similar AI tools, while traditional forms of plagiarism show a marked decline, a Guardian investigation reveals.
The investigation found almost 7,000 proven cases of cheating involving AI tools in 2023-24, equivalent to 5.1 cases per 1,000 students, up from just 1.6 cases per 1,000 in the previous academic year, 2022-23.
Experts expect the figures to rise again this year, to around 7.5 cases per 1,000 students, though recorded cases likely represent only a fraction of the true total.
This data underscores the rapidly changing landscape for universities as they strive to update evaluation methods in response to emerging technologies like ChatGPT and other AI-driven writing tools.
In the 2019-20 academic year, before generative AI became widely available, plagiarism accounted for nearly two-thirds of all academic misconduct. Plagiarism rates surged during the pandemic as many assessments moved online. With the arrival of AI tools, however, the character of academic cheating has changed.
For the current academic year, confirmed instances of traditional plagiarism are predicted to fall from 15.2 per 1,000 students (already down from a peak of 19 per 1,000) to approximately 8.5 per 1,000 students.
The Guardian submitted requests under the Freedom of Information Act to 155 universities for their confirmed cases of academic misconduct, plagiarism, and AI-related cheating over the past five years. Of these, 131 responded, though not all held complete records broken down by year or by category of misconduct.
More than 27% of responding institutions did not yet record AI misuse as a distinct category of misconduct in 2023-24, suggesting parts of the sector have been slow to recognize the issue.
Many instances of AI-related cheating likely go undetected. A survey by the Higher Education Policy Institute found that 88% of students admitted to using AI for assessments. Last year, researchers at the University of Reading tested their own assessment system and found that AI-generated submissions went undetected 94% of the time.
Dr Peter Scarfe, an associate professor of psychology at the University of Reading and co-author of the research, noted that while cheating has always existed, AI poses a fundamentally different challenge that the education sector must adapt to.
He remarked, “I believe the reality we see reflects merely the tip of the iceberg. AI detection operates differently from traditional plagiarism checks, making it almost impossible to prove misuse. If an AI detector indicates AI usage, it’s challenging to counter that claim.”
“We cannot merely transition all student assessments to in-person formats. Simultaneously, the sector must recognize that students are employing AI even if it goes unreported or unnoticed.”
Students keen to avoid AI detection have numerous online resources at their disposal. The Guardian found various TikTok videos that promote AI paraphrasing and essay writing tools tailored for students, which can circumvent typical university AI detection systems by effectively “humanizing” text produced by ChatGPT.
Dr. Thomas Lancaster, a researcher of academic integrity at Imperial College London, stated, “It’s exceedingly challenging to substantiate claims of AI misuse among students who are adept at manipulating the generated content.”
Harvey*, who has just completed a business management degree at a university in the north of England, told the Guardian that he used AI to brainstorm ideas, structure assignments, and suggest references, noting that many of his peers did the same.
“When I started university, ChatGPT was already available, making its presence constant in my experience,” he explained. “I don’t believe many students use AI simply to replicate text. Most see it as a tool for generating ideas and inspiration. Any content I derive from it, I thoroughly rework in my style.”
“I know people who, after using AI, enhance and adapt the output through various methods to make it sound human-authored.”
Amelia*, who has just completed her first year in a music business program at a university in the southwest, also acknowledged using AI for summarization and brainstorming, highlighting the tool’s significant benefits for students with learning difficulties. “A friend of mine uses AI for structuring essays rather than relying solely on it to write or study, integrating her own viewpoints and conducting some research. She has dyslexia.”
Science and Technology Secretary Peter Kyle recently emphasized to the Guardian the importance of leveraging AI to “level the playing field” for children with dyslexia.
Technology companies appear to see students as a key demographic for their AI products. Google is now offering university students in the US and Canada a free 15-month upgrade to its Gemini tools.
Lancaster stated, “Assessment methods at the university level may feel meaningless to students, even if educators have valid reasons for their structure. Understanding the reasons behind specific tasks and engaging students in the assessment design process is crucial.”
“There are frequent discussions about the merits of increasing the number of examinations instead of written assessments, yet the value of retaining knowledge through memorization diminishes yearly. Emphasis should be on fostering communication skills and interpersonal abilities—elements that are not easily replicable by AI and crucial for success in the workplace.”
A government spokesperson said that more than £187 million has been invested in a national skills programme, and that guidance has been issued on the use of AI in schools.
They affirmed: “Generative AI has immense potential to revolutionize education, presenting exciting prospects for growth during transitional periods. However, integrating AI into education, learning, and assessment necessitates careful consideration, and universities must determine how to harness its advantages while mitigating risks to prepare for future employment.”
*Name has been changed.
Source: www.theguardian.com