The Higher Education Policy Institute (HEPI) has surveyed over 1,000 UK university students, and 53% admitted to using popular artificial intelligence tools like ChatGPT, or its countless imitators, to create content, generate ideas, or both.
The Guardian phrases the next part perfectly, so I'm going to quote them directly: “Only 5% admitted to copying and pasting raw AI-generated text into their assessments.”
Okay, maths isn't anyone's strong point when it comes to drawing scary conclusions, but that's at least 50 students, and almost certainly fewer than 100. True, this is just one study's sample, but it's a big one.
This is also not the first time studies like this have been carried out and prompted a rethink of how to ensure academic integrity in the age of AI. But if AI can trample all over university degrees and courses, doesn't that mean they're no longer fit for purpose? Shouldn't educators adapt?
Adapting to AI in education
Well, they might be. I'm reading a Wired piece (paywalled) from just over a year ago at the time of writing, and its understanding of AI's role in plagiarism is a bit depressing – a lot of hand-wringing along the lines of 'hmm, if a computer generated the content, is it plagiarism?', and little recognition that 'AI', as we know it in this context, is just a computer that has been force-fed a human-produced corpus (often infringing copyright in the process), not a literal sentient being.
But the educators cited in the Guardian article seem quite enthusiastic; take the following:
“I have implemented a policy of having mature conversations with students about generative AI. They share with me how they use it,” says Dr Andres Guadamuz, Reader in Intellectual Property Law at the University of Sussex.
UK educators are also benefiting from AI's existence. The Guardian writes that 58 secondary schools have been enrolled in an Education Endowment Foundation (EEF) research project in which teachers will generate lesson plans using AI.
The report says nothing about how teachers are taking this, but I suspect it'll go down well, given that members and representatives of the UK's two main higher education unions, the University and College Union (UCU) and Unite, have been locked in battle with universities over pay and working conditions since I was a student, and it looks set to kick off again. Anything to lighten the load.
All of this sounds much more compassionate than branding AI as the Antichrist and threatening students with a stain on their academic record, without any attempt to, say, educate them about what AI actually is or does.
That, at least, seems to be the general tone of that old Wired article, despite an anecdote from a real student about how poor ChatGPT is at producing engaging, let alone informed, academic material – so they wouldn't use it anyway.
Personal anecdote break
You could cast me out of the magic circle here, but officially, at Future PLC, TechRadar Pro's parent company, I am a graduate junior writer. The fact that I went to university, in a time before artificial intelligence, is basically the reason I get to register industrial-strength opinions that make no discernible difference to the way things are.
I'm also a pretty staunch opponent of generative “artificial intelligence.” On the whole, it's a way to launder copyright infringement, dilute the work of individuals, and make things up as it goes along, producing a kind of tasty Swiss-cheese prose. Bad actors (including, ahem, the HEPI study) call that last one “hallucination,” but I think I'll call it “lying.”
When it comes to generating written content, Future PLC investigates the use of AI and disciplines staff when plagiarism is discovered. And yet I now find myself in the strange position of… not caring about AI use? At least in education.
I don't care if students use AI to get a degree
Inflammatory title, I know, but it's not because I was paid dark money in the last thirty seconds to start talking about how AI is the future or whatever; it's because the net effect of AI has been to expose that the education system, and the way the world of work perceives it, is broken.
We published a story this week about how most young people are now struggling to get work experience. I've faced this personally. Even landing this 'graduate' position was, I think, down more to my relevant work experience, for which I thoroughly debased myself, than to the piece of paper my university handed me in exchange for tens of thousands of pounds and relentless work.
Reading it infuriated me, and reminded me of the usual maxims, as decreed by civilization.
All of which is to say: the university degree has become so useless, and yet remains such a prerequisite of modern working life, that not only do I not care about the more egregious uses of AI in higher education, it actually makes me a little sad that the number of students engaging in that kind of use isn't higher.
Students' use of AI in assessments indicts university courses for being boring and overpriced for what they are, far more than it indicts students as hardened academic criminals.
Some students don't test well, learn differently, or are simply there because, of course, they need a degree to get a job. That was a square-peg-in-a-round-hole scenario even when higher education was more accessible, but now institutions are putting students in the same situation while piling more financial pressure on them.
With this in mind, I would suggest:
a) just give students the paper, for God's sake, so they can get on with their lives
b) start phasing out “you need a degree to work” as a culturally ingrained principle, whatever else you do, if you want people in work
c) revise the assessment process so that it accommodates multiple learning styles and dares to be genuinely interesting, which would also thwart “the rise of artificial intelligence,” or whatever
My experience of how plainly unconcerned both employers and educators are with the content and structure of degrees leads me to believe that if I had been able to use AI at university, my life would not have changed in any meaningful way, apart from greatly reducing the amount of cerebrospinal fluid squeezed out of me to get here.
AI, like everything that becomes the zeitgeist at the behest of nebulous, financially motivated actors, is a nightmare and a cesspool. But the education system is also a nightmarish cesspool, and AI has helped expose it.
In this particular scenario, educational AI doesn't need regulating; it's simply doing what it's built to do: regurgitate and fabricate. If that's enough to pass what university students do anyway (been there, done that, still doing it), and therefore short-circuits higher education as we know it, then AI, for once, isn't the problem, and the kids might actually be alright.
There are viable solutions
To be constructive and offer more realistic solutions than “undo decades of legislated commercialization of higher education with yet more legislation,” I have some ideas.
Start by de-routinizing how assessments are delivered, in favor of a broader variety of projects, and by focusing on course content and delivery methods so that students actually want to engage with the assessed material. I admit, though, that this would still require ministers, secretaries, and university staff to diligently shoot themselves in the foot by admitting they're wrong.
That sounds combative, but I should be fair. One leading figure in higher education making a strong case in this regard is Professor Dilshad Sheikh, Deputy Vice-Chancellor and Dean of the Business School at Arden University.
She says Arden, a blended and online higher education institution, is shifting its approach from punishment to education when it comes to the use of AI.
“Arden University maintains that rather than punishing students for using such technology in all circumstances, or trying to train teachers to spot the signs of AI-generated content, we should teach students how to use it to improve their work and their processes. The university is therefore exploring how best to integrate AI into learning, teaching and assessment strategies, recognizing that a positive, pioneering approach to AI is most beneficial for students.
“Many other universities are focusing on plagiarism and how AI chatbots give students the opportunity to cheat on assignments. However, the reality is that technology cannot replicate the understanding and application of knowledge in authentic assessments, which is how we design our courses. The truth of the matter is that times are changing, and how and what we teach should change too.
“AI will continue to get smarter and make our lives easier. We are seeing more and more companies adopting this kind of technology to fuel their growth, so why should we punish our students for using the same software that is used in the real world?”
AI and the real world
This last point is quite interesting, and one I hadn't really considered until now. AI is being introduced into workplaces as a productivity tool, but its risks there are surely the same as in education, as Future PLC has seen.
It's true that I've made no secret of the fact that I don't use AI, and that I take a rather dim view of the whole business. But using AI responsibly (for suggestions and ideas rather than finished content), and evangelizing that kind of use in a learning environment, is perhaps making the best of a bad situation.
And, evidently, small but vital steps are being taken across the UK higher education system to educate students, to critically address AI's inadequacy at producing excellent, insightful academic work, and to drive change in how courses are taught, and thereby re-engage students.
It's a good sign that the student-university transaction, while still effectively mandatory for many workplaces in this country right now, could be about to become more valuable to students, the people who should benefit most from it.
And then… who knows? We might stop having to read in the national newspapers about academics despairing because not only can their assessments be passed by a computer that is literally making it up as it goes, but their students are disengaged enough to prefer that to the whole business of making the effort themselves. With higher education in the state it's in, I still don't blame them.