More than 1,000 British students were surveyed by the Higher Education Policy Institute (HEPI), and 53% admitted to using popular AI tools like ChatGPT or its countless imitators to create content, generate ideas, or both.
The Guardian puts the next bit perfectly, so I'm going to quote them on it: "Only 5% admitted to copying and pasting raw AI-generated text into their assessments."
Right, so maths isn't the forte of anyone jumping to conclusions, but that's at least 50 students and almost certainly fewer than 100. It's true that this is the sample from just one study, but it's also a large one.
This is also not the first time studies of this kind have been carried out and prompted a rethink of how academic integrity can be safeguarded in the age of AI. But if AI can steamroll through university degrees and courses, doesn't that mean they're no longer fit for purpose? Shouldn't educators adapt?
Adapting to AI in education
Well, maybe they are. I read one Wired article (paywalled) from just over a year ago at the time of writing, and its understanding of AI's role in plagiarism is somewhat depressing: a lot of hand-wringing along the lines of "hmm, if a computer generated the content then it's plagiarism," and no real grasp that "AI," as we know it in this context, is just a computer that has been force-fed a human-produced (and often itself copyright-infringing) corpus, not a literal conscious being.
But the educators quoted in the Guardian article seem quite enthusiastic. Take the following:
“I have implemented a policy to have mature conversations with students about generative AI. They share with me how they use it,” said Dr Andres Guadamuz, Reader in Intellectual Property Law, University of Sussex.
British teachers stand to benefit from AI too. The Guardian writes that 58 secondary schools are involved in a research project by the Education Endowment Foundation (EEF), in which teachers will generate lesson plans using AI.
The report says nothing about how teachers feel about this, but I suspect they'll welcome it: members and representatives of the two main higher education unions in Britain, the University and College Union (UCU) and Unite, have been embroiled in battles with universities over pay and working conditions since I was a student, and it looks like it's that time again. Anything to lighten the load.
This all sounds a lot more compassionate than decrying AI as the Antichrist and threatening students with a stain on their academic record without any attempt to educate them about what AI is or does.
At least, that seems to be the overarching tone of that old Wired piece, despite its anecdote from an actual living, breathing student about how bad ChatGPT is at producing engaging, let alone informed, academic material, so they wouldn't use it anyway.
Personal anecdote break
I could be drummed out of the magic circle for this, but officially, at Future PLC, the parent company of Ny Breaking, I am a Graduate Junior Writer. In fact, my having gone to university, in a time before generative AI, is the reason I get to register strong opinions about the industry that make no discernible difference to the way things are.
I'm also a pretty staunch opponent of generative "artificial intelligence." On the whole, it's a way to whitewash copyright infringement, dilute the work of individuals, and make things up as it goes to produce a kind of tasty Swiss-cheese prose. Bad actors (including, um, the HEPI study) call this last one "hallucination," but I think I'm going to call it "lying."
When it comes to generating written content, Future PLC scrutinises AI use and treats detected plagiarism as a disciplinary matter. Yet now I find myself in the strange position of not caring about AI use? At least in the field of education.
I don’t care if students use AI to get a degree
An enticing headline, but no, I haven't received a dark-money payment in the last thirty seconds, and I'm not about to gush over how AI is the future or whatever. It's because AI's net effect has been to expose how broken the education system is, and with it the way the working world perceives degrees.
We ran a story this week about how a majority of young people are struggling to gain work experience. I've encountered this personally. Even landing this "graduate" role was, I believe, due more to my relevant work experience, which I thoroughly humbled myself to get, than to the actual piece of paper my university gave me in exchange for tens of thousands of pounds and incessant toil.
Reading it infuriated me, and reminded me how arbitrary the maxims we treat as ordained by civilisation really are.
All this to say: the university degree has become so worthless, yet such a prerequisite for modern working life, that not only do I not care about the most egregious uses of AI in higher education, I'm actually somewhat saddened that the number of students engaging in that kind of practice isn't higher.
Students' use of AI in assessments is an indictment of university courses, boring and too expensive for what they are, more than it is evidence of students being hardened academic criminals.
Some students don't test well, learn differently, or are only there because, of course, you need a degree to get a job. That was a "round peg in a square hole" scenario even when higher education was more accessible, but now institutions are putting students in the same situation while imposing ever greater financial burdens.
Given this, I would therefore suggest one of the following:
a) For God's sake, just give students the piece of paper so they can get on with their lives.
b) Start phasing out "you need a degree to work" as a culturally embedded principle, if you want people working at all.
c) Revise the assessment process so that it caters to multiple learning styles and dares to actually be interesting, which would also guard against "the rise of artificial intelligence" or whatever.
My experience of how plainly uninterested both employers and lecturers are in the content and structure of my degree leads me to believe that, had I been able to use AI at university, my life would not have changed in any meaningful way, apart from a vast reduction in the toll of studying. An enormous amount of spinal fluid was wrung out of me to get here.
AI, like everything that has entered the zeitgeist at the behest of a vague, financially motivated actor, is a nightmare and a cesspool. However, the education system is also a nightmarish cesspool, and AI has helped reveal that.
In this one specific scenario, educational AI doesn't need any regulation; it just does what it's built to do: regurgitate and bluff. If that's enough to pass, something students were already doing anyway (I've been there; it was, and it is), thus short-circuiting higher education as we know it, then for once AI isn't the problem, and the kids might actually be all right.
Workable solutions exist
To be constructive and offer solutions more realistic than "reversing years of commercialisation of higher education through legislation," I have some ideas.
Start by cutting the rot out of how assessments are delivered, in favour of a greater variety of projects, and focus on course content and delivery methods so that students actually want to engage with the assessed material. I admit, though, that this still requires ministers, secretaries of state, and university staff to concede they were wrong, something they tend to treat like shooting themselves in the foot.
This sounds combative, but I have to be honest. A senior figure in higher education who makes a solid argument in this regard is Professor Dilshad Sheikh, Deputy Pro-Vice Chancellor and Dean of the Faculty of Business at Arden University.
She says Arden, a blended and online higher education institution, is moving from punishment to education when it comes to AI use.
"Arden University argues that rather than punishing students for using such technology in all circumstances, or trying to train teachers to spot the signs of AI-generated content, we should teach students how to use it to help improve their work and processes. The University is therefore exploring how AI can best be integrated into learning, teaching and assessment strategies, recognising that a positive, forward-looking approach to AI is more beneficial for students."
"Many other universities are focusing on plagiarism and how AI chatbots give students the opportunity to cheat on assignments. However, the reality is that the technology cannot replicate the understanding and application of knowledge required by authentic assessments, which is how we design our courses. The truth is that times are changing, so how and what we teach must change too."
“AI will continue to get smarter to make our lives easier. We see more and more companies embracing such technology to fuel their growth, so why should we penalize our students for using the same software used in the real world?”
AI and the real world
This last point is quite interesting, and one I hadn't really considered until now. AI is being touted in the workplace as a productivity tool, but the pitfalls are much the same as in education, as Future PLC has seen.
Admittedly, I've made no secret of the fact that I don't use AI and that I take a dim view of the whole thing. But by using AI responsibly, for prompts and ideas rather than for content, and promoting that kind of use in a learning environment, we may be making the best of a bad situation.
And it is clear that small but critically important steps are being taken across the UK higher education system: to educate students about, and critically engage with, AI's unsuitability for producing excellent, insightful academic work, and to push for change in the way degrees are taught so as to re-engage students.
It's a good sign that the student-university transaction, while still one mandated by many workplaces in this country, may be about to become more valuable to students, the people who should benefit most from it.
And then, who knows? Perhaps we'll no longer have to read in the national newspapers that teachers are angry not only that their assessments can be passed by a computer that literally makes things up as it goes, but that students are so disengaged they'd rather let it. With higher education in the state it's in, I still can't blame them.