95% of Students Now Use AI — But Are Universities Keeping Up?

Dr. Max Lempriere


A new report from HEPI (the Higher Education Policy Institute) has found that AI use among UK undergraduates is now almost universal. 95% of students report using AI in at least one way, and 94% say they use it to help with assessed work.

But here’s the finding that stopped me: 15% of students are using AI for friendship, companionship, or to tackle loneliness.

That statistic tells you everything about the state of higher education right now.

The headline numbers

The Student Generative AI Survey 2026 (HEPI Report 199) surveyed 1,054 full-time UK undergraduates in December 2025. It’s the third annual edition of the survey, and the shift from 2024 to 2026 is staggering:

  • AI use has gone from 66% (2024) to 95% (2026) — near universal in just two years
  • 12% of students now include AI-generated text directly in assessed work — up from 3% in 2024
  • 65% say assessment has changed significantly in response to AI
  • 49% say AI has improved their student experience — but 16% say it’s made it worse
  • 68% believe AI skills are essential for thriving in today’s world
  • Only 36% feel encouraged by their institution to use AI
  • Only 48% feel their teaching staff are helping them develop AI skills

The loneliness finding matters most

The survey found that around 40% of students say AI affects their loneliness — with positive and negative impacts reported in roughly equal measure. 20% say AI makes them feel lonelier. 21% say it makes them feel less lonely.

And 15% are actively using AI for companionship, advice, or to address loneliness.

For anyone working with doctoral researchers, this should ring alarm bells. If undergraduates — who are typically on campus, in seminar groups, surrounded by peers — are turning to AI for company, imagine the picture for PhD students. Doctoral researchers who work alone, often from home, sometimes for years at a stretch. The isolation problem in doctoral study was already well-documented. AI is now part of the coping mechanism.

This isn’t a technology story. It’s a belonging story.

The polarisation problem

One of the most striking findings is how polarised the student body has become. The report highlights several near-equal splits:

  • 37% agree their institution encourages AI use vs 36% who disagree
  • 20% say AI makes them lonelier vs 21% who say less lonely
  • 34% lean towards traditional sources vs 37% who favour AI sources

And two student quotes from the report capture this perfectly:

“AI tools allowed me to quickly summarise dense readings and generate drafts or outlines for assignments, saving hours of tedious work and letting me focus on critical analysis and deeper understanding.”

And:

“I’m not using my brain at all.”

Same technology. Same generation. Completely different outcomes. The difference isn’t the tool — it’s the support around it.

What this means for PhD students

The HEPI survey focuses on undergraduates, but the implications for doctoral researchers are significant:

The isolation angle is amplified. If 15% of undergraduates are using AI for companionship, the figure among PhD students — who face far greater isolation — could be higher. This isn’t necessarily a bad thing. AI can be a useful thinking partner. But it shouldn’t be a substitute for human connection, peer support, and the feeling that someone understands what you’re going through.

The skills gap is real. Only 48% of undergraduates feel their staff are helping them develop AI skills. For PhD students, who often receive even less structured teaching support, the gap is likely wider. Learning to use AI critically — as a research tool, not a crutch — is a skill that needs to be taught, not assumed.

The anxiety about misconduct is relevant. The report notes that students are increasingly anxious about being falsely accused of cheating. For PhD students, where academic integrity is existential, this anxiety is likely even more acute. Clear guidance from supervisors and institutions matters.

The institutional gap

Perhaps the most damning finding: while 95% of students use AI, only 36% feel their institution encourages it. Only 38% say their institution provides AI tools. And Arts and Humanities students — the group most represented in our own community — feel particularly under-supported.

The report’s recommendations are sensible: structured AI induction, curriculum-embedded AI literacy, clear assessment guidance, accessible tools, and staff training. But recommendations are easy to write and hard to implement, especially in institutions that are still debating whether AI is a threat or an opportunity.

Where we stand

We’re not going to pretend we have all the answers on AI. Nobody does. But here’s the thing: the students who thrive with AI are the ones who have support around them. Someone to think with. Someone to challenge their ideas. A peer community that makes them feel less alone in the process.

The reality is that this kind of support, the kind your supervisor and department should be providing, rarely materialises on its own. Institutions are slow. Guidance is patchy. And you’re left trying to figure out not just how to use a rapidly changing set of tools, but whether you’re using them in ways that are rigorous, ethical, and actually serving your research.

And the writing still has to be yours. If you want to write a PhD literature review well, you still need to think through the sources yourself, not outsource the judgement to a tool.

That’s a lot to work out in isolation. And you shouldn’t have to.

The PhD Common Room exists precisely for this. It’s a peer community of doctoral researchers working through exactly the same questions you are — about AI, about writing, about whether any of this is going as it should. Not being alone in that changes things. Members pay between £30 and £75 a month, depending on what they can afford, and what they get in return is the thing the institution was supposed to give them: people who get it, structure that helps, and the quiet reassurance that the difficulty is real and the struggle is shared.

The technology changes. The need for belonging doesn’t. So — are you working through any of this on your own right now?


The full HEPI report is available here. The Student Generative AI Survey 2026 was authored by Rose Stephenson and Charlotte Armstrong and sponsored by Kortext.

