AI in Education: Students Are 'Outsourcing Their Thinking' and Test Scores Show It

95% of UK students now use AI for schoolwork. Research shows those who stop using it perform worse than peers who never started. The skill gap is widening.


The numbers seem encouraging: 95% of UK students now use AI in some capacity. 86% of K-12 students in the US report using AI tools. ChatGPT use for schoolwork among teenagers has doubled in two years. AI has achieved near-universal adoption in education faster than any previous technology.

But the research coming in suggests we may be watching students trade short-term convenience for long-term capability.

The Cognitive Offloading Problem

The main concern isn’t cheating. It’s something more subtle: students are quietly handing over their thinking to technology instead of doing the mental work themselves.

Researchers call this “cognitive offloading” - using external tools to reduce the load on your own brain. In moderation, it’s fine. Nobody memorizes phone numbers anymore. The problem emerges when the offloaded tasks are exactly the ones students need to practice.

A study by Guilherme Lichand found something alarming: students who stopped using AI after becoming dependent on it performed worse than those who never used AI at all. The AI didn’t just substitute for their skills - it appears to have prevented those skills from developing in the first place.

One student interviewed by NPR described the realization simply: “I felt like I was outsourcing my thinking.” She stopped using ChatGPT entirely.

What the Surveys Show

The UK Higher Education Policy Institute’s 2026 survey paints a polarized picture:

  • 94% of students use generative AI to help with assessed work
  • 12% directly include AI-generated text in assignments (up from 3% in 2024)
  • 49% say AI has improved their student experience
  • But 65% say assessment has changed significantly because of AI

Students recognize that AI skills matter: 68% view AI competency as essential for their futures. Yet only 48% feel their teachers are helping them develop those skills responsibly.

Faculty see it differently. A survey of over 1,000 faculty members found:

  • 90% believe AI will diminish students’ critical thinking skills
  • 95% expect increased student overreliance on AI
  • 78% report AI-driven cheating is rising
  • 71% say graduates lack understanding of AI ethics

The disconnect is stark. Students see a tool that saves time. Faculty see the development process being bypassed entirely.

The Detection Arms Race

Approximately 71% of K-12 teachers report struggling to determine whether student work is original. The tools meant to help - Turnitin’s AI detector, GPTZero, Copyleaks - are imperfect and generate false positives that can wrongly accuse students.

Most institutions have settled into a messy middle ground: software flags potential AI use, instructors investigate further, and policies vary wildly between departments and even individual professors.

Some professors have abandoned AI detection entirely, instead redesigning assignments to be AI-resistant. Others have embraced AI as a tool, requiring students to document their prompts and explain their process. Still others maintain blanket bans with varying degrees of enforcement.

One community college professor compared AI assistance to “bringing a forklift to the gym” when students need to develop writing skills through actual practice. Another professor at a different institution created an “African Diaspora and AI” course examining AI’s ethical implications while allowing responsible use for brainstorming and feedback.

There is no consensus.

NYC’s AI School Controversy

The tensions are playing out in real time in New York City, where a proposed AI-focused high school has sparked significant pushback.

Manhattan High Schools Superintendent Gary Beidleman proposed “Next Generation Technology High School” to open in District 2, utilizing Google’s AI-powered Skills Platform with curriculum development support from Google and OpenAI. Carnegie Mellon University would provide summer internship opportunities.

Parents objected on multiple fronts:

  • The proposal emerged with minimal community notice
  • It would replace an existing small school serving 91 students
  • NYC hasn’t released comprehensive AI guidelines for schools yet
  • Five Community Education Councils have passed resolutions calling for a two-year moratorium on AI use in schools

“There’s no playbook for how that will look,” one panel chair noted about the proposed AI curriculum.

The controversy captures the broader uncertainty: schools are being asked to prepare students for an AI-infused workforce while the educational framework for doing so responsibly doesn’t yet exist.

What’s Actually Working

Amid the concerns, some applications show genuine promise.

57% of special education teachers report using AI to develop individualized education programs (IEPs), adapting instruction to specific learning needs in ways that would take significantly more time manually.

Students who use AI strategically rather than as a replacement describe a different experience. One undergraduate told NPR she uses AI as “a study buddy for concept explanations and practice problems” but refuses to let it write her assignments. Another uses AI only for proofreading and rubric-checking, viewing her original writing as “like a fingerprint.”

The Philippines recently became one of the first countries to issue comprehensive guidelines for AI use in public schools, permitting tools like ChatGPT and Khanmigo while establishing ethical and pedagogical standards.

What This Means

The skill gap is already appearing. Students who’ve become dependent on AI tools and then lose access perform worse than if they’d never used AI at all. The brain develops through struggle, not shortcuts.

But this isn’t an argument for banning AI from education. That ship has sailed. The question is whether institutions can develop frameworks for responsible use faster than students can find ways around whatever rules exist.

Currently, the answer appears to be no. Only 35% of school districts have provided AI training to students. 68% of faculty say their institutions haven’t prepared them to teach with AI. The gap between adoption and governance is widening.

What Students Should Consider

If you’re in school and using AI:

Be honest about what you’re outsourcing. If AI is doing your thinking for you, those are skills you won’t develop. The convenience now may cost you capability later.

Distinguish assistance from replacement. Using AI to explain concepts, check your work, or brainstorm ideas is different from having it produce the work itself. Know which mode you’re in.

Consider the Lichand finding seriously. Students who stopped using AI after becoming dependent on it performed worse than those who never used it. If you can't do the work without AI, a problem may already be building.

Document your process. Whether required or not, knowing exactly how you used AI and what you contributed helps you stay honest with yourself about your own development.

The technology isn’t going away. But neither are the cognitive skills that AI can either supplement or supplant. The choice about which path you’re on is made in small decisions, every day, one assignment at a time.