An international team of researchers found that even brief reliance on artificial intelligence can undermine persistence and reduce people’s ability to solve problems on their own once the assistance is removed, according to The Independent. In other words, the shiny machine help that gets sold as convenience can leave people weaker when the tool is gone, with the burden falling on ordinary users who are expected to keep adapting to systems designed elsewhere.
Who Pays for the Convenience
In experiments involving mathematical reasoning and reading comprehension, participants who used AI for 10 minutes performed worse and gave up more often once the tool was taken away, compared with peers who received no help, the team from the University of Oxford, MIT, UCLA, and Carnegie Mellon reported. That is the basic trade-off at work here: a tool that promises efficiency, then quietly shifts the cost onto the person who has to think without it.
The researchers described these outcomes as reduced persistence and impaired unassisted performance, warning that short-term gains from AI can carry a heavy cognitive cost if those losses accumulate over time. Their concern is not framed as a dramatic collapse but as a slow grind, the kind of erosion that happens when dependence becomes routine and the damage is spread out enough to look harmless.
The Slow Grind of Outsourcing Thought
The study’s authors cautioned that skills such as fraction arithmetic and reading comprehension may appear delegable to tools, but conceptual mastery of them underpins higher-order capabilities including algebra and critical reasoning. That matters because the system being built around AI does not just automate isolated tasks; it can also hollow out the foundations people need to reason independently.
Their concern centers on how regular outsourcing of effort could dull the motivation and stamina required for long-term learning, with small degradations compounding until they become difficult to reverse. The language of convenience masks a familiar structure: people are encouraged to hand over effort, then blamed when their own capacity weakens.
Grace Liu, a co-author from Carnegie Mellon University’s Machine Learning Department, said the issue is not that AI makes people less intelligent in a blunt sense, but that it can quietly strip away the difficulties that help build durable competence. “The concern is about what cognitive scientists call ‘desirable difficulties’ - the productive struggle that builds skill over time. If AI routinely removes that struggle, people may get the right answer in the moment, but develop less robust independent capability,” she said.
What the Researchers Actually Said
Liu added that the scale and scope of the effect still require investigation. “It’s not about AI making us ‘dumber’ - it’s more subtle than that. But how significant this effect is at scale, and across different contexts, needs more research,” she added. That leaves the public with the usual arrangement: a technology spreading fast, consequences still being measured, and ordinary people expected to absorb the risk while institutions study the fallout.
“It’s not a reason to avoid AI, but it is a reason to design and use these tools carefully,” Liu concluded. The warning lands as a modest plea for caution, but the study itself points to a deeper problem with systems that reward dependence: when effort is stripped away in the name of efficiency, the human capacity to persist, learn, and think without assistance can start to thin out.