Harvard Business Review published a piece this week that stopped me mid-scroll. The title: “AI Doesn’t Reduce Work—It Intensifies It.”
The research, led by Aruna Ranganathan and Xingqi Maggie Ye at UC Berkeley, tracked 200 employees at a U.S. tech company over eight months. The promise was familiar—AI handles the routine stuff so you can focus on higher-value work. But that’s not what happened. Workers ended up doing more. Faster. Across more hours of the day. Often without anyone asking them to.
The kicker? The company didn’t even mandate AI use. People adopted it on their own because it made “doing more” feel possible, accessible, and rewarding. And that’s exactly what makes it hard to see the problem coming.
I read the whole thing nodding. Because I’ve been watching this exact pattern play out in the schools I work with.
Three Things the Study Found
The researchers identified three ways AI intensifies work rather than lightening it.
First, task expansion. AI made it easy for people to take on work outside their usual role. Product managers started writing code. Researchers picked up engineering tasks. The boundaries of “my job” got blurry fast. Empowering, sure. But also a recipe for overload. And there were knock-on effects—engineers ended up spending more time reviewing and correcting AI-assisted work from colleagues who were “vibe-coding,” adding to already heavy workloads.
Second, blurred boundaries. Because prompting an AI feels casual—almost like texting—people slipped work into lunch breaks, evenings, the five minutes before leaving their desk. Some described sending a “quick last prompt” before stepping away so the AI could work while they were gone. The natural pauses in a workday started disappearing. Several workers only realized in hindsight that their downtime no longer felt like actual recovery.
Third, multitasking overload. Workers were running AI processes in the background while doing manual work at the same time. Multiple threads. Constant checking. A feeling of always juggling, even when it felt productive. Over time, this raised everyone’s expectations for speed—not through explicit demands, but through what became visible and normalized.
One engineer in the study put it plainly: “You had thought that maybe, oh, because you could be more productive with AI, then you save some time, you can work less. But then really, you don’t work less. You just work the same amount or even more.”
Now Think About Teachers
Read those three findings again and tell me they don’t describe what’s happening in schools right now. What I hear from educators consistently mirrors this research almost exactly.
Task expansion. Teachers who start using AI don’t just use it for the task they intended. They start generating differentiated materials for five reading levels instead of two. They build custom assessments. They create parent communication templates. They redesign lesson plans. Each thing AI makes possible becomes another thing they feel they should be doing.
Blurred boundaries. AI makes it easier to work at 9pm. So teachers do. They’re drafting tomorrow’s worksheet while watching TV. They’re tweaking a rubric on their phone during dinner. The barrier to doing “just one more thing” drops to almost zero. In our training sessions, teachers regularly admit that AI hasn’t given them their evenings back—it’s just changed what they’re doing during them.
Multitasking overload. A teacher has Gemini generating a reading passage in one tab, a quiz in another, and they’re manually reviewing student work at the same time. It feels efficient. It’s also exhausting. And the checking—making sure AI output is accurate, appropriate, aligned to standards—adds a layer of cognitive work that didn’t exist before.
The Assessment Trap
This past weekend I was facilitating an AI workshop at a school, and a teacher I was speaking with crystallized this perfectly. She’s already using AI extensively—a genuine early adopter. We were talking about AI in assessment, specifically how educators are using AI to build more robust assessments that also deter student misuse of AI.
She told me something that stuck with me: even though she’s using AI to help her review and build those assessments, the whole process still feels like a larger, more intensive time commitment than before. Because the assessments themselves have gotten more complex. More thoughtful. Harder to game. Which is a good thing—but the time savings she expected from AI got absorbed right back into the higher standard it helped her reach.
That’s the self-reinforcing cycle the HBR researchers describe. AI accelerates a task, which raises the bar for what “good” looks like, which creates more work to meet that new bar, which makes you more reliant on AI to keep up. The work doesn’t shrink. It just gets denser.
The Part No One Talks About
Here’s what concerns me most. In the corporate world, when AI intensifies work, there are at least structures in place—managers, HR, productivity reviews—that can theoretically catch it and course-correct. The HBR authors recommend organizations build what they call an “AI practice”: intentional norms and routines that structure how AI is used, when it’s appropriate to stop, and how work should and shouldn’t expand in response to new capability.
Schools don’t have that. Not yet.
Most teachers using AI right now are figuring it out on their own. There’s no “AI practice” framework at their school. No one is tracking whether the tool is saving time or just filling it with more tasks. No one is asking whether their work-life balance has improved or eroded.
And teachers were already burning out before AI showed up.
That’s a big part of what we’re building at UnconstrainED—helping schools develop exactly these kinds of structures before the burnout sets in, not after.
What Schools Can Do
The researchers propose three practices for organizations adopting AI. In our work with schools, we’re already seeing versions of these take shape—and they work.
Intentional pauses. Before acting on what AI generates, take a moment to ask: Does this actually align with what my students need? Or am I doing this because AI made it easy? In the corporate study, the researchers recommend requiring a counterargument before finalizing a major decision. In schools, that might look like a teacher asking themselves, “Would I have done this without AI? And if not, is it actually worth doing?” We build this kind of reflective practice into our training because the habit of pausing is just as important as the habit of prompting.
Sequencing. Instead of reacting to every AI output the moment it appears, batch the work. Set specific times for AI-assisted planning. Protect blocks of time for teaching, thinking, and being with students without the pull of another tab generating another resource. The researchers found that pacing work in coherent phases—rather than constant background activity—reduced fragmentation and preserved attention. In the schools we work with, we help teams identify where AI fits into their existing workflow rhythms rather than letting it create new ones.
Human grounding. This one matters most in schools. AI enables more solo, self-contained work. That’s fine for some tasks. But teaching is fundamentally relational. The HBR authors point out that AI provides a single, synthesized perspective, while creative insight depends on exposure to multiple human viewpoints. Teachers need time to talk to each other. To reflect together. To stay anchored in the messy, human reality of their classrooms instead of disappearing into an AI-assisted workflow. Every training we run at UnconstrainED is built around this—educators working through AI together, not alone at their desks.
The Opportunity
The HBR piece ends with a line that I keep thinking about: “The question facing organizations is not whether AI will change work, but whether they will actively shape that change—or let it quietly shape them.”
That’s the question facing every school right now.
AI could give teachers more time to actually sit with a struggling student. More time to plan a lesson they’re excited about. More time to think carefully about what a child needs instead of drowning in admin.
But only if we’re intentional about it. Only if schools build the structures and conversations around AI use that prevent it from becoming another engine of burnout.
The research is clear: left unchecked, AI doesn’t give people their time back. It takes more of it. That’s true in tech companies, and it’s true in schools.
At UnconstrainED, this is the work we do every day. We help schools build intentional AI practices—not just teaching educators how to use the tools, but helping leadership teams think through the policies, rhythms, and structures that make AI sustainable for their people. If your school is adopting AI and you’re starting to feel the intensity creep in, let’s talk.
Reach out at alex@unconstrained.work.
— Alex