You can automate tasks, but you can’t automate your way out of a broken capacity model.
I saw this in corporate America: pressure to use AI, automate more, build efficiencies, and free up time for more project work, as if time were just sitting there waiting to be found by the person closest to the problem. I saw versions of the same dynamic while consulting for much smaller companies too. Different environments, same issue. Leaders were confused about why certain work was taking so long, wanted it to move faster, and did not want to pay for more hours. But once I drilled into the work itself, the answer usually was not that the team was doing something wrong. It was that they needed more hours, more headcount, or that some of what leaders wanted to happen faster was already about as efficient as it could be.
This is not just a corporate problem. It is a leadership problem. Too often, leaders push operational employees to “find efficiencies,” “automate more,” or “use AI better” while ignoring the actual dependencies involved: cross-functional coordination, systems knowledge, integration work, technical resourcing, change management, and the fact that the original work still has to get done.
When it comes to AI, it is one thing to encourage thoughtful experimentation. It is another thing entirely to act like operational employees should be able to prompt their way out of structural complexity, technical dependencies, and chronic under-resourcing.
AI Does Not Magically Remove Dependencies
A lot of operational work is shaped by organizational reality.
Maybe a task only takes an hour a week. Fine. That does not mean it is simple to automate.
Automating it may still require:
- involving a systems team
- coordinating with an applications team
- confirming process ownership
- understanding an integration
- submitting a request into a queue
- waiting for scarce technical resources
- testing the output
- reviewing edge cases
- documenting the new process
That kind of cross-functional change work requires leadership. It requires prioritization, resourcing, coordination, tradeoffs, and reality-based decision-making.
What it does not require is a manager vaguely gesturing at AI and acting confused about why a process still takes time.
The People Closest to the Work Are Being Set Up to Fail
There is a real gap between non-technical operational work and the technical work required to create automation.
AI fluency is not the same thing as being able to implement systems change across teams.
Someone can identify a problem, describe a process, and use AI to think through possible solutions. But if the actual fix involves backend integrations, application-specific expertise, or technical configuration owned by another team, then the work is no longer “just figure it out.”
It becomes dependent on whether other teams have the time, knowledge, and bandwidth to help. Usually, they do not.
The operational employee ends up in an impossible position. They are being told to create efficiency, but they do not control the conditions required to create it. They are being judged on work they cannot fully redesign. And they are often expected to continue doing the manual version of the process while also somehow inventing the automated future version of it.
Operations Gets This Pressure Differently
This pressure also lands differently on the operational side of a business than it does on the product or customer-facing side.
Operations is often treated as support work rather than strategic work. It is more likely to be underinvested in. It is more likely to be expected to absorb complexity quietly. And that lack of investment compounds the issue.
So the part of the business with the least spare capacity and the least technical leverage gets told to become more efficient, automate more, and somehow use AI better.
Meanwhile, operational work is often exactly where the mess lives: handoffs, exceptions, reporting requirements, system limitations, and cross-functional coordination.
That makes it especially frustrating when leaders treat operations like it should be able to magically self-optimize without more investment.
Managers Are Also Ignoring How Much Work Has Changed
There is another version of this that is especially demoralizing:
“I don’t understand why this takes you so long. It never used to take me this long.”
That may be true.
But was the company half the size back then? Were there fewer systems? Fewer stakeholders? Less scale? Fewer expectations layered on top of the same process?
A lot of leaders are comparing today’s work to a much simpler version of the company and then treating the employee as the problem.
Growth creates complexity. Scale creates drag. More teams, more tools, and more dependencies create more friction. Pretending otherwise does not make a leader efficient. It makes them disconnected from reality.
And this is exactly why curiosity matters.
Good Leadership Requires Curiosity
Curiosity is a hallmark of good leadership. A leader who is not curious will look at a process from the outside, assume it should be simpler than it is, and then put pressure on employees to “fix it with AI.”
A curious leader takes the time to understand how the work actually happens.
They ask:
- What is making this take time?
- What changed as the company grew?
- Which parts are repetitive?
- Which parts require judgment or coordination?
- Which parts depend on systems or teams this person does not control?
- Is AI actually useful here, or is it being used as a placeholder for a bigger operational problem?
This is the call to leaders right now: take the time to understand the work before prescribing solutions.
Sometimes the Answer Is Not AI. It Is Staffing.
This is the part leaders often do not want to hear.
Not every time-consuming task is an efficiency problem or an automation opportunity.
Sometimes the answer is that the work simply takes the time it takes.
Sometimes the answer is that the company has grown and the role has not been resourced accordingly, even if leadership would rather call that an efficiency problem than a staffing one.
Sometimes the answer is more people.
I saw this in corporate. I’ve seen it in e-commerce businesses. Very different settings, same basic reality: leadership wanted the work to take less time, but did not want to fully face what the work actually required.
What Leaders Should Do Instead
If leaders actually want teams to work more effectively with AI, they need to do more than issue vague pressure and call it innovation.
They need to:
1. Understand the real workflow before demanding efficiency
Take the time to understand where the time actually goes, what the dependencies are, and what is truly automatable. Sometimes all it takes is tracing one workflow from start to finish to see how many other teams, systems, and constraints are involved.
2. Treat automation as cross-functional work, not personal homework
If a process depends on multiple systems or teams, this is not something an operational employee should be expected to solve alone. It is organizational work, and it should be treated that way.
3. Advocate upward instead of pushing fantasy downward
Leaders need to be able to say: this is more complex than it looks, this requires technical partnership, this is not a good automation candidate right now, or this team simply needs more resourcing.
This Is Not Transformation. It Is Burden Shifting.
Too many organizations are taking underinvestment, poor systems design, staffing constraints, and operational complexity and handing them back to employees under new labels:
- Efficiency.
- Innovation.
- AI adoption.
- Future readiness.
But if your AI strategy depends on operational employees somehow creating time, technical solutions, and cross-functional coordination out of thin air, that is not a workforce strategy.
It is a responsibility problem.
And leaders who keep telling their teams to “fix it with AI” or “build efficiencies” without taking the time to understand the work are not leading change.
They are just finding a newer, shinier way to avoid doing their part. It is burden shifting with trendier language.
Want a practical way to assess whether your team is being pushed to “fix it with AI” without real support?
Download the free AI Pressure Checklist, and check out more articles below.