If you’re leading enterprise transformation, you need to understand what’s happening to your workforce, your tech stack, and your competitors right now. Three waves of enterprise IT have systematically created and destroyed entire job categories. The third wave is accelerating, and it will reshape your organization whether you’re ready or not.

  • IT 1.0 built careers around custom in-house software. IT departments were strategic assets staffed with programmers and system operators who created bespoke systems mapping human processes to software workflows.
  • IT 2.0 outsourced that intelligence to hundreds of SaaS solutions, creating a fragmented modular stack that spawned an entire ecosystem of “glue work” roles – SaaS admins, implementation engineers, system integrators, data analysts reconciling mismatched systems. Anthropologist David Graeber called these “bullshit jobs”: work that exists mainly because systems can’t talk to each other.
  • IT 3.0 is dissolving this glue layer with AI-native systems and agents that can draft, coordinate, and produce outcomes without predefined workflows. The bullshit jobs of IT 2.0 are first on the chopping block. Just as CAD tools erased armies of draftsmen, and UBS's trading automation emptied its stadium-sized Stamford trading floor in 2012, today's AI agents are already hollowing out sales ops, recruiting coordinators, junior devs, and SaaS admins.

Your strategic decisions today determine whether you’re building the next wave of infrastructure or clinging to a dying paradigm. Here’s what you need to know.

From IT 1.0 to IT 3.0

IT 1.0 – Building and Owning the Stack

Before the Internet and cloud computing, enterprises staffed full IT departments and procured from software vendors like IBM, Oracle, and SAP to build and run in-house systems. These were expensive and specialized, but tightly integrated with the business. This software translated existing human processes into workflows defined in bespoke software living on top of a database.

IT departments were core strategic assets, staffed with programmers, system operators, and managers who created bespoke systems. Jobs in this era were directly tied to creating or maintaining foundational business infrastructure. This period gave us foundational software development methodologies like Extreme Programming, a forerunner of Agile born from Chrysler's internal payroll project. The work was complex and expensive, but it was essential.

IT 2.0 – SaaS and the Bullshit Job Boom

The 2000s marked the era of SaaS, with software now delivered over the Internet. Companies shifted from building internally to subscribing. This democratized access to powerful tools like cloud-based CRMs, HR suites, and ERPs, and enabled consumption-based pricing and product-led growth business models. But it also created a new systemic problem: a fragmented, modular stack of hundreds of siloed applications. Data silos and operational silos emerged everywhere, with Excel files pushed via email to glue everything together.

This fragmentation gave birth to a massive ecosystem of what David Graeber termed “bullshit jobs” – roles that exist primarily to service the friction between systems. These aren’t jobs that create direct value; they exist to pay the “information organization tax.” Duct tapers patch half-working systems together with shoddy code or send Excel files via email. Box tickers send Excel sheets to half a dozen folks for check-off, creating the appearance that something productive is being done when it is not.

This boom in “glue work” included:

  • System Administrators: Entire careers built around configuring, managing, and patching platforms like Salesforce or Workday.
  • System Integrators: Specialists whose job was to connect one SaaS tool to another and migrate data between them.
  • Data Entry & Junior Analysts: Armies of people hired to manually move data from spreadsheets and PDFs into rigid SaaS formats.
  • Operations Roles (Sales Ops, HR Ops, Rev Ops): Professionals who spend their days coordinating approvals, managing handoffs, and bridging gaps that APIs and integrations never quite solved.

Graeber’s notion of bullshit jobs became reality: people spending careers moving data from one rectangle on their monitor to another. This wasn’t work in the economic sense of creating value – it was a side effect of SaaS modularity and weak interoperability.

The information organization tax was massive. Meetings, cross-departmental handoffs, redundant reporting – all to coordinate intelligence scattered across silos.

This era also saw the rise of IT Consulting and Outsourcing, with inherent misaligned incentives. Shoddy software was developed to maximize overall contract value, including system integration, data migration, and ongoing maintenance contract renewals. The now-hollowed-out IT departments were no longer technical enough to ensure quality of work. Boeing’s outsourcing of its software design through layers of sub-contracting was the biggest showcase of this failure.

This entire industry — admins, ops, consultants — was built on a foundation of systems that couldn’t talk to each other. AI is now removing that foundation.

IT 3.0 – AI Dissolves the Glue

The opportunity: Organizations that transform first will operate with half the headcount and twice the velocity of their competitors.

The AI-native wave is fundamentally different. Instead of creating more silos, AI agents and copilots are dissolving the “glue” that holds the fragmented IT 2.0 stack together. These systems can draft workflows, translate data between formats, and execute complex processes across multiple tools without human intervention or mediation.

Just as CAD tools turned hundreds of draftsmen into a handful of designers with software, or as UBS shuttered its massive Stamford, Connecticut trading floor in 2012 after algorithmic trading made human traders redundant, AI is now dismantling the SaaS-created glue layer. All the workflow definitions and playbooks are becoming obsolete and will be codified within the model and tools.

UBS Stamford

IT 3.0 software now requires environments and infrastructure for agents to operate in, instead of translating twentieth-century human processes into workflows: the era of agent-computer interfaces.

These roles are being eliminated now, not in some distant future:

  • SaaS admins → AI copilots auto-generate workflows, reports, and integrations. If you’re still hiring Salesforce admins, you’re building the wrong team.
  • Recruiting coordinators → Chatbots already schedule interviews and screen resumes. This role has maybe 18 months left.
  • Entry-level developers → Code assistants handle glue code and CRUD apps. Your hiring funnel should reflect this.
  • Sales ops & BDRs → AI personalizes outreach and processes leads at volume. Manual outreach doesn’t scale anymore.
  • Finance & HR ops → AI reconciles invoices, updates HR records, and generates compliance docs. Every manual handoff is a liability.

The changes are accelerating because the barriers to adoption are collapsing. IT 1.0 required massive capital expenditure and months of implementation. IT 2.0 reduced cost but still required lengthy training and change management. IT 3.0 collapses time to value: it takes seconds to issue commands to ChatGPT, instead of months training super users and administrators on PeopleSoft or Epic for healthcare. When adoption barriers disappear, displacement accelerates.

Which IT 2.0 Tools Are Most at Risk?

AI will hit hardest where SaaS tools created clerical overhead:

  • CRM (Salesforce, HubSpot) – lead enrichment, pipeline updates, report generation: all ripe for AI automation.
  • ATS & HR platforms (Workday, Greenhouse) – resume parsing, candidate scheduling, payroll entry: trivial for AI.
  • Customer support platforms (Zendesk, ServiceNow) – tier-1 support is already being offloaded to LLM agents.
  • Project management (Asana, Jira, Monday) – task creation, updates, and cross-tool syncing will be handled by AI copilots, reducing the need for ops roles.
  • Finance/ERP (NetSuite, SAP) – invoice matching, expense categorization, forecasting: automatable.

The SaaS platforms may survive, but the job ecosystems around them will not.

AI Transformations

One of the biggest challenges in AI transformation is how we measure it. Poorly defined metrics will incentivize the wrong behavior.

Take engineering as an example:

  • Lines of code written by AI
  • Weekly active users on Cursor
  • Percentage of PRs reviewed by AI

These metrics measure only adoption and utilization of AI tools, and they incentivize activity. They measure activities and outputs, not impacts and outcomes. People are equipped with AI tools but operate the same way they did before ChatGPT. Cargo-cult AI transformation.

What actually matters and should be tracked are:

  • Product lead time: from PM idea inception to production.
  • Ticket resolution time: time to resolve requests and support tickets.
  • Change fail percentage: how often deployments blow up and require hotfixes.

These aren’t novel ideas; they’re adapted from DORA metrics. But implementing them requires serious platform investment in analytics and observability.
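These outcome metrics can be computed directly from pipeline events. Below is a minimal sketch in Python, assuming a hypothetical event log; the `idea_at`, `prod_at`, and `failed` field names are illustrative, not from any specific tracker:

```python
from datetime import datetime
from statistics import median

# Hypothetical event records. In practice these would come from your
# issue tracker and deployment pipeline; the schema here is made up.
work_items = [
    {"id": "FEAT-1", "idea_at": "2025-01-02", "prod_at": "2025-01-20"},
    {"id": "FEAT-2", "idea_at": "2025-01-05", "prod_at": "2025-01-12"},
]
deployments = [
    {"id": "d1", "failed": False},
    {"id": "d2", "failed": True},   # required a hotfix or rollback
    {"id": "d3", "failed": False},
    {"id": "d4", "failed": False},
]

def lead_time_days(item: dict) -> int:
    """Days from idea inception to production for one work item."""
    start = datetime.fromisoformat(item["idea_at"])
    end = datetime.fromisoformat(item["prod_at"])
    return (end - start).days

def median_lead_time(items: list) -> float:
    """Median, not mean: one stuck epic shouldn't hide overall flow."""
    return median(lead_time_days(i) for i in items)

def change_fail_pct(deploys: list) -> float:
    """Share of deployments that blew up and required remediation."""
    failed = sum(1 for d in deploys if d["failed"])
    return 100.0 * failed / len(deploys)

print(median_lead_time(work_items))   # median over 18- and 7-day items
print(change_fail_pct(deployments))   # 1 failed out of 4 deployments
```

The point is that each number is derived from events the pipeline already emits, which is exactly why the platform investment in analytics and observability comes first.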

And if we’re being realistic, most engineering hours aren’t spent building features. They’re occupied by operational overhead and enterprise architecture complexity, with interdependencies between services, org silos, and coordination tax. The microservices dream turned into a distributed monolith nightmare.

Microservices meme

This problem isn’t unique to engineering. Every function needs to figure out what velocity means for it, and the answers are completely different. Legal measures contracts processed per month, not contracts under review: velocity over volume of work in progress. Marketing tracks campaign velocity, concept to launch: how fast can you test and iterate? Sales optimizes deal cycle time, qualified lead to close, not pipeline size. Finance cares about close cycle time and days to financial insight. Same transformation, completely different metrics.

This is why centralizing AI transformation is very hard to do well. Many companies are setting up Chief AI Officers and AI Enablement Engineering teams to “manage” the IT 3.0 shift. This creates exactly the wrong dynamic: one overwhelmed function while everyone else waits for direction and navigates bureaucracy. You end up with coordination overhead on top of your existing coordination overhead.

For companies actually making the IT 3.0 transition work, every executive owns AI transformation for their domain. Legal, finance, marketing, sales, engineering. Each has dedicated teams and executive accountability to transform their function from within. This is a business transformation, not IT transformation. The Chief AI Officer org chart is IT 2.0 thinking applied to an IT 3.0 problem.

Measuring AI Transformation with the 4-Sets Framework

To measure AI transformation, we can borrow the 4-Sets framework from the early days of Big Data: mindset, toolset, skillset, and dataset. Each lens reveals a different dimension of where the organization actually stands.

Mindset: Getting Comfortable with Probabilistic Systems

AI is fundamentally different from deterministic software. Same input, different outputs. “Correct” is contextual, not binary. This breaks every QA process and approval workflow designed for deterministic systems. When AI works 95% of the time, quality control becomes exponentially harder. Most organizations can’t get comfortable with “95% accurate” instead of “100% correct.”
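The shift from deterministic to probabilistic QA can be made concrete. A minimal sketch, assuming a hypothetical `classify` stub that is right about 95% of the time: instead of asserting an exact output for every case, you measure accuracy over a labeled sample and gate on a threshold:

```python
import random

random.seed(0)  # seeded only so this sketch is reproducible

def classify(ticket: str) -> str:
    """Hypothetical AI call: nondeterministic, right ~95% of the time.
    Stands in for any model endpoint; not a real API."""
    return "billing" if random.random() < 0.95 else "other"

def evaluate(model, labeled_examples, threshold=0.90):
    """Deterministic QA asserts output == expected for every case.
    Probabilistic QA measures accuracy over a sample and gates on a
    threshold, accepting that 100% correctness is off the table."""
    correct = sum(
        1 for text, expected in labeled_examples
        if model(text) == expected
    )
    accuracy = correct / len(labeled_examples)
    return accuracy, accuracy >= threshold

examples = [("My invoice is wrong", "billing")] * 200
accuracy, passed = evaluate(classify, examples)
print(f"accuracy={accuracy:.2%} passed={passed}")
```

Run it twice without the seed and the accuracy changes; that variance, not any single failure, is what your QA and approval workflows have to be redesigned around.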

The hardest concept for product teams to grasp is that AI makes shiny demos trivially easy and production deployment brutally hard. Compelling demos get built in days. Production at scale, with acceptable error rates, latency, and cost, takes months. The gap between “it works in the demo” and “it works at 10 requests per second” is where most AI projects die. OpenAI built its Agent Builder in six weeks with primitive user experiences. This made investors realize that n8n is the actual category leader, allowing it to raise $180M at a $2.5B valuation.

This requires a cultural shift most companies aren’t prepared for. You have to allow failures to happen. Innovation requires experimentation; experimentation requires accepting failure. If your culture punishes failed AI experiments the same way it punishes product failures or production outages, nobody will innovate. You will get theater instead: cargo-cult AI adoption where engineers use Cursor exactly like they use VSCode. Weekly active users go up, productivity stays flat. All form, no function. The shift from a “zero defects” culture to a “fast iteration” culture is the hardest change to make, and it’s the one that determines whether transformation succeeds or becomes expensive theater.

Toolset: Give Your People AI, Then Get Out of the Way

The toolset question isn’t “what should we build?” It’s “what do we give employees so they can experiment?” Enable the early adopters. Most organizations approach this backwards: they lock down AI access while forming committees to “evaluate use cases.” By the time the committee finishes, your competitors have six months of experimentation learning ahead of you. GPT-3.5 has become GPT-4, and you just wasted a full generation of AI progress. Look at the adoption curve and it’s not hard to see that one day in AI is seven days in software. The progress of AI research and engineering is the hyperbolic time chamber from Dragon Ball.

One day in AI is seven days in software

Start with access. Enterprise licenses for Claude, ChatGPT, coding assistants like Cursor or Windsurf. The ROI comes from letting your team discover what works instead of trying to predict it from a conference room.

Remove approval bureaucracy for internal tools. If an engineer wants to try auto-generating test cases, or marketing wants to experiment with campaign copy variations, they shouldn’t need VP sign-off. Create guardrails — what data can’t leave the organization, what decisions need human review — then let teams iterate within those boundaries. The organizations winning at this have clear rules and fast iteration, not slow approvals and perfect safety.

The infrastructure and platform shift is fundamental. You need environments where AI can actually operate — API access to internal systems, data pipelines that AI can query, workflows that can be triggered programmatically. If your systems only work through web UIs and manual clicking, AI can’t help much. This doesn’t mean rebuilding everything overnight, but it does mean every new system should be designed for programmatic access first, human interfaces second. Build for agent compute interfaces.
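What “programmatic access first” looks like in practice: the sketch below exposes one internal operation as a typed function plus a machine-readable tool spec, in the JSON-Schema style most agent frameworks use for tool calling. All names here (`update_opportunity_stage`, the stage values) are hypothetical, not from any real CRM:

```python
import json
from typing import Callable

def update_opportunity_stage(opportunity_id: str, stage: str) -> dict:
    """A hypothetical internal operation, exposed as a typed function
    rather than buried behind a web form. In reality this would call
    your CRM's API; here it just echoes the change."""
    return {"id": opportunity_id, "stage": stage, "status": "updated"}

# Machine-readable description of the tool, in the JSON-Schema shape
# common to agent/function-calling frameworks.
TOOL_SPEC = {
    "name": "update_opportunity_stage",
    "description": "Move a sales opportunity to a new pipeline stage.",
    "parameters": {
        "type": "object",
        "properties": {
            "opportunity_id": {"type": "string"},
            "stage": {"type": "string",
                      "enum": ["qualified", "proposal", "closed_won"]},
        },
        "required": ["opportunity_id", "stage"],
    },
}

REGISTRY: dict = {"update_opportunity_stage": update_opportunity_stage}

def dispatch(tool_call_json: str) -> dict:
    """What an agent runtime does: parse the model's tool call and
    invoke the matching function programmatically, with no UI clicks."""
    call = json.loads(tool_call_json)
    return REGISTRY[call["name"]](**call["arguments"])

result = dispatch('{"name": "update_opportunity_stage", '
                  '"arguments": {"opportunity_id": "OPP-42", "stage": "proposal"}}')
print(result)
```

A system designed this way serves humans and agents from the same surface; a system that only works through web UIs offers agents nothing to call.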

Build safe sandboxes. The real blocker for AI is that nobody can access production data to try things. Create environments with representative data where people can experiment without going through 20 levels of approval or risking compliance violations. Sanitized customer data, recent transaction samples, realistic test cases. Make it real enough to be useful, safe enough to be accessible.
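One way to build such a sandbox, sketched under the assumption of simple tabular records (the field names are hypothetical): pseudonymize identifying fields with a stable hash so joins and aggregates still work, and keep the analytic fields intact:

```python
import hashlib

# Hypothetical production records; field names are illustrative.
production_rows = [
    {"customer": "Alice Smith", "email": "alice@corp.example", "amount": 129.50},
    {"customer": "Bob Jones", "email": "bob@corp.example", "amount": 89.00},
]

def pseudonymize(value: str, salt: str = "sandbox-v1") -> str:
    """Stable, irreversible token: the same input always maps to the
    same token, so joins and group-bys still work in the sandbox."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()[:10]
    return f"anon_{digest}"

def sanitize(row: dict) -> dict:
    """Mask identifying fields, keep analytic ones untouched."""
    return {
        "customer": pseudonymize(row["customer"]),
        "email": pseudonymize(row["email"]),
        "amount": row["amount"],  # realistic values, no identity attached
    }

sandbox_rows = [sanitize(r) for r in production_rows]
for r in sandbox_rows:
    print(r)
```

Real pipelines also need referential sanitization across tables and review of free-text fields, but even this crude version turns “nobody can touch production data” into “anyone can experiment on a realistic copy.”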

Skillset: Developing AI Literacy in Your Existing Workforce

The skillset question is challenging. It’s not who should we hire, but how do we develop our existing people. There’s institutional knowledge and domain expertise sitting in the current workforce. Replacing them with AI-native hires means throwing away context that took years to build.

Start with AI literacy training, but make it practical. Nobody needs another 30-minute video on “what is generative AI.” They need hands-on practice using AI tools for their actual work. Give your legal team access to contract analysis tools and let them discover what works. Let your sales ops team experiment with lead scoring. Let engineers try code generation on real tickets. Learning happens through doing, not watching presentations.

The ratio of architects to operators is inverting. You need more people who can define what should be automated and govern how it operates, fewer people executing repetitive workflows. Some of your SaaS admins can become platform engineers if you invest in their development. Some of your operations coordinators can become exception handlers and strategic decision-makers. But this requires intentional retraining and upskilling, not just telling people to “learn AI.”

DevOps taught us the “shift left” movement, pushing responsibility to development teams instead of operations teams. IT 3.0 accelerates this dramatically. You need platform engineers building infrastructure that enables AI to operate, not program managers coordinating handoffs between silos.

Shifting responsibility left for AI agents

Retrain when someone has deep domain knowledge but outdated execution skills. For example, retrain that SaaS admin who knows every corner case in your business rules. Restructure when the role itself is pure coordination overhead with no domain expertise, e.g., data entry coordinators, implementation engineers patching systems together, junior developers writing glue code.

The roles that survive require one of three things: deep technical expertise (machine learning engineers, platform engineers, infrastructure architects), deep context and judgment (exception handlers, strategic decision-makers), or genuine human connection (relationship building, complex negotiation, empathy-driven work). Everything else is getting automated, and your workforce development strategy needs to account for this reality instead of pretending you can train your way around it.

Dataset: The Enterprise Data Reality Nobody Wants to Talk About

Here’s the uncomfortable truth: most enterprises have tons of data, and almost none of it is usable for AI reasoning. SaaS tools work in silos: Salesforce can run without ever talking to Workday because humans bridge the gaps. AI can’t. Reasoning engines need comprehensive cross-functional context to make decisions, and your data is scattered across dozens of systems with inconsistent schemas, undocumented business logic, and quality issues nobody has prioritized fixing because “it works fine for reporting.”

The gap between “we have the data” and “AI can reason with our data” is measured in quarters or years. You need historical decision rationale, not just transaction logs. Relationship graphs between entities, not just foreign keys. Temporal context showing why things changed over time. Cross-functional workflows documenting how sales, legal, and finance actually interact, not the idealized process in the wiki nobody updates.
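What closing that gap looks like at its simplest: merging siloed per-system views into one entity-keyed context document an AI system could actually reason over. The system names and fields below are hypothetical, and real pipelines need proper entity resolution (fuzzy matching, ID maps) rather than a shared-name join:

```python
from collections import defaultdict

# Hypothetical exports from siloed systems, keyed differently and with
# inconsistent schemas: the IT 2.0 reality described above.
crm_records = [
    {"account": "Acme Corp", "stage": "proposal", "owner": "j.doe"},
]
finance_records = [
    {"acct_name": "Acme Corp", "open_invoices": 2, "balance": 12000},
]
support_tickets = [
    {"company": "Acme Corp", "ticket": "T-88", "severity": "high"},
]

def build_entity_context() -> dict:
    """Merge per-system views into one entity-keyed context document.
    The join key here is a shared name purely for simplicity."""
    context = defaultdict(dict)
    for r in crm_records:
        context[r["account"]]["sales"] = {"stage": r["stage"], "owner": r["owner"]}
    for r in finance_records:
        context[r["acct_name"]]["finance"] = {
            "open_invoices": r["open_invoices"], "balance": r["balance"]}
    for r in support_tickets:
        context[r["company"]].setdefault("support", []).append(
            {"ticket": r["ticket"], "severity": r["severity"]})
    return dict(context)

ctx = build_entity_context()
print(ctx["Acme Corp"])
```

Even this toy version shows why the work takes quarters: every system contributes its own key, its own schema, and its own undocumented quirks before a single entity view exists.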

This is unglamorous infrastructure work that doesn’t demo well but blocks everything else. Data quality becomes infrastructure, not a nice-to-have. Someone needs to own making datasets usable, not just available. Start with safe sandboxes where teams can experiment with representative production data without 20 levels of approval. Prove value with sanitized data, then earn access to more sensitive datasets through results, not presentations.

Build infrastructure that enables experimentation without exposure. Clear governance and guardrails on what data can’t leave the organization and what decisions need human review, then let teams move fast within those boundaries. Companies that solve this first will have compounding advantages as their AI systems get smarter from accumulated context while competitors are still filling out data access request forms.

The Strategic Warning: Don’t Create AI Slop Janitors

Rushing to implement AI without redesigning workflows creates worse jobs than the ones you’re eliminating.

While AI eliminates IT 2.0’s glue work, poorly implemented AI is creating its own category of bullshit jobs: AI Slop Janitors. This happens when organizations bolt AI onto existing processes instead of rebuilding from first principles.

Look at the content industry: writers who once led creative teams now edit ChatGPT’s robotic prose for 1-5 cents per word (versus 10+ cents for original writing). They fix the same formulaic mistakes daily – removing “delve” and “nevertheless,” fact-checking hallucinations, making text sound less awkward. The absurdity peaks when freelance platforms use AI detectors while simultaneously hiring people to make AI text undetectable. Human workers are being brought in to fix what AI gets wrong.

This pattern is emerging across industries.

These aren’t valuable human-in-the-loop systems. They are temporary workers used to clean up AI’s mess, and once AI learns from those corrections, the jobs will disappear. Organizations creating these roles are wasting capital on the wrong side of the transition. If your AI implementation plan includes hiring “AI quality reviewers” or “AI content editors,” you’re implementing AI wrong.

Conclusion

The transition from IT 2.0 to IT 3.0 is messy and accelerating. The glue work jobs are disappearing whether you’re ready or not. But the replacements aren’t automatically better: poorly implemented AI creates worse bullshit jobs than the ones being eliminated. AI Slop Janitors stare into the abyss, fixing the same robotic mistakes in the same vibe-coded apps and workflows.

Organizations that move decisively will operate with half the headcount and twice the velocity of their competitors. Those that don’t will find themselves either:

  1. Carrying dead weight in IT 2.0 roles while competitors move faster
  2. Creating AI Slop Janitor positions because they bolted AI onto broken processes
  3. Disrupted entirely by AI-native competitors who rebuilt from first principles

The opportunity is real, but narrow. As AI eliminates the information organization tax, capital and talent can shift to work that genuinely requires human judgment, deep context, and strategic thinking. But this shift won’t happen organically – it requires deliberate choices about team structure, workflow redesign, and where to compete.

Your organizational priorities:

  • Assess org maturity using the 4-Sets transformation framework: Mindset, Toolset, Skillset, Dataset
  • Redesign workflows from first principles for AI, not bolting AI onto existing processes
  • Distribute AI ownership across functions – every team owns their domain’s transformation
  • Invest in platform engineering and AI infrastructure, not more ops coordinators
  • Build safe sandbox environments for experimentation without approval bureaucracy

The bullshit jobs aren’t disappearing – they’re being replaced by different bullshit jobs. The question is whether you’re building the infrastructure that eliminates them, or whether you’re creating the next generation of make-work. Choose fast, because your competitors already are.

References


@article{
    leehanchung_bullshit_jobs,
    author = {Lee, Hanchung},
    title = {The End of "Bullshit Jobs": From IT 1.0 to the AI-Powered 3.0 Era},
    year = {2025},
    month = {09},
    day = {19},
    howpublished = {\url{https://leehanchung.github.io}},
    url = {https://leehanchung.github.io/blogs/2025/09/19/bullshit-jobs/}
}