AI coding: the new economics of building software
This is part 2 of 5.
Here’s a number most executives have internalised, even if they’ve never seen it stated explicitly: building software costs about 20% of its total lifecycle cost. Maintaining it costs 80%.
This ratio has driven decades of software engineering practice. We invest in clean code, documentation, refactoring, automated testing. We hire people to manage technical debt. We build systems designed to be understood and modified by future developers who haven’t been hired yet. All of this is rational when maintenance dominates the cost structure.
What happens when building gets 50-80% cheaper?
The maintenance inversion
If the cost of building drops dramatically, the logic of heavy maintenance investment starts to erode. Why spend two weeks refactoring a module when you could spend two days replacing it? Why invest in making code readable for future developers when regenerating it from specifications might be faster than understanding it?
This isn’t an argument for deliberately writing bad code. It’s a recognition that the economic basis for certain practices is shifting. The goal moves from maintainable to replaceable.
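The refactor-versus-replace trade-off above can be sketched as a toy cost model. All the figures here are invented for illustration; the point is only the shape of the comparison, not the numbers.

```python
# Hypothetical, illustrative numbers only: compare the cost of maintaining
# a system for several more years against rebuilding it cheaply each time
# requirements shift.

def total_cost_maintain(build_cost: float, annual_maintenance: float, years: int) -> float:
    """Classic model: build once, then pay maintenance every year."""
    return build_cost + annual_maintenance * years

def total_cost_replace(build_cost: float, ai_discount: float, rebuilds: int) -> float:
    """Replacement model: regenerate from specifications each time the
    requirements change, at a fraction of the traditional build cost."""
    cheap_build = build_cost * (1 - ai_discount)
    return cheap_build * rebuilds

maintain = total_cost_maintain(build_cost=100_000, annual_maintenance=25_000, years=5)
replace = total_cost_replace(build_cost=100_000, ai_discount=0.7, rebuilds=3)

print(f"maintain for 5 years: £{maintain:,.0f}")  # £225,000
print(f"rebuild 3 times:      £{replace:,.0f}")   # £90,000
```

On these made-up figures, three full rebuilds cost well under half of one build plus five years of upkeep, which is the inversion the section describes. Change the discount or the rebuild frequency and the answer flips back, so the model is a starting point for a conversation, not a conclusion.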
Consider the suit analogy again. A bespoke suit is constructed to last and support future alterations. The seams have extra material. The lining can be replaced. You can let it out or take it in as your body changes. The fabrics are more expensive and more durable, chosen to justify the investment in construction. This makes sense when the suit costs thousands and you expect to wear it for fifteen years.

An off-the-peg suit isn’t built that way. When it no longer fits, or the style dates, or you catch it on something and tear the fabric, you replace it. The replacement cost is low enough that investing in repair rarely makes sense.
There’s a related criticism: AI-generated code can be messy. Spaghetti code, to use the pejorative. This is sometimes true. The traditional objection to spaghetti code is that it’s hard to understand and harder to maintain. If your model is replacement rather than long-term maintenance, that objection loses most of its force. You’re not going to be reading this code for years to come. An AI agent can follow it well enough to fix bugs, and you’re only going to run it until the requirements change, at which point you replace it.
Not only that, but as of the end of 2025, early research suggests that AI-generated code is no less maintainable than code produced by more traditional means.
The fashion problem
There’s another dimension to this. Even if your expensive suit remains in perfect condition, fashion moves on. The lapel width that looked sharp in 2016 looks very dated in 2026. The fit that was stylish becomes unfashionable. You stop wearing it not because it’s worn out, but because the context changed.
Software has the same problem. Requirements shift. Integrations change. The framework you built on falls out of favour, stops receiving security updates, and new developers don’t want to work with it. The business pivots. You might engineer a system to last fifteen years, but if the market moves in 18 months, that engineering was wasted investment.
When replacement is cheap, building for longevity becomes less valuable. Building for current requirements, with the expectation of replacement when those requirements change, becomes more rational.
The accounting implications
Traditional software sits on the balance sheet as a capital asset, depreciated over five to ten years while you invest continuously in maintenance. This model assumes the software has a long useful life that justifies the initial investment.
Replaceable software looks more like an operating expense. Build it, use it, replace it when needs change. Don’t accumulate legacy burden. Don’t spend years paying down technical debt on a system you might replace anyway.
This is genuinely new. CFOs who’ve been treating software development as CapEx may need to rethink the model. The depreciation schedule of software assets is changing because the underlying economics have changed.
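The difference between the two accounting treatments can be made concrete with a toy sketch. This assumes straight-line depreciation and entirely invented figures; it's an illustration of the shape of the change, not an accounting recommendation.

```python
# Illustrative figures only. Under the CapEx model, the build cost sits on
# the balance sheet and is depreciated straight-line over its useful life,
# with maintenance paid on top every year. Under the replaceable model,
# each cheap rebuild is simply expensed in the year it happens.

def capex_annual_charges(build_cost: float, useful_life_years: int,
                         annual_maintenance: float) -> list[float]:
    """Annual P&L charge: straight-line depreciation plus maintenance."""
    depreciation = build_cost / useful_life_years
    return [depreciation + annual_maintenance] * useful_life_years

def opex_annual_charges(rebuild_cost: float, rebuild_every_years: int,
                        horizon_years: int) -> list[float]:
    """Annual P&L charge: a rebuild expensed whenever it occurs."""
    return [rebuild_cost if year % rebuild_every_years == 0 else 0.0
            for year in range(horizon_years)]

capex = capex_annual_charges(build_cost=500_000, useful_life_years=5,
                             annual_maintenance=80_000)
opex = opex_annual_charges(rebuild_cost=150_000, rebuild_every_years=1,
                           horizon_years=5)

print(f"CapEx model, 5-year total: £{sum(capex):,.0f}")  # £900,000
print(f"OpEx model, 5-year total:  £{sum(opex):,.0f}")   # £750,000
```

Beyond the totals, the profile differs: the CapEx model commits you to a fixed charge for the asset's whole useful life, while the OpEx model lets you simply stop rebuilding the year the need disappears.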
Other shifts
The build versus buy calculation changes. When custom software was expensive, off-the-shelf solutions could win by default unless your requirements were truly unique. When custom software is cheap, the threshold for “worth building” drops. You can have software that fits your exact needs rather than adapting your processes to fit the software.
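That threshold shift can be sketched numerically. Again, every figure here is invented; the comparison pits a recurring subscription plus the cost of bending your processes to fit the product against a one-off custom build whose price drops with AI assistance.

```python
# Invented numbers: a minimal build-vs-buy comparison.

def buy_cost(annual_fee: float, process_adaptation: float, years: int) -> float:
    """Off-the-shelf: subscription fees plus one-off process adaptation."""
    return annual_fee * years + process_adaptation

def build_cost(traditional_cost: float, ai_discount: float) -> float:
    """Custom build: traditional cost reduced by the AI-assisted discount."""
    return traditional_cost * (1 - ai_discount)

years = 3
buy = buy_cost(annual_fee=20_000, process_adaptation=30_000, years=years)
old_build = build_cost(traditional_cost=200_000, ai_discount=0.0)
new_build = build_cost(traditional_cost=200_000, ai_discount=0.7)

print(f"buy for {years} years:   £{buy:,.0f}")
print(f"traditional build: £{old_build:,.0f}")
print(f"AI-assisted build: £{new_build:,.0f}")
```

On these made-up numbers, building used to cost more than double buying; with the discount applied, the custom build undercuts the subscription, and you get software that fits your needs exactly rather than approximately.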
Experimentation becomes routine. Testing a hypothesis with working software used to be expensive enough that you’d do extensive research first. When you can build a functional prototype in days, you can test ideas that would never have justified the old cost of validation.
Duplication concerns diminish. The traditional argument against duplicating functionality is that you’re wasting effort and creating multiple things to maintain. But if the maintenance model is replacement rather than upkeep, and if generation is cheap, the calculus shifts. Sometimes a purpose-built component is simpler than managing shared dependencies.
What this means for you
If you’re responsible for technology investment, the old heuristics may mislead you. The 80/20 split that justified heavy maintenance investment is eroding. The case for building rather than buying is stronger than it was. The argument for long-lived, carefully architected systems is weaker, except perhaps for genuinely critical infrastructure.
This doesn’t mean quality stops mattering. It means quality is relative to expected lifespan. A system you may well replace in a year needs different engineering than one you expect to run for ten.
In the next article, I’ll cover the skills that still matter when AI handles the typing, and why the panic about juniors not learning to code misses the point.
If you’re wondering why you might want to listen to my opinions on this topic, see here.