I’m going to skip the feature matrix. If you want a list of which tools support which file types or which IDE integrations are available, that information is readily available and changes frequently. What’s harder to find is a straight answer to the question that technology leaders actually need to make: which of these tools should we standardise on, and why?
The question I hear most often from founders and CEOs at growth-stage companies is some version of: “We need a CTO — should we hire one, or is fractional the right call?”
The honest answer is that the question is usually framed wrong. The decision isn’t about finding a cheaper option. It’s about what the company actually needs from the role.
By week ten, the team lead was no longer reviewing code line by line. She was reviewing intent — reading AGENTS.md files, checking that the scaffolding was right, and validating that the output matched the specification. Her engineers had stopped thinking of themselves as code writers. They had become environment designers.
That shift — in role, in mindset, and in daily workflow — happened across a 20-person team in ten weeks. Not through a mandated tooling rollout, but through a structured methodology that changed what engineers were optimising for.
The engineering team was shipping good software. But each deployment cycle involved a CI pipeline that took between 45 and 70 minutes to complete. Add the time for human PR review — typically two to four hours in elapsed time, even when the actual review took 20 minutes — and the cycle from merged PR to deployed code was often half a day.
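The back-of-envelope arithmetic behind that “half a day” figure can be sketched from the ranges quoted above (the numbers are from the paragraph; the calculation itself is only illustrative):

```python
# Rough cycle-time arithmetic for the ranges quoted above.
ci_minutes = (45, 70)              # CI pipeline duration, best and worst case
review_minutes = (2 * 60, 4 * 60)  # elapsed PR review time (two to four hours)

low = ci_minutes[0] + review_minutes[0]   # best case: 165 minutes
high = ci_minutes[1] + review_minutes[1]  # worst case: 310 minutes

print(f"merged PR to deployed code: {low / 60:.1f} to {high / 60:.1f} hours")
```

Even in the best case the cycle is about 2.75 hours of elapsed time; in the worst case it is over five, which is where the “often half a day” observation comes from.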
When manufacturing companies ask about AI, the conversation often goes in one of two directions. Either it’s about robotics — humanoid machines on the factory floor — or it’s about some future-state “smart factory” that requires a complete infrastructure overhaul before anything happens.
Both conversations miss where most manufacturers can actually create value in the near term.
The data problem that isn’t being framed as an opportunity
Most mid-size manufacturers are sitting on more operational data than they’re using. Equipment sensors log temperature, vibration, and pressure. Production systems record run times, output, and downtime. Quality inspection records accumulate for years. ERP systems contain procurement, inventory, and supplier data.
A pattern I see consistently in mid-size technology organisations: the leadership team is excited about AI tooling, the engineering team starts using AI coding assistants, velocity increases, and three to six months later a new layer of technical debt has formed, accumulating faster than the old one did.
AI didn’t create the problem. It accelerated the existing tendency.
What actually changes with AI in the picture
The fundamental nature of technical debt doesn’t change. It’s still the cost of shortcuts taken today that create extra work tomorrow. What changes is the rate of accumulation and the categories most likely to grow.
Ninety days is not a magic number. But it is a useful constraint. It’s long enough to see real change in how a team works, short enough to sustain focus, and specific enough that you can tell at the end whether it worked.
Here is the framework I use for AI-native engineering transformations. It isn’t a rigid script — every organisation starts from a different place — but the phases and sequencing hold across contexts.
Every company I’ve worked with in the past two years has wanted an AI strategy. About half were actually ready to execute one. Understanding the gap between wanting AI and being ready to use it effectively is the starting point for any serious AI initiative.
This is the framework I use to assess where a company actually stands.
A technology leader I worked with recently pulled up a spreadsheet of his company’s software subscriptions. AI tools alone: fourteen line items. When we went through each one and asked who was actively using it, the answer was: two tools with meaningful adoption, three with occasional use, nine that had been evaluated and quietly shelved.
Annual spend on tools the team wasn’t using: significant. But the subscription cost wasn’t the real problem.
Six months ago I watched a founder with no engineering background ship a working B2B SaaS prototype in three weeks using Cursor and Claude. No co-founder. No agency. No contractor. The prototype had a login system, a dashboard, and enough functionality to run a dozen customer discovery calls.
This is now possible. It wasn’t two years ago. Understanding what has changed — and where the limits still are — will save you a lot of wasted time.