There's a conversation I have regularly with prospective clients about partnership maturity. It usually starts with them asking: "How quickly can you get us people?"
While that's a perfectly reasonable question, what I'm really interested in is how, in an ideal world, they would like to engage with us. Because the journey from staff augmentation to true managed services is about sharing a vision, bringing tangible benefits and fundamentally transforming how testing delivers value to your business.
I've seen this journey play out almost textbook-perfectly with one of our largest clients, and it's taught us valuable lessons about patience, trust, and the real meaning of partnership.
Starting small, thinking big
Three years ago, we started exactly where most relationships begin: with small, specific project engagements. We were brought in for test automation work and performance testing – distinct pockets of activity with clear boundaries and defined deliverables.
In parallel, there was a Proof of Concept (POC) underway for Test Environment Management. Nothing was connected. We had separate streams of work, different budgets, different sign-offs. It was transactional, and everyone knew it.
When the client decided to issue a Request for Proposal (RFP) for all their software testing services, we won partly because we understood their landscape, but mainly because we'd proven ourselves in those initial engagements.
The trust wasn't fully there yet. We needed to prove our ability to scale across all testing roles and now had the opportunity to deliver. That first year was pure staff augmentation – we replaced about 100 people across several portfolios. One person out, one Inspired Testing person in. The client interviewed everyone. They made every decision. We were extra hands, not strategic partners.
But alongside that augmentation work, we were quietly maturing our relationship and bringing governance, thought leadership and innovation at every opportunity. Initiatives such as expanding their automation capability, building regression packs and introducing more performance testing demonstrated that we could think beyond the immediate tactical needs.
The pivot point
In year two, things began to change. We now had domain knowledge. We used test consultancy, provided through our contractual rebate model, to deepen our understanding of the client's business, their challenges and their technical landscape, and to make more informed recommendations.
We introduced a resource pool to manage flexibility more effectively and began to show clear cost benefits. Need someone moved between projects? We handled it. Skills mismatch on a programme? We sorted it. The client trusted us to put the right person in the right role.
We also embedded the Test Environment and Release service into projects. Adoption varied by business area, but all areas began to recognise the value this team could bring. At that stage, the relationship still operated on a project-by-project, ad hoc basis, much like our expanded Performance Testing offering.
Consultants in the Inspired Testing performance team were deployed to the client to meet specific, sometimes very short-lived performance testing needs. Five days here, a break, five days somewhere else. We managed that complexity so they didn't have to, providing exactly the right skills at the right time.
That year was tough – it was the final year of their regulatory period, with less capital expenditure available. But because we could flex, and because we could move people around efficiently, we became more than suppliers. We became trusted testing partners.
Building the managed service
By the end of year two, we had fragmented services working well – test environment management, automation, performance testing – but they were still separate. We were also discussing and deploying a Test Data Management solution, and at that point we started talking about DEAP: Data, Environments, Automation, and Performance. Our first truly managed service.
The shift was profound. Instead of the client managing headcount and costs, they now measured us on outcomes and KPIs. They didn't care whether we used senior consultants or developed junior talent – as long as we met their performance metrics. We could resource the team entirely as we saw fit. We absorbed all the tool licenses. We took on vendor management. We provided predictable monthly billing.
For us, it meant steady income and the ability to plan. We could invest in training our people specifically for this client, knowing we had contract stability. We could bring in junior staff and give them time to become experts. We could innovate and invest in a dedicated Test Consultant allocated to the account because we had the breathing room to align strategically rather than to budget cycles.
For the client, it meant financial predictability, reduced administrative burden, and most importantly, better outcomes. They went from micromanaging testing resources to focusing on what actually mattered, measuring each service through a unified KPI framework.
To expand on this briefly, the KPI framework is designed to assess operational performance across quality, efficiency, stability and risk. Quality indicators track the effectiveness of service processes such as adherence to test data governance, environment readiness, and compliance with test entry and exit criteria.
Efficiency focuses on automation execution rates, data provisioning turnaround and environment utilisation. Stability metrics capture environment uptime, system availability during testing and response times to service incidents, while risk is managed through measures such as incident volumes, demand versus capacity and dependency management, ensuring the service remains predictable and well-controlled.
These KPIs represent the fundamental shift from activity-based reporting to outcome-based governance and form the backbone of monthly service reviews, driving informed decisions on investment and improvement.
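As a rough illustration of how such a framework can drive a monthly service review, the sketch below rolls individual KPIs up into a per-category RAG (Red/Amber/Green) status. The metric names, values and thresholds are hypothetical, not the client's actual figures, and the roll-up rule is one simple choice among many.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    category: str       # quality | efficiency | stability | risk
    actual: float
    target: float
    higher_is_better: bool = True

    def met(self) -> bool:
        # A KPI is met when the actual lands on the right side of its target
        if self.higher_is_better:
            return self.actual >= self.target
        return self.actual <= self.target

def category_summary(kpis: list[KPI]) -> dict[str, str]:
    """Roll KPIs up to a per-category RAG status for the monthly review."""
    summary: dict[str, str] = {}
    for cat in {k.category for k in kpis}:
        results = [k.met() for k in kpis if k.category == cat]
        met_ratio = sum(results) / len(results)
        if met_ratio == 1.0:
            summary[cat] = "Green"
        elif met_ratio >= 0.5:
            summary[cat] = "Amber"
        else:
            summary[cat] = "Red"
    return summary

# Illustrative figures only
kpis = [
    KPI("environment readiness %", "quality", actual=97.0, target=95.0),
    KPI("automation execution rate %", "efficiency", actual=88.0, target=90.0),
    KPI("environment uptime %", "stability", actual=99.2, target=99.0),
    KPI("open service incidents", "risk", actual=4, target=5,
        higher_is_better=False),
]
print(category_summary(kpis))
```

The point of the sketch is the shift it encodes: the review conversation is about whether each category's outcome target was met, not about how many hours or heads were consumed reaching it.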
The full transformation
This year, we're moving to a fully managed service that encompasses everything – all testers across all portfolios, plus DEAP. Next year, the client won't distinguish between different types of testing resource. They'll have one service, one relationship, one set of KPIs, and complete confidence that quality is our responsibility.
What stands out to me about this journey is how it moved from our client making resource decisions to them setting business outcomes. We have developed our delivery and engagement model to suit their business and provide a service they trust. That trust translates directly into business value.
We're faster, more efficient, more effective because we tap into our expertise without constantly seeking approval. We save them money because we optimise resources across the entire testing landscape.
The real value of maturity
This isn't unique to one sector or one client. It's where most organisations want to get to if they're thinking about testing services over the long term. But you can't rush it. You have to walk the journey together, building trust at each stage, proving capability, getting the KPIs right, and ensuring you're measuring outcomes rather than just costs.
The transformation from "can you give us people?" to "we trust you to own quality" doesn't happen overnight. But when it does, everyone benefits. The client gets better business outcomes, and we get to do what we do best: deliver exceptional testing services without the friction of outdated engagement models.
That's the maturity journey, and it's worth every step.