QA Capability Assessment Case Study | UK University

Building a Future-Ready QA Function Across Higher Education

To enhance the quality and efficiency of its testing capability, a major UK university partnered with Inspired Testing to conduct a comprehensive QA Capability and Maturity Assessment. The engagement evaluated the institution’s testing framework, workforce, and tooling landscape, identifying opportunities to modernise its approach, increase automation, and align more closely with Agile and DevOps practices.

Industry

Higher Education

Location

United Kingdom

Solution

Strategic Test Consulting

Team

Principal Consultant

Client Background

The client is one of the UK’s leading universities, recognised for its commitment to innovation in digital learning and enterprise technology. With a complex portfolio of in-house and third-party systems, the institution’s testing function plays a key role in ensuring operational stability, student experience, and compliance across critical business applications.

Challenge

Over the past decade, the university’s testing function had evolved from traditional, in-house software validation to managing a diverse landscape of externally hosted, third-party integrations. This transition created several challenges:

  • Heavy reliance on manual functional testing, limiting scalability
  • Inconsistent test planning and late tester engagement in project cycles
  • Limited automation and API testing capabilities
  • Fragmented use of tools across Jira and Azure DevOps
  • Lack of structured training, peer review, and measurable KPIs
  • A growing need for Non-Functional Testing practices

The perception of testing as a “necessary step” rather than a strategic enabler further constrained collaboration and long-term maturity.

Solution

Inspired Testing conducted a QA Capability Assessment following a structured three-phase approach focused on People, Processes, and Technology. The recommendations were selected to be realistically achievable for the university and to deliver the best possible return on investment.

The following recommendations were developed:

  • Short-Term (0–6 months)
    • Standardise test planning and defect management frameworks
    • Deliver formal Azure DevOps (ADO) training
    • Introduce structured retrospectives and visibility planning
    • Begin foundational automation through the recruitment of an Automation Engineer
    • Initiate AI Co-Pilot trials to accelerate testing and support shift-left adoption
  • Medium-Term (6–12 months)
    • Establish a formal Training and Career Growth Plan
    • Expand automation to include regression and performance testing
    • Introduce Hyper-Care post-release feedback loops
    • Enhance collaboration between testers, Business Analysts, and Solution Architects
    • Implement Power BI dashboards for KPI tracking and transparency
  • Long-Term (18+ months)
    • Fully integrate automation into CI/CD pipelines
    • Embed risk-based governance and DevOps alignment
    • Implement continuous improvement cycles and test maturity governance
    • Strengthen monitoring for third-party integrations through API contract testing
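To illustrate the API contract testing recommended in the final point above, the sketch below shows a minimal contract check written in Python using requests and jsonschema with pytest-style assertions. The endpoint URL and response schema are illustrative assumptions rather than the university’s actual integrations; in practice, each check would validate against the contract agreed with the relevant third-party supplier.

# Minimal API contract test sketch (illustrative only).
# The endpoint and schema below are hypothetical placeholders,
# not the university's real third-party services.
import requests
from jsonschema import validate

# Hypothetical agreed contract for a student-record lookup response
STUDENT_RECORD_SCHEMA = {
    "type": "object",
    "required": ["studentId", "status", "enrolments"],
    "properties": {
        "studentId": {"type": "string"},
        "status": {"type": "string", "enum": ["active", "suspended", "withdrawn"]},
        "enrolments": {"type": "array", "items": {"type": "object"}},
    },
}

BASE_URL = "https://api.example-supplier.test"  # placeholder URL


def test_student_record_contract():
    """Fail the build if the third-party response drifts from the agreed contract."""
    response = requests.get(f"{BASE_URL}/students/12345", timeout=10)
    assert response.status_code == 200
    # Raises jsonschema.ValidationError if a required field or type is missing or wrong
    validate(instance=response.json(), schema=STUDENT_RECORD_SCHEMA)

Executed as a stage in a CI/CD pipeline (for example, within Azure DevOps), checks of this kind give early warning when a supplier changes a response format, which is typically far cheaper to catch than a downstream functional failure.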

Situation Before the Assessment

  • Manual regression cycles delayed project delivery
  • Siloed tooling and inconsistent test planning limited visibility and produced uneven testing outcomes
  • Limited performance and API testing reduced early defect detection and confidence in system reliability

Results to be Achieved After the Assessment

  • Standardised and transparent QA processes across projects
  • Full adoption of ADO for unified test management
  • Early tester engagement embedded in delivery lifecycles
  • Automated regression packs implemented in priority systems
  • Real-time dashboards established for defect trend analysis and test KPIs
  • Improved understanding of the testing process across the department
  • Increased confidence in the test team and its ability to support delivery

Business Impact

  • Improved Release Quality: Early involvement of testers reduced post-release defects by ensuring alignment with development sprints.
  • Increased Efficiency: Standardisation and automation cut manual testing time and improved throughput.
  • Enhanced Visibility: Dashboards and metrics provide real-time insights into testing effectiveness and monitor improvement over time.
  • Cultural Shift: Testing is evolving from a support role to a strategic quality partner within IT delivery.
  • Sustained Maturity: A long-term roadmap enables continuous improvement and governance for third-party integrations.

Why Inspired Testing

  • Extensive experience in QA capability and maturity assessments across large, complex organisations.
  • Applies a clear, phased roadmap approach to improving people, process, and technology within the QA function.
  • Combines strategic consultancy with hands-on delivery expertise to ensure practical, achievable outcomes.
  • Delivers independent, objective insights into existing challenges and opportunities for QA improvement.
  • Recognised for a collaborative approach, ensuring alignment with internal teams and organisational goals.
  • Provides structured, standards-based assessments aligned with ISTQB and ISO quality frameworks.