Quality plays a crucial role in defining a service-level agreement (SLA) for software testing. It's well known in the software industry that developing great products requires focus, which is why a laser-focused SLA for software testing leads to quality success and reduced risk.
A clear SLA emphasizes quality across the entire software development lifecycle and provides peace of mind.
Managing engagements, services, and expectations is hard. Let's walk through how you can build an SLA for software testing; it will make things easier for everyone.
An SLA is a written agreement between the quality team (the provider) and the development team (the customer) that ensures everyone understands the engagement, the scope of work, and what to expect from the provider. If specific conditions are not met, there are consequences, and often the quality of the product suffers.
In the real world of software development, you will discover that combining quality, ownership, and accountability is challenging. Development teams depend on quality services to meet their quality objectives. The best way to improve ownership and accountability is to provide an SLA that ensures 100 percent alignment on quality goals, engagement, services, and expected outcomes. The exact metrics for each SLA vary depending on the customer's quality goals and key performance indicators (KPIs). The agreement aims to establish a mutual understanding of the services, the areas prioritized, and the responsibilities, guarantees, and warranties provided by the service provider. Communicating the SLA is critical to executing and completing the agreements at hand; it builds trust in and awareness of your SLA for software testing among internal and external stakeholders.
Think about what matters to the QA team and its stakeholders: does the application meet everyone's expectations from a quality standpoint?
To answer that, the QA team needs to 1) understand everyone's expectations, and 2) determine the current state of the application's health.
That is the beauty of a well-written SLA. It helps the QA team decide whether to invest time in making the application faster, reducing the cost of a release, increasing quality, and enhancing resilience, if those elements are in fact what's expected. It also provides guidance on where to focus that time, such as reducing testing technical debt, automating the right things, speeding up testing in the CI pipeline, and improving standards and processes.
The purpose of the agreement is to identify metrics and target measurable quality results that indicate the health of the application. More importantly, the SLA is written to enhance communication and the relationship between the customer and the QA team.
Here is a guide showing how to build your own SLA.
These SLA guidelines are a good starting point for establishing trust among development project stakeholders (project managers, product owners, designers, developers, and quality assurance staff).
Engagement
The goal of the engagement section of the SLA is to describe the types of guidance, tools, and support provided through assessments, and to make them accessible to the team so that everyone can contribute to the project's quality success. Emphasis is placed on:
Bringing quality to the forefront;
Explaining the level of quality engagement expected for the project, i.e., white-glove, hybrid, or self-service (see Services, below);
Analyzing and implementing tooling to help facilitate quality improvements and success;
Defining the process for accessing and requesting guidance and feedback on existing testing practices;
Holding training sessions to ensure that quality knowledge and ownership take hold;
Executing research and development of new quality tool solutions, if applicable;
Identifying who is on the team. This depends, of course, on the structure of your QA department, and could include, for instance, a software development engineer in test (SDET), a quality automation engineer, and a quality specialist.
Expectations
Present a high-level overview of the engagement request.
Describe quality tool solutions for back-end, functional, and performance testing, as well as reporting, including, for example, their potential to be robust and scalable.
Explain how testing environments are monitored for product scalability, reliability, consistency, and performance.
Include the time needed for specific activities.
Emphasize that the customer gets a guaranteed service of quality improvements and that the QA team's responsibilities and tasks are precisely defined.
Help establish quality best practices, processes, and tooling related to:
Priorities
Timeline
Tools
Support
Exit plan with a gradual transition
Activities
Here are examples of activities you could include in an SLA:
Quality Platform (Software Development Engineer in Test; SDET)
| Activity | Timeline |
| --- | --- |
| New engagement | Schedule an engagement for the next sprint |
| Reported incident (failure or defect found in a supported quality tool) | Within 2 hours |
| General support | Within 4 hours |
| Questions | Depending on availability, within the next business day |
| Scaling up a quality platform tool solution, training, and transitioning ownership | 15 days to 4 weeks, or longer depending on the amount of work requested |
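To make timelines like these actionable, here is a minimal Python sketch that checks whether a response met its target from the table above. The activity names are hypothetical, and "next business day" is simplified to 24 hours.

```python
from datetime import datetime, timedelta

# SLA response-time targets from the table above.
# Keys are hypothetical; "next business day" is simplified to 24 hours.
SLA_TARGETS = {
    "reported_incident": timedelta(hours=2),
    "general_support": timedelta(hours=4),
    "question": timedelta(hours=24),
}

def met_sla(activity: str, reported_at: datetime, responded_at: datetime) -> bool:
    """Return True if the response arrived within the SLA target for the activity."""
    return (responded_at - reported_at) <= SLA_TARGETS[activity]

# Example: an incident reported at 9:00 and handled at 10:30 meets the 2-hour target.
print(met_sla("reported_incident",
              datetime(2023, 5, 1, 9, 0),
              datetime(2023, 5, 1, 10, 30)))  # True
```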
Continuous Integration Testing (Quality Automation Engineer)
| Activity | Timeline |
| --- | --- |
| New project requesting new testing | Included in the planning phase, which defines the scope of automated testing |
| Existing project requesting new tests, updates, or maintenance | Included in the planning phase, which defines the scope of automated testing |
| Test script troubleshooting | On every failure or general support request |
| Triage testing results | Every day |
| Assessment | Quarterly |
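As one hedged example of the daily triage activity, the Python sketch below assumes JUnit-style XML test reports (the file name is hypothetical) and groups failed test cases by failure message so that recurring problems surface first.

```python
import xml.etree.ElementTree as ET
from collections import Counter

def triage(report_path: str) -> Counter:
    """Count failed test cases per failure message in a JUnit-style XML report."""
    failures = Counter()
    tree = ET.parse(report_path)
    # JUnit-style reports nest <testcase> elements, each with an optional <failure> child.
    for case in tree.iter("testcase"):
        failure = case.find("failure")
        if failure is not None:
            failures[failure.get("message", "unknown failure")] += 1
    return failures

# Example usage: surface the most common failure causes from last night's run.
if __name__ == "__main__":
    for message, count in triage("test-results.xml").most_common(5):
        print(f"{count:>3}  {message}")
```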
Services
To focus on hardening quality solutions, increasing quality velocity, and improving quality coverage, your QA team could provide any of three levels of service to application teams.
| White Glove | Hybrid | Self-Service |
| --- | --- | --- |
| The initial engagement with the QA team starts as a white-glove service | The SDETs transition off the application team | The QA team transitions off the application team |
| The QA team owns all of the implementation: tools, standards, test scripts, and more | Quality ownership is shared among developers, quality automation engineers, and the quality specialist | Developers own the quality of the application, and the quality specialist organizes exploratory testing charters |
Key Performance Indicators (KPIs)
KPIs are metrics that can be monitored to measure enforcement of, and conformance to, the SLA. These metrics indicate how effective the SLA is at providing guidelines for achieving the quality engagement, services, expectations, and value; a sketch of how a couple of these metrics might be computed follows the table below.
| Service | Quality | Value |
| --- | --- | --- |
| Scale-up time | All defects identified | Productivity improvement over time (velocity) |
| Deliverables fulfilled | Test coverage | Cost per bug found decreasing |
| Knowledge transfer | Speed of testing | Return on investment (ROI) per quarter achieved by automation |
| Overall customer satisfaction | Percentage of tests that are automated | ROI achieved by incorporating exploratory testing |
| Communication | Incidents reported before and after the SLA | Shift-left: faster feedback from continuous testing |
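Here is a minimal Python sketch of how two of the KPIs above might be computed; the function names and example figures are hypothetical, not prescribed by the SLA.

```python
def automation_percentage(automated_tests: int, total_tests: int) -> float:
    """Percentage of tests that are automated (a quality KPI from the table)."""
    return 100.0 * automated_tests / total_tests if total_tests else 0.0

def cost_per_bug(testing_cost: float, bugs_found: int) -> float:
    """Cost per bug found; the SLA goal is for this number to decrease over time."""
    return testing_cost / bugs_found if bugs_found else float("inf")

# Hypothetical example: 320 of 400 tests automated; $12,000 of testing effort found 48 bugs.
print(f"{automation_percentage(320, 400):.1f}% automated")   # 80.0% automated
print(f"${cost_per_bug(12_000, 48):.2f} per bug found")      # $250.00 per bug found
```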
Your SLA definitions and targets should evolve over time as you learn more about the application's behavior and the customer's requirements.
Testing is a critical component of the success of any software development project. The SLA is a well-thought-out agreement between provider and customer that protects both parties in the event of a dispute and avoids misunderstandings about quality ownership. The SLA can save a considerable amount of time and money for both the provider and the customer; it will pay off in the long term and support good communication and a strong relationship between the two parties.
Greg Sypolt, Director of Quality Engineering at Gannett | USA Today Network, maintains a developer, quality, and DevOps mindset that allows him to bridge the gaps between all team members to achieve the desired outcomes. Greg helps shape the organization's approach to testing, tools, processes, and continuous integration, and supports development teams in delivering software that meets high quality standards. He's an advocate for automating the right things and ensuring that tests are reusable and maintainable. He actively contributes to the testing community by speaking at conferences, writing articles, blogging, and through direct involvement in various testing-related activities.