At Sauce Labs, we define Continuous Testing as a “best practice approach to software quality that involves testing earlier and more often throughout the development lifecycle.” Teams that adopt continuous testing realize a number of benefits, including fewer bugs found in production, faster release cycles, and overall improved customer experience. Another rarely discussed benefit is that as the number of quality checkpoints increases, so does the volume of test data and quality signals. This data, when presented in a clear and reliable way, can go a long way in helping teams optimize and continually improve their testing practice. However, it can also be an engineering team’s worst nightmare.
Simply presenting test data in aggregate can be confusing, and sometimes even counterproductive. Teams that want to get the most out of test data analytics need to lay some best-practice groundwork to make their data work for them. If they do, they can use data analytics tools to achieve the following:
Uncover quality gaps - Understand where in the pipeline or application you need to apply more test coverage, or where testing is creating a bottleneck and needs to be optimized.
Understand resource allocation - Are certain teams stuck with long test execution times? Data insights can pinpoint where in the pipeline those blockers occur and how you can redistribute resources so every team has what it needs.
Celebrate success - Data isn't just about risk mitigation; it's also about understanding where teams are succeeding, and why. This lets high-performing groups be celebrated, and gives leadership a blueprint for replicating that success across the organization.
With all of these benefits in mind, here are some best practices that teams can use to harness the power of test data and ensure they get the best insights into the efficacy of their quality efforts.
This may seem like a no-brainer, but explicitly setting every test's status to pass or fail is a crucial step for teams that want to truly understand their test data in aggregate. Consistent pass/fail statuses let you see where your application is working as a whole, and where there might be issues with your testing. Additionally, Sauce Labs users who set their test status to pass/fail will be able to take advantage of our newest Failure Analysis feature. To learn more, please reference our documentation.
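As a minimal sketch of what this can look like (assuming a Python Selenium session against Sauce Labs, with credentials in the standard SAUCE_USERNAME / SAUCE_ACCESS_KEY environment variables; the URL under test and the test name are illustrative), the result is reported through the JavaScript Executor at the end of the test:

```python
import os

from selenium import webdriver

# Credentials are read from the standard SAUCE_USERNAME / SAUCE_ACCESS_KEY
# environment variables; the endpoint below is the US West data center.
options = webdriver.ChromeOptions()
options.set_capability("sauce:options", {
    "username": os.environ["SAUCE_USERNAME"],
    "accessKey": os.environ["SAUCE_ACCESS_KEY"],
    "name": "Login - valid credentials reach the dashboard",
})

driver = webdriver.Remote(
    command_executor="https://ondemand.us-west-1.saucelabs.com/wd/hub",
    options=options,
)

try:
    driver.get("https://example.com/login")
    assert "Login" in driver.title
    # Report an explicit result so the job shows up as passed in analytics.
    driver.execute_script("sauce:job-result=passed")
except Exception:
    # Anything that goes wrong marks the job as failed instead.
    driver.execute_script("sauce:job-result=failed")
    raise
finally:
    driver.quit()
```

The same pattern is easy to fold into a pytest or unittest teardown hook so every test reports a status without repeating the try/except in each test body.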
Annotating your tests with an extra layer of context makes it much easier to filter them and view more specific data. It also helps you find groups of tests quickly, rather than having to sift through mountains of data, and lets you understand immediately what test you are looking at by adding information that is meaningful to your organization. You can add comments to your tests using the JavaScript Executor.
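A minimal sketch, reusing the `driver` session from the example above (the step text and page elements are illustrative): each `sauce:context` call adds a labeled entry to the job's command log, so the test's steps are readable at a glance.

```python
from selenium.webdriver.common.by import By

# Each sauce:context call adds a human-readable annotation to the
# job's command log, marking where a logical step begins.
driver.execute_script("sauce:context=Step 1: open the login page")
driver.get("https://example.com/login")

driver.execute_script("sauce:context=Step 2: submit valid credentials")
driver.find_element(By.ID, "username").send_keys("demo-user")
driver.find_element(By.ID, "password").send_keys("demo-pass")
driver.find_element(By.ID, "submit").click()
```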
What’s in a name? For your tests, a meaningful and consistent naming strategy creates context. It allows you to narrow down failure conditions, understand your test coverage, observe how specific tests perform over time, and more. It can also provide a reliable signal for when certain tests are creating quality risks. Some tips to consider when naming your tests (a short example follows the list):
Give tests unique names. Think of them as individual snowflakes: no two should be alike!
Be specific with your naming strategy. A test name should make clear what the test is trying to accomplish and what success looks like.
Avoid putting information in test names that doesn’t serve as a unique, human-recognizable identifier, such as build or pull request IDs and browser/OS versions.
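Putting these tips together, a name can be set through the `name` field of `sauce:options` when the session is created (same Python Selenium setup as above; the name itself is just an illustration):

```python
# Descriptive and unique: the feature, the scenario, and the expected
# outcome. No build IDs or browser/OS versions; Sauce Labs already
# records those for every job.
test_name = "Checkout - guest user completes purchase with saved card"

options = webdriver.ChromeOptions()
options.set_capability("sauce:options", {"name": test_name})
```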
To learn more, check out this lightning talk from SauceCon 2019 on best practices for naming your tests, presented by our very own Dylan Lacey, Software Engineer in Support.
Tests never exist in silos. They operate in a larger pipeline that includes a number of other tools. So you don’t just want to understand how a particular test performs in isolation over time; you also want to group tests together within the context of that larger pipeline. For Sauce users, this means associating your tests with builds. This groups tests meaningfully as they relate to your CI/CD pipeline, and analysis can show specific areas where testing might be creating issues or bottlenecks. One of our most recent Tech Tip videos shows you how to accomplish this, or you can reference the code sample below:
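As a minimal sketch of that association (same Python Selenium setup as above; `CI_PIPELINE_ID` is a placeholder for whatever identifier your CI system exposes), the build name is passed through the `build` field of `sauce:options` so every job from the same pipeline run is grouped together:

```python
import os

# Use one build identifier per pipeline run so all of its jobs are
# grouped under a single build in Sauce Labs.
build_id = os.environ.get("CI_PIPELINE_ID", "local-dev")

options = webdriver.ChromeOptions()
options.set_capability("sauce:options", {
    "name": "Checkout - guest user completes purchase with saved card",
    "build": f"checkout-service-{build_id}",
})
```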
Enabling your team to use the aggregate data from your tests gives them the opportunity to optimize your testing practice and continually build digital confidence across your entire organization. Following these four best practices of setting pass/fail statuses, annotating your tests, naming your tests properly, and associating your tests with builds sets your team up for further success as you develop new code and testing processes. This is especially true for Sauce customers who want to take advantage of our new and improved Insights platform, which includes new features such as our machine-learning powered Failure Analysis.
For those who want to learn how Sauce Labs helps you utilize data to test better, we offer some Insights functionality as part of our 2-week free trial. Sign up today, and see how you can build digital confidence with continuous improvement.