Vikram Gopinath & Uday Ravi Kempegowda, McAfee India Private Limited
Achieving quality in a complex environment can be very difficult. A major factor in the “complex” environment is that there is so much to focus on and so little time. With methodologies like agile development gaining significance by the minute, it is very hard for QA to keep up with the rapid pace of a project. Similarly, development teams find it hard to focus on non-functional aspects of the product such as performance and security.
This paper outlines the approach that we at McAfee adopted to overcome these challenges with the goal of achieving quality in a complex environment.
The paper primarily focuses on a few practices that can reduce the time taken for performance testing during an agile project life cycle while providing equal, if not better, test coverage compared with the traditional project life cycle.
The practices the paper will explore include:
- Setting up performance goals: While this is not unique to the agile model, it gains much more significance there. What are the key aspects that we need to define very early in the project life cycle in order to come up with an objective rather than subjective list of goals? What methods can be employed to arrive at this list? These are some of the questions that this paper will try to answer.
- Eliminating performance issues: Again, this is not new, but in the context of “time” it is of very high importance. The paper will outline some key aspects that need to be reviewed. Some general development best practices for avoiding performance issues will also be highlighted.
- Automated performance testing: The agile principle of “test early, test often” makes it evident that automation is one of the key enablers for shrinking performance test runs and allowing repeated performance tests. The paper will discuss what is required of a performance test automation framework in terms of robustness and flexibility.
- QA skill set development: Another key area where time is spent during the performance test phase is debugging the cause of performance issues. It is relatively easy for QA to test and collect performance metrics, but when an issue is encountered, its root cause is not evident right away. Development puts in a lot of effort reproducing the issue and then debugging it. QA needs to develop the skills to perform the initial analysis, and this can be greatly helped by tooling. The paper lists some tools and their advantages, along with examples of analyzing performance issues.
- Tracking performance: Tracking performance numbers at high frequency makes it much easier to narrow down the cause of an issue. Performance issues introduced into the product across builds can be caught and fixed as they occur.
- Involving third parties: Beta and JDP programs help ensure the product is performance-tested in production environments. Create appropriate scenarios and enable the participants with tooling. The paper will list methods and suggestions for creating scenarios and automation that help deliver the product for Beta testing.
The key contents of the paper will be along these lines:
- Setting up performance goals:
  - Define the components and workflows targeted, and work toward those targets.
  - Define the percentage improvement expected for each.
  - Hold regular performance review meetings.
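To make the distinction between objective and subjective goals concrete, a percentage-improvement target can be expressed as data that tests check against automatically. The components, baselines, and percentages below are illustrative assumptions, not actual product numbers:

```python
# Hypothetical sketch: performance goals as machine-checkable targets.
# Baselines come from the previous release; the agreed target is a
# percentage improvement over that baseline (all numbers invented).
BASELINES_MS = {"boot": 42000, "on_demand_scan": 180000, "ui_launch": 900}
TARGET_IMPROVEMENT = {"boot": 0.10, "on_demand_scan": 0.15, "ui_launch": 0.20}

def goal_ms(component):
    """Absolute goal derived from the baseline and the agreed % improvement."""
    return BASELINES_MS[component] * (1 - TARGET_IMPROVEMENT[component])

def meets_goal(component, measured_ms):
    """Objective pass/fail check a test run can report in a review meeting."""
    return measured_ms <= goal_ms(component)
```

Expressing goals this way turns the review meeting discussion from “it feels slow” into “boot is 2.2 s over its target”.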
- Eliminating performance issues:
  - Review architecture diagrams and related design documents (for new products).
  - Analyze existing behavior for bulky functions and feed this back into the requirements of newer versions (for mature products).
  - Examples of typical performance issues to avoid (network operations during boot, encryption/decryption during boot, starting multiple processes without ordering them, making repeated calls for the same properties, etc.).
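One item from the list above, repeated calls for the same property, can be sketched in a few lines. The function name and the cached value here are invented stand-ins for an expensive query whose result does not change during a session:

```python
# Hypothetical illustration: a property that is expensive to fetch
# (e.g. a registry or IPC query) but stable for the life of the process.
# Caching the first lookup avoids paying the cost on every call.
from functools import lru_cache

CALLS = {"count": 0}  # instrumentation to show the cache working

@lru_cache(maxsize=None)
def get_machine_guid():
    CALLS["count"] += 1          # the expensive query runs only once
    return "0000-STUB-GUID"      # placeholder value for the sketch

for _ in range(1000):
    get_machine_guid()           # 999 of these calls hit the cache
```

The same idea applies at boot: fetch once, reuse, and defer anything that can wait until after the critical path.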
- Automated performance testing:
  - Automation allows for testing often.
  - Start tests very early (have the framework ready earlier).
  - Examples of a good framework.
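A minimal sketch of the framework qualities discussed above: scenarios are pluggable callables (flexibility), each scenario is repeated so a single noisy run does not fail the build, and the median is compared against a goal (robustness). Names and thresholds are assumptions for illustration, not an actual framework:

```python
# Sketch of an automated performance test runner, assuming goals are
# expressed in seconds and scenarios are plain callables.
import time
from statistics import median

def run_scenario(scenario, goal_s, repeats=5):
    """Run a scenario several times; pass/fail on the median duration."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        scenario()
        samples.append(time.perf_counter() - start)
    result = median(samples)
    return {"median_s": result, "passed": result <= goal_s}

# Trivial stand-in scenario; a real one would drive a product workflow.
report = run_scenario(lambda: sum(range(10000)), goal_s=1.0)
```

Because scenarios are just callables, the same runner can be reused per build, which is what makes “test early, test often” practical.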
- QA skill set improvement:
  - QA needs to build the skill set to do the initial analysis (saves the time development would otherwise spend on it).
  - Use appropriate tooling.
  - Examples from xperf.
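As a flavor of the xperf (Windows Performance Toolkit) workflow, a basic kernel trace capture looks roughly like the following; this is a generic sketch of commonly documented usage, not the specific analysis steps from the paper, and it must be run from an elevated prompt:

```shell
# Start a kernel trace; DiagEasy is a predefined group of kernel
# providers (process, disk I/O, image load, etc.).
xperf -on DiagEasy

# ... reproduce the slow scenario under test ...

# Stop tracing and merge the buffers into an .etl file.
xperf -d trace.etl

# Open the trace in the viewer for initial analysis.
xperf trace.etl
```

With a trace like this, QA can often point development at the offending process or I/O pattern instead of just reporting “it is slow”.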
- Tracking performance:
  - Run performance tests on every build (or RTQA build).
  - Track the numbers graphically.
  - Narrows root-cause tracking down to within a few svn check-ins.
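The “narrow it to a few check-ins” idea can be sketched as a simple scan over per-build numbers: flag the first build whose metric regresses beyond a tolerance, which bounds the suspect svn check-ins to those between two builds. Build IDs, timings, and the tolerance are invented for the example:

```python
# Illustrative regression spotter over per-build performance numbers.
def first_regression(history, tolerance=0.05):
    """history: list of (build_id, seconds), ordered oldest first.

    Returns the (last_good, first_bad) build pair, or None if the
    metric never worsens by more than the tolerance between builds.
    """
    for (prev_build, prev_s), (build, secs) in zip(history, history[1:]):
        if secs > prev_s * (1 + tolerance):
            return (prev_build, build)
    return None

# Invented data: the jump from 30.4 s to 36.0 s lands between b102 and b103,
# so only the check-ins in that window need to be examined.
history = [("b101", 30.1), ("b102", 30.4), ("b103", 36.0), ("b104", 36.2)]
suspect = first_regression(history)
```

Plotting the same history graphically makes the jump visible at a glance, which is the point of tracking every build.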
- Involving JDP/Beta:
  - Get external tests ready to expose the application to real-world scenarios.
  - Agile allows for late changes, so incorporate feedback from Beta/JDP participants.