Defining data points for software test efficiency measurement

A well-designed software solution needs a well-designed test approach to ensure quality. Measuring software quality has never been an easy problem to solve. When new software is in the architecture phase, its associated test case set is designed assuming certain user patterns and use cases. Relying on the test set covering these assumed use cases is an inadequate indicator of appropriate software test coverage.

Code coverage tools alone are inadequate to build confidence that no production issues will be reported post-release. Moreover, code coverage measurement requires advanced tools, special builds, and dedicated setup.

There is no simple-to-use measurement technique for gauging current software testing efficiency. This leads to error-prone tracking of testing effort and, in turn, an inconsistent mapping of effort to outcome.

In this paper, we outline the measurement technique we developed to measure Quality Volatility, which helps us gauge product stability and its anticipated performance in upcoming releases. We also propose how individual test case efficiency can be calculated, which helps in the timely review of test cases for efficacy. This is based on the detection efficiency aging model we have developed.

Key takeaways:

- Test efficiency measurement
- Test case effectiveness measurement
- Measuring the quality stability of the product in an easy-to-use format

Vittalkumar Mirajkar, Sr. Software Manager, Skyhigh Security

Vittalkumar Mirajkar holds a degree in Electronics and Communication Engineering. He has been with the McAfee India team since 2006 and has experience across different domains, covering tech support, research, and testing.

He is an experienced exploratory tester with a wide range of testing experience spanning device driver, application, and server testing. He specializes in testing security products, covering Anti-Virus, Firewall, hooking and injection, and Data Loss Prevention product lines, to name a few. His areas of interest are performance testing, soak testing, data analysis, and exploratory testing. He has been actively working on bringing in newer data-driven test techniques to detect bugs early. He has vast experience in testing both consumer and enterprise security products. Vittal has expertise in testing inter-compatibility issues when multiple security products are deployed together.

Contact Vittal at vittalgm@gmail.com

Sneha Mirajkar, Software Engineer, Cisco

Sneha has 12+ years of experience in software testing, technology, and project leadership roles, with a record of successful delivery.
She is technically sophisticated, with a career spanning all stages of the QA life cycle, coupled with 10+ years of hands-on experience in automation using Python, Selenium, Perl, QTP, VBScript, web application/web services testing, and functional testing. She also has experience in cloud testing (SaaS and IaaS), AWS applications, and testing Android applications. Sneha has a proven track record of working on product portfolios with diverse technologies.


Sneha has strong hands-on experience across the full gamut of functional and non-functional testing, such as unit and performance testing. She has successfully used Agile and Lean methodologies to drive efficiency in testing projects.


https://drive.google.com/open?id=1gOdupe_keV8oIdr5DxagJK1XhP0VNnA-
https://www.linkedin.com/in/sneha-mirajkar-7985b127/