Most automated tests perform the same exercise each time they are run, and they are typically collected and used as regression tests. Testers often think of test automation as GUI-based scripted regression testing, and tool vendors sell tools for automating manual tests. This is a very limited view of the vast possibilities for automating tests. When we think of test automation, we should first think about extending our reach – doing things that we can’t do manually. This topic is about getting past the limitations of automated regression suites and generating much more valuable kinds of test automation.
The difficult part of automation is determining whether the software under test (SUT) responds correctly. Automated tests can easily feed huge numbers of inputs to the SUT, varying those inputs through data-driven approaches or random number generators. In the absence of an excellent mechanism for recognizing expected SUT behavior (an oracle), verification is time consuming and extremely difficult. With an oracle, automated tests can be designed to use potentially huge numbers of varied inputs and evaluate the SUT’s responses – without doing exactly the same test exercise each time.
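The oracle idea above can be sketched in code. The example below is a minimal, hypothetical illustration (the SUT, here a stand-in sort routine, and the function names are assumptions, not from the talk): random inputs are generated each run, and an oracle recognizes correct behavior by checking properties of the output rather than comparing against a single hard-coded expected value.

```python
import random
from collections import Counter

def sut_sort(items):
    """Stand-in for the system under test (hypothetical).
    In practice this would invoke the real SUT."""
    return sorted(items)

def oracle_check(inputs, output):
    """Oracle: recognizes a correct response without a fixed expected value.
    A correct result must (1) be in non-decreasing order and
    (2) contain exactly the same elements as the input (a permutation)."""
    in_order = all(a <= b for a, b in zip(output, output[1:]))
    same_elements = Counter(inputs) == Counter(output)
    return in_order and same_elements

# Each run exercises different inputs, yet verification is still automatic.
for _ in range(1000):
    data = [random.randint(-10**6, 10**6)
            for _ in range(random.randint(0, 50))]
    result = sut_sort(data)
    assert oracle_check(data, result), f"SUT failed on input {data}"
```

Because the oracle verifies properties of the response, the same harness can run millions of distinct test exercises, which a fixed-expected-output regression script cannot.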
Points the audience will take away:
Douglas Hoffman has over twenty-five years’ experience in software quality assurance. He holds degrees in Computer Science and Electrical Engineering, and an MBA. He has participated in dozens of software quality conferences and has been Program Chairman for several international conferences on software quality. He designs test automation environments and automated tests for systems and software companies.
He is an independent consultant with Software Quality Methods, LLC, where he consults with companies in strategic and tactical planning for software quality, and teaches courses in software quality assurance and testing. He is a Fellow of the ASQ (American Society for Quality), founding member of SSQA (Silicon Valley Software Quality Association) and AST (Association for Software Testing), and is a long time member of the ACM and IEEE. He is Past Chair of the Santa Clara Valley Software Quality Association (SSQA) and Past Chair of the Santa Clara Valley Section of the ASQ. He has also been an active participant in the Los Altos Workshops on Software Testing (LAWST) and dozens of its offshoots. He was among the first to earn a Certificate from ASQ in Software Quality Engineering, and has an ASQ Certification in Quality Management.
Copyright PNSQC 2020