Functional Tests as Effective Requirements Specifications
Jennitta Andrea The Andrea Group
The Test-Driven Development cycle moves functional test specification to the earliest part of the software development life cycle. Functional tests no longer merely assess quality; their purpose now is to drive quality. For some agile processes, such as eXtreme Programming, functional tests are the primary requirements specification artifact. When functional tests serve as both the system specification and the automated regression test safety net, they must be:
Viable for the lifetime of the production code.
Easier to write than production code. If writing functional tests is a bottleneck to writing production code, they will be considered optional and quickly become incomplete and obsolete.
More correct than production code. Bugs in functional tests will create and/or mask bugs in production code. Functional tests drive what is developed, and are used to detect bugs introduced over time as the production code changes.
More readable than production code. Non-technical subject matter experts (SMEs) are relied upon to validate the correctness and completeness of the functional tests.
More easily and safely maintained than production code. Functional tests don’t have the same type of regression safety net as production code.
More locatable than production code. All of the relevant functional tests must be found and updated before the production code can be updated.
This talk covers these concepts as an ‘old-style’ imperative test script is refactored, step by step, into an effective requirements specification. Emphasis is placed on developing a domain-specific testing language, and on other best practices for making your tests a valuable project asset. Advice is given on how to clean up an existing functional test suite.
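The contrast between an imperative test script and a domain-specific testing language can be illustrated with a minimal sketch. The account domain and all helper names below are invented for illustration; they are not taken from the talk itself.

```python
# Illustrative sketch only: the banking domain, Account class, and the
# given/when/then helper names are hypothetical, not from the talk.

class Account:
    """Minimal system under test."""
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

# 'Old-style' imperative test: every step is spelled out in procedural
# detail, obscuring the business intent from a non-technical reader.
def imperative_test():
    account = Account(0)
    account.balance = 100
    account.withdraw(30)
    assert account.balance == 70

# Domain-specific testing language: small, named building blocks that a
# subject matter expert can read and validate directly.
def given_account_with_balance(amount):
    return Account(amount)

def when_customer_withdraws(account, amount):
    account.withdraw(amount)

def then_balance_is(account, expected):
    assert account.balance == expected, (account.balance, expected)

def declarative_test():
    account = given_account_with_balance(100)
    when_customer_withdraws(account, 30)
    then_balance_is(account, 70)

imperative_test()
declarative_test()
```

Both tests exercise the same behavior; the second reads as a requirement statement, which is the property the list above demands of a specification artifact.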
Jennitta Andrea has expanded the vocabulary and imagery associated with agile methods to include Cinderella, step sisters, dental floss, sushi, fingerprints, and self-cleaning ovens. When she joined her first XP project in 2000, Jennitta wondered: “Will we stop using UML and use cases completely?”; “Will the analyst and tester roles become extinct?”; “Where do user stories come from?”; and “What does an effective functional test really look like?”
As a multi-faceted hands-on practitioner on over a dozen different agile projects since then, Jennitta has been a keen observer of teams and processes, and has discovered answers to these types of questions. She has written many experience-based papers for conferences and software journals, and delivers practical simulation-based tutorials and in-house training covering: agile requirements, process adaptation, automated functional testing, and project retrospectives.
Jennitta is especially interested in improving the state of the art of automated functional testing as it applies to agile requirements; she applies insights from her early experience with compiler technology and domain specific language design. Jennitta is serving her second term on the Agile Alliance Board of Directors, is a member of the Advisory Board of IEEE Software, and has assisted on several conference committees. She has a B. Sc., Computing Science (Distinction) from the University of Calgary.
When we think about testing on agile teams, we usually assume the testers are fully integrated team members, working side by side with programmers and customers, helping to identify and execute acceptance tests. What do you do when your company has an independent test team? How do you integrate that team into the agile process?
Lisa will share her own experiences as a tester on agile teams, and what she has learned from other teams. She will discuss how to address cultural, organizational, technical, and logistical issues when transitioning to agile. She’ll cover tips on how to handle areas such as defect tracking, lack of detailed requirements, and the quick pace of delivery. This is intended to be an interactive discussion, so come prepared with your questions.
Lisa Crispin is the co-author, with Tip House, of Testing Extreme Programming (Addison-Wesley, 2002). She is currently a tester on an agile team using Scrum and XP at ePlan Services Inc. in Denver, CO, and has worked on agile teams developing web applications since 2000. You can often find Lisa at agile- and testing-related conferences, user group meetings, and seminars in the U.S. and Europe, helping people discover good ways for agile teams to do testing, and for testers to add value to agile teams. She contributes agile testing articles to publications such as Better Software Magazine, Methods and Tools, and Novatica.
You are a quality professional, creative, intelligent, and insightful. Part of your job is to improve your organization. You identify a need, envision an improvement, and make your proposal.
Someone to your right says, “But we tried that before, and it didn’t work.” Someone to the left says, “But we’ve never done that before.” Right in front of you, a third person says, “But that’s no different from what we’re doing now.” From the background, you hear a rising chorus of, “But we don’t have time!” You’re getting resistance. Now what do you do?
In this presentation, we will explore an approach that works: crank up your curiosity and empathy!
Whatever else it may be, resistance is information: information about the values and beliefs of the people you are asking to change, about the organization, about the change you are proposing, and about yourself as a change agent.
This presentation is about how to turn resistance from a frustration into a resource. You will learn and create new ways to interpret people’s responses as valuable information, and new ways to translate that information into effective action to move forward with change.
Dale Emery helps software people lead more effectively to create greater value for their customers, for their colleagues, and for themselves.
Dale has worked in the software industry since 1980, as a developer, manager, process steward, trainer, and consultant. Since 1995, he has consulted to IT and software product development organizations about leadership, software development, process improvement, software testing, project management, and team and interpersonal effectiveness.
Dale’s personal mission is to help people create value, joy, and meaning in their work. He especially enjoys helping people discover and apply untapped talents to serve their deeply held values.
We are all familiar with optical illusions: we see something that turns out to be not as it first appears. Isn’t it strange that some part of our mind knows that another part of our mind is being deceived?
However, we are subject to self-deception in technical areas as well: these are cognitive illusions. This presentation explores some of the ways in which we deceive ourselves and why we do it. Examples are taken from the way Inspection is often practiced, testing issues, attitudes toward complexity, and the way in which “groupthink” can influence technical decisions.
There are a number of ways in which we “turn a blind eye” to vitally important issues such as quality and planning. Addressing these issues may help to explain why measurement programs often fail, why post-project reviews are seldom done, what causes anxiety for developers, managers, and testers, and how to counteract a blame culture.
Dorothy Graham is the founder of Grove Consultants, which provides advice, training and inspiration in software testing, testing tools and Inspection. Dot is co-author with Tom Gilb of “Software Inspection” (Addison-Wesley, 1993), co-author with Mark Fewster of “Software Test Automation” (Addison-Wesley, 1999), and co-author with Rex Black, Erik van Veenendaal, and Isabel Evans of “Foundations of Software Testing: ISTQB Certification” (Thompson, 2007).
Dot was Programme Chair for the first EuroSTAR Conference in 1993. She is on the editorial board of the Better Software magazine. She was a founder member of the UK ISEB Software Testing Board, which provides Foundation and Practitioner Certificates in Software Testing. She was a member of the working party that produced the ISTQB Foundation Syllabus, and is a member of the UK Software Testing Advisory Group. She is a popular and entertaining speaker at international conferences worldwide. Dorothy received the European Excellence Award in Software Testing in 1999.