This is a special guest article for the PNSQC. Paul Gerrard was recently tasked with defining a Leadership and Development Program for the Test Community in the South of Ireland.
Defining a Leadership and Development Program for the Test Community
Paul Gerrard Principal, Gerrard Consulting
Around twenty testing and QA managers had decided to examine their challenges and plan a way forward to develop and improve the testing-related training available to local testers.
What started as an attempt to create a three-day class for beginner testers became a much more substantial learning and development (L&D) program. This article describes the Tester Skills Program (TSP), why it is necessary and how it might evolve in the future.
Why a New Tester Skills Program?
The QA managers believed there was an ‘Existential Crisis for Testers’:
- Testing is obsolete: Approaches offered by training providers, books and the certification scheme(s) are outdated and no longer fit for purpose.
- Replaced by automation: A common (mistaken) perception is that testers and testing in general can be replaced by automated approaches.
- How do you add value to the team? Most testers find it extremely difficult to make a strong case.
- Titles changing – evolution of SDET role: More popular in the US than Europe, the SDET (Software Development Engineer in Test) role is a hard one to fill.
- We’re all engineers: Related to the SDET approach, testers who never wrote code (and might not ever want to) are being encouraged to learn a programming or scripting language and automated test execution tools.
- Once highly respected skillset/mindset no longer valued: Agile failed to define the tester's role well, and the redistribution of testing across teams diluted the perceived value of testers in Agile teams.
- Technology changing at an unprecedented rate: Test approaches have not kept pace with the changing technology. Skills should be independent of technology, enabling testers to test anything.
The team then articulated a list of specific challenges faced by their testers and teams; these became low-level goals for a Learning and Development scheme. The subsequent work on deriving a skills inventory addressed these ‘skills areas’ to varying degrees before they were compiled into an overall syllabus.
Current Certification Schemes Limit Our Skills
The most prominent tester certification scheme is created and administered by the International Software Testing Qualifications Board (ISTQB). A few minor schemes also operate, but ISTQB has become the de facto standard in the industry.
But there are well-known problems with current certification:
- If you look at the certification scheme syllabuses (Foundation and Advanced Test Analyst, for example), the tables of contents comprise mostly what we have called logistics. The certification schemes teach many of the things we say we do not care about.
- The schemes mostly offer one way of implementing testing – they are somewhat aligned with various standards. Incident Management, Reviews, Test Planning, Management and Control are all prescriptive and largely based on the Waterfall approach.
- Much of the syllabus and course content is about remembering definitions.
- The syllabuses imply that test design techniques are procedures, in which the tester never models or makes choices. The tester and exam-taker are spoon-fed the answers rather than being encouraged to think for themselves. There is no modelling content.
- The syllabuses don’t teach thinking skills. (The word ‘thinking’ appears once in the 142 pages of the Foundation and Advanced Test Analyst syllabuses.)
- Exams, being in multiple-choice format, focus on remembering the syllabus content rather than assessing the competence of the tester.
Certification does not improve a tester’s ability to be an independent thinker, a modeller or a (pro-)active project participant, and the exams do not assess ability as a tester.
This is a big problem.
Continued in Part 2.