acceptance testing Fig. 2.2, 16–17, 42, 46, 51–52, 150
Agile development methodologies 18, 44, 146
and fundamental test process 24–25
alpha testing 52
analytical approaches 151
audit trails 64
see also traceability
authors 67
beta testing 52
big-bang integration 47
black-box techniques see specification-based techniques
bottom-up integration Fig. 2.5, 48–49
builds 186
burn-down charts 25
Capability Maturity Model Integration (CMMI) assessment 212
capture playback tools 193–194
Certified Tester Foundation Level (CTFL)
examination technique 227
sample paper 227
CFGs see control flow graphs (CFGs)
changes
regression testing 19, 22–23, 25, 44, 54, 55
cloud storage 178
code
control flow graphs Figs. 4.8–4.9, 107–109
flow charts Figs. 4.4–4.7, Fig. 4.10, 103–105, 111
instrumentation 198
pseudo 100
reviewing 62
see also structure-based techniques; work-products
comparators Tab. 6.5, 177, 192–193
completion criteria
coverage measures as 83
see also exit criteria
component integration testing 49
component testing Fig. 2.2, 42, 46, 150
configuration management 148, 168
configuration management tools Fig. 6.3, Tab. 6.1, Tab. 6.5, 186–187, 188
confirmation testing see retesting
consultative approaches 151
contract acceptance testing 52
control flow graphs (CFGs) Figs. 4.8–4.9, 107–109
simplified Figs. 4.15–4.18, 100, 121–125
see also hybrid flow graphs
control flow models 53
control structures Fig. 2.4, 47
see also program structures
conversion testing 55
cost escalation model Fig. 1.3, Tab. 1.1, 17, 41
costs
of errors Fig. 1.3, Tab. 1.1, 17
test tools Fig. 6.1, Fig. 6.4, 177, 196, 203, 214–215
coverage see test coverage
coverage measurement tools Tab. 6.5, 198–199
CTFL see Certified Tester Foundation Level (CTFL)
cyclical development models see iterative development models
data-driven testing 194
data quality assessment tools 204
databases
debugging 14
decision testing Fig. 4.14, 117–121
decisions see iteration structures; selection structures
defect lists 127
defect management tools see incident management tools
defects
clustering 18
early identification 17, 62–63, 71, 189
finding by performance testing 202
finding by reviews 63
finding by static analysis 72, 189
raising constructively 27
see also errors
design see test design
developers
modelling tools Tab. 6.5, 189–190
pair programming 69
as testers Tab. 5.1, 14, 26, 43–44, 46, 145
use of test tools Tab. 6.5
documentation
test plans Fig. 5.2, Tab. 5.2, 152–155
see also reviews; specifications; work-products
dynamic analysis tools Tab. 6.5, 200–201
dynamic approaches 151
Dynamic Systems Development Method (DSDM) 44
dynamic testing 14
entry criteria
reviews 65
equivalence partitioning 87–89
‘error–defect–failure’ cycle 17
errors
absence of 19
clustering 18
costs of Fig. 1.3, Tab. 1.1, 17
see also defects
‘errors of migration’ 17
events 95
examination
sample paper 227
technique 227
executable statements 101, 105
testing Figs. 4.10–4.13, 110–117
execution postconditions 80
execution preconditions 80
execution, test see test execution
exit criteria
evaluating 23
test activities 157
see also completion criteria
experience-based techniques 84, 85, 126–127
expert-based estimation 157, 158
exploratory testing 127, 151, 194
factory acceptance testing 51
failure lists 127
fault attack 127
field testing 52
flow charts Figs. 4.4–4.7, Fig. 4.10, 103–105, 111
formal reviews Fig. 3.1, 64–67, 69–70
Foundation Certificate see Certified Tester Foundation Level (CTFL)
FTP see fundamental test process (FTP)
functional requirements 50
functional specifications Figs. 2.1–2.2, 42, 49–50
fundamental test process (FTP) Figs. 1.4–1.5, 20–25
heuristic approaches 151
high-complexity measures 71
hybrid flow graphs Figs. 4.10–4.13, 111, 113
impact analysis 55
implementation
of test tools Fig. 6.6, 212–219
see also test execution
incident management 22, 165–167
incident management tools Fig. 6.3, Tab. 6.5, 177, 183–185
incident reports Fig. 5.4, Tab. 5.4, 166–167
incremental development models see iterative development models
independent testing Fig. 5.1, Tab. 5.1, 145–146
instrumentation code 198
integration strategies
big-bang integration 47
bottom-up integration Fig. 2.5, 48–49
top-down integration Fig. 2.4, 47–48
integration testing Fig. 2.2, 42, 46–49, 150
interfaces
between tools 179
see also integration testing
International Software Testing Qualifications Board (ISTQB) 1
interoperability testing 53
‘invalid’ transitions 95
ISO/IEC/IEEE 29119 Software Testing 3–4
iteration structures 101, 102–103
visual representations Fig. 4.6, Fig. 4.16, 107
iterative development models Fig. 2.3, 43–44
keyword-driven testing 194
limited entry decision table 92
linear development models
waterfall model Fig. 2.1, 39–40
load generators 202
load testing tools Tab. 6.5, 202–203
management information 178, 183, 184
managers
review 67
master test plan Fig. 5.2, Tab. 5.2, 152, 153–155
maturity, organisational 212
menu structure models 53
methodical approaches 151
metrics
coverage 83
incident reports Fig. 5.4, Tab. 5.4, 166–167
static analysis 71
test progress Figs. 5.3–5.4, 160
test summary reports Tab. 5.3, 23, 163–164
metrics-based estimation 157, 158
migration 55
model-based approaches 151
modelling tools Tab. 6.5, 189–190
monitoring see test progress monitoring
non-executable statements 101, 105
non-functional failures 12
non-functional requirements 50
non-functional testing 53
‘null’ transitions 95
operational acceptance testing 51–52, 150
organisational maturity 212
pair programming 69
Pareto principle 18
payback models, test tools Fig. 6.1, Fig. 6.4, 177, 196
performance models 53
performance testing tools Tab. 6.5, 202–203
personnel
test managers 28, 143, 148–149
see also developers; testers
pilot projects 215
plain language specifications 53
planning see test planning
plans, test Fig. 5.2, Tab. 5.2, 152–155
preventative test approach 150
prioritisation 13
process-compliant approaches 151
process flows 53
process improvement 164, 166, 184
program specifications Figs. 2.1–2.2, 42, 46
program structures
control flow graphs Figs. 4.8–4.9, 107–109
decision testing Fig. 4.14, 117–121
flow charts Figs. 4.4–4.7, Fig. 4.10, 103–105, 111
see also structure-based techniques
progress data 160
progress monitoring Figs. 5.3–5.4, 159–160
project test plan see master test plan
prototyping 44
pseudo code 100
Rapid Application Development (RAD) 44
Rational Unified Process (RUP) 44
reactive test approach 150
regression-averse approaches 151
regression testing 19, 22–23, 25, 44, 54, 55
regulation acceptance testing 52
reports
incident Fig. 5.4, Tab. 5.4, 166–167
test summary Tab. 5.3, 23, 163–164
from test tools 183
requirement specifications Figs. 2.1–2.2, 16–17, 42, 51
requirements
changes to 185
functional 50
non-functional 50
requirements management tools Fig. 6.3, Tab. 6.5, 185–186
resources, triangle of Fig. 1.2, 12
response times 202
review process support tools Tab. 6.5, 187–188
reviews
formality level Fig. 3.2, 63–64
objectives 64
roles and responsibilities 64–65, 67, 69, 70
risk-based testing 144
root cause analysis 72
safety-critical systems 81, 126, 146, 152
see also test design
Scrum 44
SDLC see software development life cycle (SDLC)
security testing 19
security testing tools Tab. 6.5, 199–200
security threat models 53
selection structures
decision testing Fig. 4.14, 117–121
visual representations Fig. 4.5, Fig. 4.7, 107
sequence structures Fig. 4.4, 101–102
sequential development models see linear development models
simplified control flow graphs Figs. 4.15–4.18, 100, 121–125
site acceptance testing 51
software development life cycle (SDLC) 16–17, 37
costs of errors during Fig. 1.3, Tab. 1.1, 17
iterative models Fig. 2.3, 43–44
waterfall model Fig. 2.1, 39–40
see also test levels
software models 71
source code see code
specification-based techniques 84–86
equivalence partitioning 87–89
state transition testing Figs. 4.1–4.2, 94–98
use case testing Fig. 4.3, 98–99
specifications
functional Figs. 2.1–2.2, 42, 49–50
plain language 53
requirement Figs. 2.1–2.2, 16–17, 42, 51
test procedure 80
see also reviews
stand-up meetings 25
standard-compliant approaches 151
state tables (ST) Tab. 4.1, 95, 96–97
state transition models 53
state transition testing Figs. 4.1–4.2, 94–98
statement testing Figs. 4.10–4.13, 110–117
static analysis tools Tab. 6.5, 72–73, 188–189, 191
static testing
test tools for Tab. 6.5, 72–73, 187–189, 191
see also reviews
stochastic testing 151
strategies
see also integration strategies
stress testing tools Tab. 6.5, 202–203
structural testing 53
structure-based techniques 84, 100, 126, 127
decision testing Fig. 4.14, 117–121
simplified control flow graphs Figs. 4.15–4.18, 100, 121–125
statement testing Figs. 4.10–4.13, 110–117
system integration testing 49
system operators 150
system testing Fig. 2.2, 42, 46, 49–50, 150
technical reviews 69
technical specifications Figs. 2.1–2.2, 42
test analysts see testers
test automation 54
see also test design
test charters 127
test closure activities 24, 25
test comparators Tab. 6.5, 177, 192–193
see also test design
test control 21, 24–25, 164–165
test coverage
measurement tools Tab. 6.5, 198–199
test data preparation tools Tab. 6.5, 178, 191–192
test design
in Agile development 25
experience-based techniques 84, 85, 126–127
test development process 79–83
see also specification-based techniques; structure-based techniques
test design tools Tab. 6.5, 190–191
test-driven development 43, 46
test environments
setting up 149
and test tools 179
test execution schedules 83
test execution tools Fig. 6.3, Fig. 6.4, Tab. 6.5, 193–196
test executors see testers
test frames 190
test harnesses Fig. 6.5, Tab. 6.5, 178, 196–198
test implementation see test execution
test-level plans Fig. 5.2, Tab. 5.2, 152–155
test levels
acceptance testing Fig. 2.2, 16–17, 42, 46, 51–52, 150
integration testing Fig. 2.2, 42, 46–49, 150
system testing Fig. 2.2, 42, 46, 49–50, 150
unit testing Fig. 2.2, 42, 46, 150
test management information 178, 183, 184
test management tools Fig. 6.3, Tab. 6.5, 182–183
test managers 28, 143, 148–149
test oracles Tab. 6.5, 178, 191
test planning Fig. 2.2, 21, 42, 152–157
exit criteria 157
test plans Fig. 5.2, Tab. 5.2, 152–155
test procedure specifications 80
test process improvement 164, 166, 184
Test Process Improvement (TPI) assessment 212
test progress monitoring Figs. 5.3–5.4, 159–160
test reporting Tab. 5.3, 23, 163–164
see also test design
test summary reports Tab. 5.3, 23, 163–164
test tools 175
configuration management Fig. 6.3, Tab. 6.1, Tab. 6.5, 186–187, 188
costs Fig. 6.1, Fig. 6.4, 177, 196, 203, 214–215
coverage measurement Tab. 6.5, 198–199
data quality assessment 204
definition 177
dynamic analysis Tab. 6.5, 200–201
implementation process Fig. 6.6, 212–219
incident management Fig. 6.3, Tab. 6.5, 177, 183–185
interfaces between 179
maintenance 179
payback models Fig. 6.1, Fig. 6.4, 177, 196
performance testing Tab. 6.5, 202–203
requirements management Fig. 6.3, Tab. 6.5, 185–186
review process support Tab. 6.5, 187–188
security testing Tab. 6.5, 199–200
static analysis Tab. 6.5, 72–73, 188–189, 191
test comparators Tab. 6.5, 177, 192–193
test data preparation Tab. 6.5, 178, 191–192
test execution Fig. 6.3, Fig. 6.4, Tab. 6.5, 193–196
test harnesses Fig. 6.5, Tab. 6.5, 178, 196–198
test management Fig. 6.3, Tab. 6.5, 182–183
test oracles Tab. 6.5, 178, 191
testers
developers as Tab. 5.1, 14, 26, 43–44, 46, 145
experience-based techniques 84, 85, 126–127
independent Fig. 5.1, Tab. 5.1, 145–146
use of test tools Tab. 6.5
see also reviewers; test managers
testing
see also individual testing types
testware, traceability 148, 168, 187
time-boxes 43
timescales see test estimation
tools see test tools
top-down integration Fig. 2.4, 47–48
traceability
configuration management Tab. 6.1, 148, 168, 187
in iterative development 44
requirements changes 185
of test cases 81
transaction times 202
Unified Modeling Language (UML) 71, 189
unit test framework tools see test harnesses
unit test frameworks 46
unit testing Fig. 2.2, 42, 46, 150
usability models 53
usage patterns 202
use case testing Fig. 4.3, 98–99
user acceptance testing 51
see also test levels
validation 41
vendors, test tools 179, 213, 214–215
verification 41
walkthroughs 69
waterfall model Fig. 2.1, 39–40
white-box techniques see structure-based techniques
Wideband Delphi approach see expert-based estimation
work-products 16, 37, 40–41, 42
see also reviews
XML (Extensible Markup Language) 179