Acceptance criteria, 201, 220, 376
Acceptance test
development-test relationship in, 286
strategies, 200
“Accidental automation,” 17
Acquisition of test tool, 11, 12, 14
ActiveX controls, 35
Actor, 436–437
Adaptability, successful test group and, 157
Add-on products, compatibility issues with, 129–130
“Advanced automation,” 18
Advocates, automated test, 547
After-hours testing, 48–49
Alexander, Christopher, 406
Alpha testing, 254
Analysis, as verification method, 202, 203
Analysis and design, 11, 13, 14, 233. See also Test process analysis; Test program design
automated tools supporting, 43, 418, 437–442
evaluation and selection of, 77, 82–83
structure charts, flowcharts, and sequence diagrams, 123, 440–441
test procedure generators, 441–442
visual modeling tools, 437–440
automated vs. manual, 257, 262–266, 545
baseline, 308
defect detection and, 132
documentation of, 128, 275–277
walkthroughs and inspections of, 124
test activities during, 123
WBS for, 161
Application execution, synchronization with test procedure execution, 323
Application partitioning, 129
Applications
designing testability into, 128–130
mission-critical, 171–172
navigation in, 312–313
tool-incompatible features, 136
Application-under-test (AUT), 464
program logic of, 263
requirements of, 50
Architecture, test development, 288–306, 308
automation reuse analysis, 291–292
building blocks of, 289
compatibility work-around solutions, 302–303
documentation of, 196
environment readiness checks, 291
manual execution of test procedures, 304
modularity-relationship analysis, 293, 295–301
peer reviews, 304–305
technical environment, 288–291
test procedure configuration management, 305–306
test procedure development/execution schedule, 287, 292–295, 296–297
test tool calibration, 302
Assessment. See Test program review and assessment
ATLM, 7–22. See also Analysis and design; Business analysis, automated tools supporting; Programming, automated tools supporting; Requirements definition; Testing
decision to automate test, 10, 14
execution and management phase, 11, 14–15
introduction of automated testing, 11, 12–13
system development life cycle and, 14–15
test automation development and, 19–21
test effort and, 21–22
Test Maturity Model (TMM) and, 15–19
test planning, 11, 13, 14, 192, 193
test tool acquisition, 11, 12, 14
AUT. See Application-under-test (AUT)
Automated testing, 3–27
ATLM. See ATLM
background on, 5–7
careers in, 22–26
universal application of, 35–36
unrealistic expectations of, 30, 32–37
Automated test life-cycle methodology. See ATLM
Automation framework. See Infrastructure, automation
AutoScriptor Inferno, 464–465
Back-end processing, 137
Backup and recoverability testing, 252
Baselined system setup and configuration, 543
Behavioral approach (system-level) to test requirements analysis, 204, 226, 228–232
matrix for, 231–232
Beizer, Boris, 130
Benefits analysis (of automated testing), 37–54
case study of, 52–54
improved test effort quality, 43–49
production of reliable system, 38–43
reduction of test effort and schedule minimization, 49–51
Best practices, 539–549
automated test advocates and experts, 547
automated vs. manual test analysis, 545
baselined system setup and configuration, 543
beta testing, 548
customer involvement, 546–547
defect documentation and reporting, 547
documenting process, 539–541
managing expectations, 541
overall test program objectives, 543–544
reuse analysis, 545
schedule compatibility, 546
simple automation, 544
software installations in test environment baseline, 543
specialty tools expertise, 548–549
test procedure design and development standards, 544–545
test team assignments, 547–548
test team, communication with other teams, 545–546
test tool
compatibility checks, 541–542
improvement suggestions, 548
upgrades, 542–543
user group participation, 548
using pilot project, 541
Bitmap image recording, 310, 313–315
Black-box testing, 118, 234, 244–254, 372–377
applications of, 244
automated tools supporting, 254
boundary value analysis, 246
cause/effect graphing, 246
coverage metric, 372–374
equivalence partitioning, 245–246
overview of, 238
progress metric, 374–375
quality metric, 375–376
system testing, 247–254
alpha/beta, 254
backup and recoverability, 252
configuration, 252–253
conversion, 252
data integrity, 252
functional, 247–248
operational readiness, 253
performance, 250
random, 250–252
regression, 248
security, 248
stress, 248–250
user acceptance, 253
test automation, 376–377
test data definition for, 279–282
user participation in, 245
Boundary, test program, 201
Boundary-interior path testing, 243
Boundary value analysis, 246
Branch coverage, 243
Branching constructs, 329–331
Breadth of test data values, 279
Budget, tool evaluation and selection and, 74. See also Cost(s)
Bugs, show-stopper, 44
Builds
incremental model of, 200
new process, 360–366
test effort sizing estimates and, 171
Build verification testing (smoke test), 43–45, 342–343
Business analysis, automated tools supporting, 418, 421–433
business modeling tools, 76, 79, 421–424
configuration management tools, 424–426
defect tracking tools, 426–432
documentation generators, 433
evaluation and selection of, 76, 79–80
technical review management tools, 432–433
Business analyst, roles and responsibilities of, 186
Business expertise, test effort improvement and, 48
Business function, black-box test grouping and, 247
Business knowledge
successful test group and, 157
test effort sizing estimates and, 171
Business modeling tools, 76, 79, 421–424
Business process reengineering (BPR), 230
Business/product management stage of career, 487
Calibration of test tools, 302
Calls to procedure, 336
Capability Maturity Model (CMM), 111
Test Maturity Model and, 16–19
Capture/playback (GUI testing) tools, 6–7, 77
bitmap testing using, 314–315
test development activities using, 307
Careers in software testing, 22–26, 475
development program for test engineers, 476–487
business/product management stage, 487
team effort stage, 483–484
technical skills stage, 478–480
technical stewardship stage, 484–485
test process stage, 480–483
test/project management stage, 485–487
case statements, 330
Case (test procedure) generators, 77, 82–83, 282, 441–442
Cause/effect graphing, 246
Census (defect tracking tool), 430–432
Centralized test team, 150, 151–153, 155–156, 178
Certification method, 202, 203
Chief user liaison, roles and responsibilities of, 186
ClearCase, 425–426
Client-server environment, 5–7, 21
automated testing in, 4
Code
analysis of, 371–372
inspection of, 126–127
preambles or comments in front of, 129
spaghetti, 330
testing, 130
walkthrough, 126
Code checkers, 77, 84–85, 88–89, 444–445
CodeCheck (software), 85, 444–445
Code (test) coverage analyzers and code instrumentors, 77, 85, 450–456
EZCover, 451–452
PureCoverage, 455–456
STW/C, 452–455
Coding standards
cosmetic, 317–319
Comments
in code, 129
in test scripts, 319–320
Commercial-off-the-shelf (COTS) tool, 4
test objectives for, 117–118
Communication
skills in, 483
between test team and other teams, 545–546
Comparator, 87
Compatibility
with application features, 136
with custom controls, widgets, or third-party add-on products, 129–130
multiplatform, 46
with operating environment, 136
schedule, 546
work-around solutions for, 302–303
Compatibility checks, 134–135, 136, 139–140, 541–542
for GUI test tools, 35
Complexity analysis, 73
Compliance of test process, 114
Computer Select, 92
Concurrency tests, 249
Concurrent users, license agreement for, 101
Condition coverage, 243
Configuration, baselined, 543
Configuration management
in testbed management, 306
of test procedure, 305–306
Configuration management (CM) tools, 76, 79, 115, 305–306
for business analysis, 424–426
Configuration testing, 46, 252–253
with PerformanceStudio, 468–469
Conscious requirements, 410
Consideration. See Test tool consideration
Constants, in test procedures, 333
Constraints, examination of, 122
Context independence, 287, 331
Continuous improvement in test process, 114–115
Continuous loop, 328
Controls, Windows, 35
Conversion, data, 120
Conversion testing, 252
Correction monitoring, 51
Corrective actions, 381–392
for environment-related problems, 391
for schedule-related problems, 384–385
for test program/process-related problems, 386–388
for tool-related problems, 388–390
Cosmetic coding standards, 317–319
Cost(s)
of correcting defects, 7–8
as limiting factor, 36
of tool, estimate of, 60–61
of tool introduction, 55
training, 62–63
Counted loop, 328
Coverage, 372
data flow, 243
decision, 47
one hundred percent, 36–37
Coverage metric, 372–374
Covey, Stephen, 147
Critical/high-risk functions, 195, 199, 204
.csv file, 342
Current quality ratio metric, 373
Custom controls, compatibility issues with, 129–130
Customer involvement, 546–547
in test plan, 215
Cyclomatic complexity, 47
Database
test tool, 269
version upgrade for, 120
Database activity, test tool consideration and, 136
Database reset script, 328
Data conversion, 120
Data diagrams, 422
Data-driven tasks, automation of, 265
Data flow coverage, 243
Data integrity
during test execution, 280
Data requirements, 196
Data values, hard-coded, 310–311
Data verification, 71
Deadlines, missed, 3
Decision coverage, 47, 227, 242–243
Decision to automate, 10, 14, 29–66
benefits analysis, 37–54
case study of, 52–54
improved test effort quality, 43–49
production of reliable system, 38–43
reduction of test effort and schedule minimization, 49–51
false expectations and, 32–37
management support acquisition in, 54–64
test tool proposal, 56–64
WBS for, 158
Defect(s). See also Software Problem Report
classification of, 360
cost of correcting, 7–8
detection of
adherence to test process, 131
automated test tools for, 130, 132
designing testability into application, 128–130
inspections and walkthroughs, 126–127
risk assessment, 131
strategic manual and automated test design, 132
testing product deliverables, 127–128
test verification, 132
traditional testing phases, 130–131
user involvement in, 132–133
documentation and reporting of, 361, 547
life-cycle model of, 365–366, 430–431
number detected and undetected, 220
prevention of, 120–126
examination of constraints, 122
inspections and walkthroughs, 124–125
with quality gates, 125–126
test team involvement in, 122–123
in TMM, 19
use of standards in, 123–124
report on, 350
reproduction of, 48
Defect aging metric, 373, 374–375
Defect density metric, 373, 375
Defect fix retest metric, 373, 375
Defect tracking tools, 76, 79–86, 361–366
for business analysis, 426–432
Census, 430–432
PVCS Tracker, 426–428
TestTrack, 428–430
identification of, 197
Defect trend analysis metric, 373, 375, 376
Deliverables, testing, 127–128
Deming, W. Edwards, 379
Demonstration(s)
of test tool, 64
as verification method, 202, 203
Dependencies
event or activity, 200
test procedure, 293
Depth of test data database, 279
Depth of testing, 371
Description, task, 148
Design. See Analysis and design; Test program design
Design-based test architecture, 234–236
Design-based test requirements analysis, 227
Design complexity metrics, 372
Design documentation, 128, 275–277
walkthroughs and inspections of, 124
Design Editor, 422–423
Designer/2000 (Oracle), 421–424
Design-to-development transition, 308–309
Development. See System development life cycle; Test development
Developmental-level (structural approach) test requirements analysis, 204, 226–228, 229. See also White-box testing
Development ratio method, 165–166
Development team, relationship with, 42–43
successful test group and, 157
Dictionary, test, 324
Discipline for Software Engineering, A (Humphrey), 483–484
Discover (analyzer), 446–447
DiscoverY2K, 471–472
Discrete Mathematics and Its Applications (Rosen), 36
DO-178B, 227
Documentation, 539–541. See also Report(s)
walkthroughs and inspections of, 124
project, review of, 212–213
of requirements specifications, 128
of test development architecture, 196
of test procedure definition, 196
of test process analysis, 111, 112, 113, 118
of test program design, 255–256
of test requirements, 205–206
of test scripts, 320–321
of unit testing, 353
Documentation generation tools, 76, 80, 433
DOORS (Dynamic Object-Oriented Requirements System), 206–207, 305, 435–436
DXL language, 207
Dynamic analyzers. See Static and dynamic analyzers
Dynamic testing, 223–224
Early test involvement, 7–9
Earned value management system (EVMS), 366–370
Effort. See Test effort
Electronic test program library, 339
Employee referral programs, 179
End-users. See also Users
buy-in to test plan, 215
input on tool evaluation and selection, 71–72
Engineers. See Test engineer(s)
Environment. See also Test environment
corrective actions for problems related to, 391
systems engineering, 70–76
technical, 288–291
Environment readiness checks, 291
Environment testing tools, 77, 88
Equivalence partitioning, 245–246
ERB, 361
ErgoLight, 456–457
Erroneous input testing, 239–240
Error discovery rate metric, 373, 374
Error handling, 240
in test procedures, 324–325
Error-logging routine, 343
Error status/correction monitoring, 51
Evaluation and selection, 67–103
domain of, 63–64
definition of, 96–98
hands-on evaluation, 98–102
license agreement, 101–102
report, 99–100
process of, 67–70
systems engineering environment and, 70–76
budget constraints, 74
help desk problem reports, 74
long-term investment considerations, 75
shortcut avoidance, 75–76
software quality expected, 73–74
test tool process, 75
test types, 74–75
third-party input from management, staff, and end users, 71–72
tool criteria reflecting, 72–73
test planning and, 194
test tool research, 89–95
GUI testing (record/playback) tool, 90–95
of tools supporting testing life cycle, 76–89
analysis and design phase, 77, 82–83
business analysis phase, 76, 79–80
Year 2000 (Y2K) testing tools, 77, 88
WBS for, 158–159
Evaluation copy of test tool, 98
Event-driven environment, 6
Events, test program, 211–212
Event/use case. See Use cases
EXDIFF, 458–459
Execution and management phase, 11, 14–15. See also Test execution
Exit function, 341
Expectations
decision to automate and, 32–37
evaluation and selection of tools and, 72, 73–74
managing, 541
Experts, automated test, 547
Exposure, test requirements prioritization and, 209
EZCover, 451–452
False expectations, 32–37
False negatives, 356–357
False positives, 357–358
Fault density, 372
Fault insertion, 239–240
File(s)
checking for existence of, 330
global, 332–333
File comparison tools, 77, 86, 458–459
Flexibility/adaptability, successful test group and, 157
Flowcharts, 440–441
FOR loop, 328–329
Framework, automation. See Infrastructure, automation
Functional test coverage metric, 374
Functional testing, 247–248
Functional threads, 230
Gartner Group, 92
Global files, 332–333
Goals
decision to automate and, 263
of test process analysis, 116–120
Goethe, Johann Wolfgang von, 347
GOTO statements, 330
Graphical user interface. See GUI
Graphing, cause/effect, 246
Grouping tests, 294
GUI applications
inspection of screens, 202
iterative development of, 291
testing, 137
testing tools (record/playback), 77, 87, 462–465
AutoScriptor Inferno, 464–465
compatibility tests for, 35
Rational Suite Test Studio, 462–464
research on, 89–95
selection for pilot project, 97
GUI checks, 344
GUI map, 338–339
GUI standards, verification of, 342
Hands-on tool evaluation, 98–102
license agreement, 101–102
report, 99–100
Hard-coded data values, 310–311
Hardware configuration, supporting test environment, 290
Headers, test procedure, 321–322
Help desk problem reports, 74
Help function verification script, 343–344
High-risk functions, 195, 199, 204
Hindsight (code analyzer), 85, 451
Historical percentage factor, 168, 169
Hotkeys (accelerator keys), 312
Hungarian Notation, 326
Hybrid test architecture, 234
if..then..else statements, 330
Improvement
for environment-related problems, 391
process, 382
continuous, 114–115
successful test group and, 157
for schedule-related problems, 384–385
for test program/process-related problems, 386–388
tool proposal and opportunities for, 57–58, 59
for tool-related problems, 388–390
“Incidental automation,” 17
Incremental build model, 200
Independence
of test procedure, 268
Independent verification and validation (IV&V) test team, 150–151, 153–154, 156
Index
test procedure, 324
test success, 375
I-NET interface, 426–427
Information Model, 446
Infrastructure, automation, 287–288, 336–344
advanced math functions, 344
automated recording options, 340–341
error-logging routine, 343
exit function, 341
help function verification script, 343–344
login function, 341
navigation, 341
PC environment automated setup script, 339–340
smoke test, 342–343
table-driven test automation, 337–339
timed message boxes function, 344
verifying GUI standards, 342
Inspection(s)
defect detection through, 126–127
defect prevention through, 124–125
as verification method, 202, 203
Integration
development-test relationship in, 286
in execution and evaluation phases, 354–356
in TMM, 18
Integration and test phase, 15
Integration testing, 4, 131, 241
execution and evaluation of, 354–356
Integrity of automated test process, safeguarding, 115–116
Integrity testing, data, 120, 252
“Intentional automation,” 17
Interactive development environments (IDE), 83
Interviews, test engineer, 179–181
Introduction of automated testing, 11, 12–13, 107–145
cost of, 55
management support of, 75
process of, 107–110
test process analysis, 12, 110–133
documentation of, 111, 112, 113, 118
goals and objectives of, 116–120
organization’s quality and process improvement standards and, 110–111
process review, 112–116
purpose of, 111
strategies, 120–133
test planning and, 194
test tool consideration, 12–13, 133–143
application-under-test overview, 134, 136, 137
compatibility check, 134–135, 136, 139–140
demonstration to project team, 134, 140–141
project schedule review, 134, 138–139
support profile, 141–143
system requirements review, 133–134, 135–137
training requirements, 134, 135, 143
time required for, 61
WBS for, 159
Investment considerations, tool evaluation and selection and, 75
Iterative development approach, 291, 294
Job requisition, 176–177
Jones, Effie, 191
Junior test engineer, 25
Kuhn, Thomas, 105
Leadership skills, 484–485
Lessons learned. See Corrective actions
Lessons learned review activities, 81
Library
electronic test program, 339
reuse, 341
advanced math functions, 344
error-logging routine, 343
help function verification script, 343–344
smoke test, 342–343
timed message boxes function, 344
verifying GUI standards, 342
License agreement, 101–102
Life cycle. See also ATLM
system development, 14–15
early involvement by test team in, 157
LINT, 84
Load/performance/stress testing tools, 77, 87
Load/performance testing tools, 77, 465–469
Load testing
with PerformanceStudio, 467
of server, 137
Load-testing tools, 39
Logic Editor, 422
Logic flow, 295
Login function, 341
Long-term investment, tool evaluation and selection and, 75
Looping constructs, 20, 328–329
Maintainability
of test procedures, 268, 309, 317–333
branching constructs, 329–331
constants, 333
context independence, 331–332
cosmetic coding standards, 317–319
error handling, 324–325
global files, 332–333
headers, 321–322
index, 324
looping constructs, 328–329
modularity of scripts, 327–328
naming standards for scripts, 325–327
synchronization, 323
test script comments, 319–320
test script documentation, 320–321
of test scripts, 20–21
Maintenance agreements, 101
Management. See also Test manager; Test team management
decision to automate and, 54–64
test tool proposal, 56–64
introduction of tool and, 75
tool evaluation and selection and input from, 71–72
Man-hours estimation, 168, 169
Manual test engineer, roles and responsibilities of, 185
Manual testing
automated test tools as enhancements of, 33
defect detection and, 132
issues not amenable to, 47–48
Maslow, Abraham, 67
Math functions, advanced, 344
Maturity levels, test effort sizing and, 163–164
Memory leak testing, 73, 240–241
Mentoring, 42
Message boxes function, timed, 344
Metrics
acceptance criteria, 201, 220, 376
collection and analysis of, 370–371
evaluation of, 380
successful test group and, 157
Metrics tools, 77, 419, 450–457
code (test) coverage analyzers and code instrumentors, 77, 85, 450–456
EZCover, 451–452
PureCoverage, 455–456
STW/C, 452–455
evaluation and selection of, 77, 85–86
usability measurement tools, 456–457
Micrografx FlowCharter 7, 440
Mission-critical applications, 171–172
Mission statements, product or project, 128
Modeling tools
business, 421–424
visual, 437–440
Moderator, in code inspection, 127
Modified condition decision coverage (MC/DC), 227
Modularity
of test procedure, 268
Modularity-relationship analysis, 196, 293, 295–301
matrix for, 298–301
Module diagrams, 422
Module stress testing, 249
Module Test Environment, 451
Module testing (string testing), 241–242
Module (unit) development, development-test relationship in, 286
Monkey (random) testing, 250–252, 332
Mouse clicks, navigation using, 312, 313
MTE product, 449
Multiplatform compatibility, 46
Multiuser testing, with PerformanceStudio, 467–468
Named users, license agreement for, 101
Naming convention(s)
for program library, 339
for scripts, 325–327
for test procedures, 261, 269–271, 272
for variables, 326
Navigation, application, 312–313, 341
NETClarity, 460–462
Network Checker+, 461–462
Network test engineer, roles and responsibilities of, 185
Network testing tools, 77, 86–87, 460–462
North Seattle Community College, 26
Objectives
overall, 543–544
of test process analysis, 116–120
Object mode, 318
OCXs, 35
Office politics, 486–487
Operating environment, tool-incompatible, 136
Operational characteristics, test requirements prioritization and, 209
Operational readiness testing, 253
OPNET, 459–460
Optimization
of regression test, 41–42
in TMM, 19
Organization(s). See also Management
quality and process improvement standards of, 110–111
standards of, 149
test effort sizing estimates and, 170
of test team. See under Test team
training, 62
Parsing program, 337
Partitioning
application, 129
equivalence, 245–246
Password, verification of user, 36–37
Path coverage, 243
PC environment automated setup script, 339–340
Peer reviews, 304–305
code inspection, 127
Percentage method, 166–167
Performance monitoring tools, 137
Performance requirements, 71
Performance testing, 39, 120, 250
Personal software process (PSP), 483–484
Person-hours estimation, 168, 169
Phase definition in TMM, 17
Pilot project
selection guidelines, 97
Planning. See Test plan
Politics, office, 486–487
Preliminary test schedule, 197
Presentations, test tool, 64
Printout, verifying, 35
Priorities
in test procedure development/execution schedule, 294
of tests, 208–209
Problem reports metric, 373
Process analysis. See Test process analysis
Process brief program, 480–481
Process definition, test effort sizing estimates and, 171
Process evaluation phase, 14
Process improvement, 382
successful test group and, 157
Product assurance (PA) team, 166
Product-based test procedure generators, 88–89
Product management stage of career development, 487
Program manager, 25
Programmer, in code inspection, 127
Programmer analyst, 25
Programming phase, automated test tools supporting, 43, 77, 419, 442–449
evaluation and selection of, 77, 83–85
memory leak and runtime error detection tools, 442–444
source code testing tools, 444–445
static and dynamic analyzers, 445–449
syntax checkers/debuggers, 442
unit testing tools, 449
Progress metric, 374–375
Project documentation, review of, 212–213
Project management stage of career development, 485–487
Project mission statements, 128
Project plans, 214
Project team, demonstration of tools to, 134, 140–141
Proposal for test tool, 56–64
components of, 56–57
cost estimate, 60–61
estimated improvement opportunities and, 57–58, 59
evaluation domain, 63–64
rollout process, 64
selection criteria, 58–60
time required for introduction, 61
tool expertise, 61–62
training cost, 62–63
Prototype, 58
Prototyping tools, 77
Purchase list, test equipment, 215, 216
PureCoverage, 455–456
Purify (development test tool), 353
PVCS Tracker, 426–428
Qualification methods, 202–203
Quality assurance (QA) department, 381
Quality control in TMM, 19
Quality gateways, 125–126, 406
Quality guidelines, 220
Quality measurements/metrics, 41–42, 375–376
Quality of fixes metric, 373
Quantify (development test tool), 353
Random (monkey) testing, 250–252, 332
Rational Performance Studio, 137, 465–469
Rational Purify, 442–444
Rational Robot, 87, 340–341, 462–463
Rational Rose, 424, 436, 438–440
Rational Suite Test Studio, 269, 334, 462–464
Rational TeamTest, 137, 321, 323
Rational TeamTest SQA Manager, 86
Recording
automated, 340–341
Record/playback (GUI testing) tools, 77, 87, 462–465
AutoScriptor Inferno, 464–465
compatibility tests for, 35
Rational Suite Test Studio, 462–464
research on, 89–95
selection for pilot project, 97
Recruitment of test engineers, 172–182
activities for, 178
distinguishing best candidate, 181–182
interviews, 179–181
job requisition, 176–177
locating candidates, 178–179
qualities and skills desired, 172–174
target composition of team and, 174–176
Regression testing, 4, 120, 248
optimization of, 41–42
scripts during, 45–46
test effort improvement through, 45–46
test results analysis of, 358–359
Reliability of system, decision to automate and, 38–43
Repeatability of test process, 113–114
Repetitive tasks, automation of, 264–265
Report(s). See also Documentation
creation of, 51
on defects, 547
evaluation, 99–100
help desk problem, 74
software problem (SPR), 350, 360
categories of, 364
priority of, 376
Reporting test program status
black-box testing metrics, 372–377
case study, 367–370
earned value management system (EVMS), 366–370
test metric collection and analysis, 370–371
white-box testing metrics, 371–372
Representative sample, 246
Requirement management tools, 305
Requirements, 405–416
analysis of, 223–232
developmental-level (structural approach), 226–228, 229
process of, 224–225
system-level (behavioral approach), 204, 226, 228–232
of application-under-test (AUT), 50
based on risk, 265–266
coherency and consistency of, 408–409
completeness of, 409–410
conscious, 410
documentation of, 128
grouping, 414–415
management of, 194, 195–196, 205–211
prioritization, 208–209
requirements traceability matrix, 195, 202, 204–205, 209–211, 224
risk assessment, 208
tools for, 77, 80–81, 206–208, 434–436
measurability of, 406
nonquantifiable, 407–408
performance, 71
quality gateway for, 406
quantifiable, 406–407
relevance of, 411–412
solution vs., 412
stakeholder value placed on, 413
test effort sizing estimates and, 170
traceability of, 413–414
tracking knowledge about, 407–408
unconscious, 410
undreamed-of, 410
walkthroughs and inspections of, 124
Requirements definition, 195, 203–205
automated tools supporting, 43, 77, 418, 434–437
evaluation and selection of, 77, 80–82
requirements management tools, 77, 80–81, 206–208, 434–436
requirements verifiers, 77, 81–82, 436
use case generators, 436–437
improved, 38–39
Requirements phase, 14
test team involvement in, 122–123
RequisitePro, 434
Research on test tools, 89–95
GUI testing (record/playback) tool, 90–95
Reset script, database, 328
Resource availability, test requirements prioritization and, 209
Resource management, successful test group and, 157
Responsibilities and roles of test team, 182–186, 195
Résumés, soliciting, 179
Return on investment of automated tools, 58, 392–401
Reusability
analysis of, 291–292
of automated modules, 264
of test scripts, 113
Reusable test procedures, 287, 301, 309, 310–317
application navigation, 312–313
automation wildcards, 315–316
bitmap image recording, 310, 313–315
capture/playback, 310, 316–317
data, 310–312
Reuse library, 341
advanced math functions, 344
error-logging routine, 343
help function verification script, 343–344
smoke test, 342–343
timed message boxes function, 344
verifying GUI standards, 342
Review. See Test program review and assessment
ReviewPro, 432–433
Revision Labs, 472
Risk(s)
assessment of
defect detection through, 131
of test requirements, 208
mitigation strategy, 201–202
requirements based on, 265–266
in test procedure development/execution schedule, 294
in test program, 201–202
test requirements prioritization and, 209
Roles and responsibilities of test team, 182–186, 195
Rollout, tool, 64
Root-cause analysis, 115
Rosen, Kenneth H., 36
Runtime error detection tools, 77, 84, 442–444
Sample, representative, 246
.sbh extensions, 332
.sbl (Visual Basic) files, 332
Schedule (test)
corrective actions to problems related to, 384–385
preliminary test, 197
pressures of, 130
reduction in, 34
error status/correction monitoring and, 51
report creation and, 51
test effort reduction and, 49
test execution and, 50–51
test result analysis and, 51
test effort sizing estimates and, 172
test procedure development/execution, 196, 287, 292–295, 296–297
tool introduction and, 61
tool selection and, 75–76
Schedule compatibility, 546
Scope
critical/high-risk functions, 195, 199, 204, 230, 231
system description, 195, 198–199
test effort sizing estimates and, 171
test goals, objectives, and strategies, 199, 200
test program parameters, 200–202
test requirements definition, 195, 203–205
test tools, 199–200
verification/qualification methods, 202–203
of test data values, 279–280
Screen capture and playback. See Capture/playback (GUI testing) tools
Screen shot recordings. See Bitmap image recording
Script(s), automated, 141
baselining, 196–197
comments in, 319–320
database reset, 328
development of, 164
documentation of, 320–321
help function verification, 343–344
in integration testing, 241
login, 341
naming conventions for, 325–327
PC environment automated setup, 339–340
during regression testing, 45–46
reusable, 113
test environment setup, 340
test procedure, 305–306
test utilities, 137
WinRunner, 343–344
Scripting language, 268–269
Security testing, 248
Selection. See Evaluation and selection
Senior test engineer, 25
Sensitivity analysis, 246
Sequence, test procedure, 293
Sequence diagrams, 440–441
Server load test tool, 137
Shell procedures, 299–301
Shortcut avoidance, tool evaluation and selection and, 75–76
Show-stopper bug, 44
Sight verify function, 299
Silk Tools, 470–471
Simulation tools, 77, 86, 459–460
Sizing test program, 483
Smoke test (build verification testing), 43–45, 342–343
SoDA (Software Documentation Automation), 433
Software configuration testing, 46
Software developer, 20
Software development folder (SDF), 353
Software development skills, test team members with, 183
Software engineers, career opportunities for, 475
Software problem report (SPR), 350, 360
categories of, 364
priority of, 376
Software Quality Engineering, 472
Software releases, incremental, 3
Software System Testing and Quality Assurance (Beizer), 130
Software test engineer. See Test engineer(s)
Source code testing tools, 77, 84–85, 88–89, 444–445. See also Code
Spaghetti code, 330
Specialty tools expertise, 548–549
SPR. See Software problem report (SPR)
Staff input on tool evaluation and selection, 71–72
Standard Dictionary of Measures to Produce Reliable Software, 47
Standards
coding, 129
cosmetic coding, 317–319
defect prevention through, 123–124
design and development, for test procedure, 544–545
GUI, verification of, 342
organization-level, 149
project-specific, 148–149
test procedure, 287
for automated designs, 257, 266–271
for manual designs, 272–274
Start-up activities
centralized test team in, 152
SMT (System Methodology and Test) personnel in, 154
WBS for, 158
Statement coverage technique (C1), 227, 242
Static and dynamic analyzers, 77, 85, 445–447
Discover, 446–447
during unit testing, 353
Static test strategies, 234
Status tracking
black-box testing metrics, 372–377
coverage, 372–374
progress, 374–375
quality, 375–376
test automation, 376–377
case study, 367–370
earned value management system (EVMS), 366–370
test metric collection and analysis, 370–371
white-box testing metrics, 371–372
Stovepipe test team, 150, 151, 155, 178
StP/T tool, 83
Strategies
for introducing test tools, 120–133
static, 234
Stress testing, 39–41, 241–242, 248–250
PerformanceStudio, 467
test environment design and, 290
String testing (module testing), 241–242
Structural approach (developmental-level) to test requirements analysis, 204, 226–228, 229. See also White-box testing
Structure charts, 123, 440–441
Stub routine generation tools, 77
STW/C, 452–455
Support, WBS for project, 158, 162
Support profile for test tool, 141–143
Surveys to collect test program benefits, 398–401
Synchronization, 323
Syntax checkers/debuggers, 77, 84, 353, 442
System coverage analysis, 372–374
System description, 195, 198–199
System design and development phase, 14
testability in, 129
System development life cycle, 14–15
automated testing late in, 138
early involvement by test team in, 157
improved, 43
System-level (behavioral approach) test requirements analysis, 204, 226, 228–232
matrix for, 231–232
System requirements. See Requirements
Systems engineering environment, tool evaluation and selection and, 70–76
budget constraints, 74
help desk problem reports, 74
long-term investment considerations, 75
shortcut avoidance, 75–76
software quality expected, 73–74
test tool process, 75
test types, 74–75
third-party input from management, staff, and end users, 71–72
tool criteria reflecting, 72–73
Systems engineering support (SES) center, 153
System setup
baselined, 543
Systems methodology and test (SMT) test team, 150, 151, 154–155, 156, 178
System stress testing, 249
System testing
alpha/beta, 254
automated tools supporting, 420, 460–472
evaluation and selection of, 77, 86
GUI application testing tools, 462–465
load/performance testing tools, 465–469
network testing tools, 460–462
test management tools, 460
Web testing tools, 470–471
Y2K testing tools, 421, 471–472
backup and recoverability, 252
configuration, 252–253
conversion, 252
data integrity, 252
development-test relationship in, 286
execution and evaluation, 356–358
false negatives, 356–357
false positives, 357–358
functional, 247–248
operational readiness, 253
performance, 250
random, 250–252
regression, 248
security, 248
stress, 248–250
support tools for, 419–420, 457–460
file comparison tools, 77, 86, 458–459
simulation tools, 77, 86, 459–460
test data generators, 77, 86, 281–282, 457–458
user acceptance, 253
System test level, 226
System under test (SUT), 464
Table-driven test automation, 337–339
Tabs, navigation using, 312, 313
Task description, 148
Task planning method, 168–170, 194
Task separation, successful test group and, 157
Team, test. See Test team
Team effort stage of career development, 483–484
Team manager, 142
Technical environment, 288–291
Technical knowledge, successful test group and, 157
Technical review management tools, 76, 80, 432–433
Technical skills stage of career development, 478–480
Technical stewardship stage of career development, 484–485
Technical training courses, 478, 479
Technique-based test architecture, 236
Telephone system, test of, 37
Template, test procedure, 269, 275, 321
Termination of license agreement, 101
Test analysis. See Analysis and design
Test and integration work group (TIWG), 212
Test architecture, review of, 257, 258
Test automation development, 19–21
Test automation metric, 376–377
TestBytes, 457–458
Test completion/acceptance criteria, 201, 220, 376
Test coverage, 372
one hundred percent, 36–37
Test data, test environment design and, 290
Test data definition
for black-box testing, 279–282
for white-box testing, 277–278
Test data generators, 77, 86, 281–282, 457–458
Test data requirements, 257, 277–282
Test design. See Analysis and design
Test development, 11, 13, 14, 285–346
automation reuse analysis, 291–292
building blocks of, 289
compatibility work-around solutions, 302–303
documentation of, 196
environment readiness checks, 291
manual execution of test procedures, 304
modularity-relationship analysis, 293, 295–301
peer reviews, 304–305
technical environment, 288–291
test procedure configuration management, 305–306
test procedure development/execution schedule, 287, 292–295, 296–297
test tool calibration, 302
automation infrastructure, 287–288, 336–344
advanced math functions, 344
automated recording options, 340–341
error-logging routine, 343
exit function, 341
help function verification script, 343–344
login function, 341
navigation, 341
PC environment automated setup script, 339–340
smoke test, 342–343
table-driven test automation, 337–339
timed message boxes function, 344
verifying GUI standards, 342
guidelines, 306–336
API calls and .dll files, 309, 335–336
design-to-development transition, 308–309
maintainable test procedure, 309, 317–333
reusable test procedures, 287, 301, 309, 310–317
test procedures/verification points, 309, 334
user-defined verification methods, 309, 334–335
preparation activities for, 286–287
test process and, 285–287
WBS for, 161–162
Test dictionary, 324
Test drivers, non-GUI, 77
Test effectiveness, 373
Test effectiveness metric, 376
Test effort
ATLM and, 21–22
improving quality of, 43–49
after-hours testing, 48–49
build verification testing (smoke test), 43–45, 342–343
business expertise, 48
execution of mundane tests, 46–47
focus on advanced test issues, 47
multiplatform compatibility testing, 46
regression testing, 45–46
software configuration testing, 46
software defect reproduction, 48
sizing, 163–172
development ratio method, 165–166
factors in, 170–172
maturity levels and, 163–164
percentage method, 166–167
task planning method, 168–170, 194
test procedure method, 167–168
start of, 171
TMM and, 163–164
Test engineer(s), 142
career development program for, 476–487
business/product management stage, 487
team effort stage, 483–484
technical skills stage, 478–480
technical stewardship stage, 484–485
test process stage, 480–483
test/project management stage, 485–487
career path for, 476
mission of, 20
recruitment of, 172–182
activities for, 178
distinguishing best candidate, 181–182
interviews, 179–181
job requisition, 176–177
locating candidates, 178–179
qualities and skills desired, 172–174
target composition of team and, 174–176
requirements for, 23
roles and responsibilities of, 185
test effort sizing estimates and skill level of, 170
test program status tracking and, 366
Test environment, 13, 194, 214–217
integration and setup of, 216–217
preparations for, 214–216
software installations in, 543
Test environment setup script, 340
Test environment specialist, roles and responsibilities of, 186
Test equipment purchase list, 215, 216
Test execution, 11, 14–15, 50–51, 197, 349–378
data integrity during, 280
defect tracking and new build process, 360–366
multiple iterations of, 294
phases of execution and evaluation, 351–360
integration, 354–356
system, 356–358
test results analysis of regression tests, 358–359
unit, 351–354
user acceptance, 359–360
playback problems, 339
playback speed of tool, 302
status measurement, 374
black-box testing metrics, 372–377
case study, 367–370
earned value management system (EVMS), 366–370
test metric collection and analysis, 370–371
white-box testing metrics, 371–372
WBS for, 162
Test goals, decision to automate and, 263
Testing. See System testing
Testing Maturity Model (TMM), 122
Test lab, 96
Test lead
in code inspection, 127
roles and responsibilities of, 184
Test library and configuration specialist, roles and responsibilities of, 186
Test management and support, WBS for, 162
Test management tools, 77, 86, 460
Test manager, 485–487
lessons learned records and, 381
primary concern for, 486
roles and responsibilities of, 183
test program status tracking and, 366
TestManager, 462
Test Maturity Model (TMM), 15–19
Capability Maturity Model (CMM) and, 16–19
test effort and, 163–164
Test methodology, successful test group and, 157
Test metrics. See Metric(s); Metrics tools
Test optimization, 41–42
Test plan, 11, 13, 14, 191–222
activities in, 192–197, 212–213
automatic generation of, 32–33
baselining test scripts in, 196–197
data requirement, 196
development of, 49–50
documentation of tests, 195, 196, 213
end-user or customer buy-in to, 215
environment
integration and setup of, 216–217
preparations for, 214–216
evaluation of, 96
events, 211–212
IV&V group responsibility regarding, 153
modularity relationship analysis in, 196
outline of, 218–220
program scope, 194, 195, 197–205
critical/high-risk functions, 195, 199, 204, 230, 231
system description, 195, 198–199
test effort sizing estimates and, 171
test goals, objectives, and strategies, 199, 200
test program parameters, 200–202
test requirements definition, 195, 203–205
test tools, 199–200
verification/qualification methods, 202–203
project plan, 214
purpose of, 217–218
qualities and skills required, 194–195
requirements management, 194, 195–196, 205–211
prioritization of tests, 208–209
requirements traceability matrix, 195, 202, 204–205, 209–211, 224
risk assessment, 208
tools for, 77, 80–81, 206–208, 434–436
source and content of, 193
unit, 200
walkthrough of, 218
WBS for, 159–160
Test procedure(s), 167–168
configuration management, 305–306
constants in, 333
context-independent, 331
design and development standards for, 544–545
development/execution schedule, 287, 292–295, 296–297
inspection of, 126
maintainable, 268, 309, 317–333
branching constructs, 329–331
constants, 333
context independence, 331–332
cosmetic coding standards, 317–319
error handling, 324–325
global files, 332–333
headers, 321–322
index, 324
looping constructs, 328–329
modularity of scripts, 327–328
naming standards for scripts, 325–327
synchronization, 323
test script comments, 319–320
test script documentation, 320–321
manual execution of, 304
peer reviews of, 304–305
reusable, 287, 301, 309, 310–317
application navigation, 312–313
automation wildcards, 315–316
bitmap image recording, 310, 313–315
capture/playback, 310, 316–317
data, 310–312
sequence and dependencies in, 293
standards for, 287
automated designs, 257, 266–271
manual designs, 272–274
synchronization with application execution, 323
template of, 321
walkthrough of, 126
Test procedure (case) generators, 77, 82–83, 282, 441–442
Test procedure definition, 224
documentation of, 196
Test procedure design, 256–282
documentation of
automated test design standards, 257, 266–271
automated versus manual test analysis of, 257, 262–266
detailed test design, 257, 274–277
manual test design standards, 272–274
naming convention, 261, 269–271, 272
procedure definition, 256, 257–261
test data requirements, 257, 277–282
Test procedure development, 50
guidelines for, 197
schedule for, 196
standards for, 287
Test Procedure Environment, 451
Test procedure execution metric, 373
Test procedure script, 305–306
Test process
adherence to, 131
corrective actions for problems in, 384–385
WBS for improving, 162–163
Test process analysis, 12, 110–133
documentation of, 111, 112, 113, 118
goals and objectives of, 116–120
organization’s quality and process improvement standards and, 110–111
process review, 112–116
purpose of, 111
strategies, 120–133
defect detection, 121, 126–133
defect prevention, 120–126
test planning and, 194
Test process stage of career development, 480–483
Test program
corrective actions for problems in, 384–385
development life cycle of, 223–224
risks in, 201–202
sizing, 483
unstructured, 108
Test program boundary, 201
Test program design, 224, 233–256
documentation of, 255–256
effective, 236–237
models of, 233–254
design-based, 234–236
technique-based, 236
process of, 224–225
Test/programming lead, 25
Test program review and assessment, 11, 14, 379–402
analysis, 382–392
corrective actions and improvement activity, 381–392
for environment-related problems, 391
for schedule-related problems, 384–385
for test program/process-related problems, 386–388
for tool-related problems, 388–390
return on investment, 392–401
Test/project management stage of career development, 485–487
Test/QA/development (project) manager, 25
Test readiness reviews (TRR), 212
Test result analysis, 51
Test shell. See Wrapper function
Test strategies, 9
Test success index, 375
Test team
best practices in assignments, 547–548
communication with other teams, 545–546
involvement in defect prevention, 122–123
involvement in requirements phase, 122–123
involvement in test process analysis, 113
organizational structure of, 149–157
centralized, 150, 151–153, 155–156, 178
independent verification and validation (IV&V), 150–151, 153–154, 156
systems methodology and test (SMT), 150, 151, 154–155, 156, 178
test effort sizing estimates and, 171
roles and responsibilities of, 182–186, 195
Test team management, 147–188
engineer recruiting, 172–182
activities for, 178
distinguishing best candidate, 181–182
interviews, 179–181
job requisition, 176–177
locating test engineers, 178–179
qualities and skills desired, 172–174
target composition of team and, 174–176
ingredients for success, 157
test effort sizing, 163–172
development ratio method, 165–166
factors in, 170–172
maturity levels and, 163–164
percentage method, 166–167
task planning method, 168–170, 194
test procedure method, 167–168
test program tasks, 157–163
Test tool(s). See also Evaluation and selection; specific test phases
calibration of, 302
capabilities of, 265
compatibility checks, 35, 541–542
corrective actions for problems related to, 388–390
ease of use of, 34–35
as enhancements of manual testing, 33
evaluation domain, 63–64
expectations of. See Expectations
expertise in, 61–62
improvement suggestions, 548
load-testing, 39
one-size-fits-all, 33
presentations and demonstrations of, 64
proposal for, 56–64
components of, 56–57
cost estimate, 60–61
estimated improvement opportunities and, 57–58, 59
evaluation domain, 63–64
rollout process, 64
selection criteria, 58–60
time required for introduction, 61
tool expertise, 61–62
training cost, 62–63
research on, 89–95
GUI testing (record/playback) tool, 90–95
return on investment of, 58, 392–401
rollout process, 64
test effort and, 33–34
test effort sizing estimates and proficiency in, 171
test planning and, 199–200
training cost for, 62–63
upgrades, 542–543
Test tool consideration, 12–13, 133–143
application-under-test overview, 134, 136, 137
compatibility check, 134–135, 136, 139–140
demonstration to project team, 134, 140–141
project schedule review, 134, 138–139
support profile, 141–143
system requirements review, 133–134, 135–137
training requirements, 134, 135, 143
Test tool database, 269
TestTrack, 428–430
Timed message boxes function, 344
TMON (Landmark), 137
Toastmasters, 483
Tools. See Test tool(s)
Traceability links, 205
Training costs, 62–63
Training courses, technical, 478, 479
Training organizations, 62
Training requirements, 134, 135, 143
Unconscious requirements, 410
Undreamed-of requirements, 410
Unified Modeling Language (UML), 79
Unit (module) development, development-test relationship in, 286
Unit phase of execution, 351–354
Unit stress tests, 249
Unit testing
execution and evaluation, 351–354
tools for, 447–449
Unit test plan, 200
Universal Testing Architecture, 470
University programs, 24–26
UNIX shell script test utilities, 137
UNIX Test Tools and Benchmarks (Wilson), 472
Upgrade(s), 102
database version, 120
test tool, 542–543
Usability measurement tools, 456–457
Usability test engineer, roles and responsibilities of, 184
UseCase, 436–437
Use case analysis, 230
Use case generators, 436–437
grouping requirements using, 414
test requirements from, 204
User acceptance test (UAT), 211, 253, 359–360
User-defined verification methods, 309, 334–335
User password, verification of, 36–37
Users
involvement of, 548
in black-box testing, 245
in defect detection, 132–133
license agreement for, 101
test requirement prioritization and requirements of, 209
tool evaluation and selection and, 71–72
Validator/Req, 441–442
Variable names, 326
VBXs, 35
Verification
build, 43–45
data, 71
defect detection through, 132
of GUI standards, 342
of user password, 36–37
Verifiers, requirements, 77, 81–82, 436
Video resolution, 330–331
Visual modeling tools, 82, 437–440
Visual Test Tool (McCabe), 85
WaitState method, 323
Walkthrough(s)
defect detection through, 126–127
defect prevention through, 124–125
of test plan, 218
Web, cost information from, 60
Web environment, automated testing in, 4
Web testing tools, 470–471
WHILE loop, 328
White-box testing, 118, 234, 237–244, 371–372
aim of, 239
automated tools supporting, 243–244
condition coverage, 243
coverage analysis, 242
decision coverage, 47, 227, 242–243
error handling testing, 240
fault insertion, 239–240
integration testing, 241
memory leak testing, 240–241
overview of, 238
string testing, 241–242
test data definition for, 277–278
unit testing, 240
Widgets, compatibility issues with, 129–130
Wildcards, automation, 315–316
Wilson, Rodney C., 472
WinRunner scripts, reusable functions of, 343–344
Work breakdown structure (WBS), 157–163
for test activities, 212
World Wide Web, 60
Wrapper function, 45–46