Table of Contents for
Four. Incorporating Specialized Testing Responsibilities
by William E. Perry, William E. Perry Enterprises, Inc.
Effective Methods for Software Testing, Third Edition
Copyright
Dedication
About the Author
Credits
Introduction
Getting the Most Out of This Book
What’s New in the Third Edition
What’s on the CD
One. Assessing Testing Capabilities and Competencies
1. Assessing Capabilities, Staff Competency, and User Satisfaction
The Three-Step Process to Becoming a World-Class Testing Organization
Step 1: Define a World-Class Software Testing Model
Customizing the World-Class Model for Your Organization
Step 2: Develop Baselines for Your Organization
Assessment 1: Assessing the Test Environment
Implementation Procedures
Building the Assessment Team
Completing the Assessment Questionnaire
Building the Footprint Chart
Assessing the Results
Verifying the Assessment
Assessment 2: Assessing the Capabilities of Your Existing Test Processes
Assessment 3: Assessing the Competency of Your Testers
Implementation Procedures
Understanding the CSTE CBOK
Completing the Assessment Questionnaires
Building the Footprint Chart
Assessing the Results
Verifying the Assessment
Step 3: Develop an Improvement Plan
Summary
Two. Building a Software Testing Environment
2. Creating an Environment Supportive of Software Testing
Minimizing Risks
Risk Appetite for Software Quality
Risks Associated with Implementing Specifications
Faulty Software Design
Data Problems
Risks Associated with Not Meeting Customer Needs
Developing a Role for Software Testers
Writing a Policy for Software Testing
Criteria for a Testing Policy
Methods for Establishing a Testing Policy
Economics of Testing
Testing—An Organizational Issue
Management Support for Software Testing
Building a Structured Approach to Software Testing
Requirements
Design
Program
Test
Installation
Maintenance
Developing a Test Strategy
Use Work Paper 2-1
Use Work Paper 2-2
Summary
3. Building the Software Testing Process
Software Testing Guidelines
Guideline #1: Testing Should Reduce Software Development Risk
Guideline #2: Testing Should Be Performed Effectively
Guideline #3: Testing Should Uncover Defects
Defects Versus Failures
Why Are Defects Hard to Find?
Guideline #4: Testing Should Be Performed Using Business Logic
Guideline #5: Testing Should Occur Throughout the Development Life Cycle
Guideline #6: Testing Should Test Both Function and Structure
Why Use Both Testing Methods?
Structural and Functional Tests Using Verification and Validation Techniques
Workbench Concept
Testing That Parallels the Software Development Process
Customizing the Software-Testing Process
Determining the Test Strategy Objectives
Determining the Type of Development Project
Determining the Type of Software System
Determining the Project Scope
Identifying the Software Risks
Determining When Testing Should Occur
Defining the System Test Plan Standard
Defining the Unit Test Plan Standard
Converting Testing Strategy to Testing Tactics
Process Preparation Checklist
Summary
4. Selecting and Installing Software Testing Tools
Integrating Tools into the Tester’s Work Processes
Tools Available for Testing Software
Selecting and Using Test Tools
Matching the Tool to Its Use
Selecting a Tool Appropriate to Its Life Cycle Phase
Matching the Tool to the Tester’s Skill Level
Selecting an Affordable Tool
Training Testers in Tool Usage
Appointing Tool Managers
Prerequisites to Creating a Tool Manager Position
Selecting a Tool Manager
Assigning the Tool Manager Duties
Limiting the Tool Manager’s Tenure
Summary
5. Building Software Tester Competency
What Is a Common Body of Knowledge?
Who Is Responsible for the Software Tester’s Competency?
How Is Personal Competency Used in Job Performance?
Using the 2006 CSTE CBOK
Developing a Training Curriculum
Using the CBOK to Build an Effective Testing Team
Summary
Three. The Seven-Step Testing Process
6. Overview of the Software Testing Process
Advantages of Following a Process
The Cost of Computer Testing
Quantifying the Cost of Removing Defects
Reducing the Cost of Testing
The Seven-Step Software Testing Process
Objectives of the Seven-Step Process
Customizing the Seven-Step Process
Managing the Seven-Step Process
Using the Tester’s Workbench with the Seven-Step Process
Workbench Skills
Summary
7. Step 1: Organizing for Testing
Objective
Workbench
Input
Do Procedures
Task 1: Appoint the Test Manager
Task 2: Define the Scope of Testing
Task 3: Appoint the Test Team
Internal Team Approach
External Team Approach
Non-IT Team Approach
Combination Team Approach
Task 4: Verify the Development Documentation
Development Phases
Measuring Project Documentation Needs
Determining What Documents Must Be Produced
Determining the Completeness of Individual Documents
Determining Documentation Timeliness
Task 5: Validate the Test Estimate and Project Status Reporting Process
Validating the Test Estimate
Strategies for Software Cost Estimating
Parametric Models
Testing the Validity of the Software Cost Estimate
Validate the Reasonableness of the Estimating Model
Validate That the Model Includes All the Needed Factors
Verify the Correctness of the Cost-Estimating Model Estimate
Calculating the Project Status Using a Point System
Overview of the Point Accumulation Tracking System
Typical Methods of Measuring Performance
Using the Point System
Extensions
Rolling Baseline
Reports
Check Procedures
Output
Summary
8. Step 2: Developing the Test Plan
Overview
Objective
Concerns
Workbench
Input
Do Procedures
Task 1: Profile the Software Project
Conducting a Walkthrough of the Customer/User Area
Developing a Profile of the Software Project
Task 2: Understand the Project Risks
Task 3: Select a Testing Technique
Structural System Testing Techniques
Stress Testing
Objectives
How to Use Stress Testing
When to Use Stress Testing
Execution Testing
Objectives
How to Use Execution Testing
When to Use Execution Testing
Recovery Testing
Objectives
How to Use Recovery Testing
When to Use Recovery Testing
Operations Testing
Objectives
How to Use Operations Testing
When to Use Operations Testing
Compliance Testing
Objectives
How to Use Compliance Testing
When to Use Compliance Testing
Security Testing
Objectives
How to Use Security Testing
When to Use Security Testing
Functional System Testing Techniques
Requirements Testing
Objectives
How to Use Requirements Testing
When to Use Requirements Testing
Regression Testing
Objectives
How to Use Regression Testing
When to Use Regression Testing
Error-Handling Testing
Objectives
How to Use Error-Handling Testing
When to Use Error-Handling Testing
Manual-Support Testing
Objectives
How to Use Manual-Support Testing
When to Use Manual-Support Testing
Intersystem Testing
Objectives
How to Use Intersystem Testing
When to Use Intersystem Testing
Control Testing
Objectives
How to Use Control Testing
When to Use Control Testing
Parallel Testing
Objectives
How to Use Parallel Testing
When to Use Parallel Testing
Task 4: Plan Unit Testing and Analysis
Functional Testing and Analysis
Functional Analysis
Functional Testing
Testing Independent of the Specification Technique
Testing Dependent on the Specification Technique
Structural Testing and Analysis
Structural Analysis
Structural Testing
Error-Oriented Testing and Analysis
Statistical Methods
Error-Based Testing
Fault-Based Testing
Managerial Aspects of Unit Testing and Analysis
Selecting Techniques
Control
Task 5: Build the Test Plan
Setting Test Objectives
Developing a Test Matrix
Individual Software Modules
Structural Attributes
Batch Tests
Conceptual Test Script for Online System Test
Verification Tests
Software/Test Matrix
Defining Test Administration
Test Plan General Information
Define Test Milestones
Define Checkpoint Administration
Writing the Test Plan
Task 6: Inspect the Test Plan
Inspection Concerns
Products/Deliverables to Inspect
Formal Inspection Roles
Moderator
Reader
Recorder
Author
Inspectors
Formal Inspection Defect Classification
Inspection Procedures
Planning and Organizing
Overview Session
Individual Preparation
Inspection Meeting
Rework and Follow-Up
Check Procedures
Output
Guidelines
Summary
9. Step 3: Verification Testing
Overview
Objective
Concerns
Workbench
Input
The Requirements Phase
The Design Phase
The Programming Phase
Do Procedures
Task 1: Test During the Requirements Phase
Requirements Phase Test Factors
Preparing a Risk Matrix
Establishing the Risk Team
Identifying Risks
Establishing Control Objectives (Requirements Phase Only)
Identifying Controls in Each System Segment
Determining the Adequacy of Controls
Performing a Test Factor Analysis
Conducting a Requirements Walkthrough
Establishing Ground Rules
Selecting the Team
Presenting Project Requirements
Responding to Questions/Recommendations
Issuing the Final Report (Optional)
Performing Requirements Tracing
Ensuring Requirements Are Testable
Task 2: Test During the Design Phase
Scoring Success Factors
Analyzing Test Factors
Conducting a Design Review
Inspecting Design Deliverables
Task 3: Test During the Programming Phase
Desk Debugging the Program
Syntactical Desk Debugging
Structural Desk Debugging
Functional Desk Debugging
Performing Programming Phase Test Factor Analysis
Conducting a Peer Review
Establishing Peer Review Ground Rules
Selecting the Peer Review Team
Training Team Members
Selecting a Review Method
Conducting the Peer Review
Drawing Conclusions
Preparing Reports
Check Procedures
Output
Guidelines
Summary
10. Step 4: Validation Testing
Overview
Objective
Concerns
Workbench
Input
Do Procedures
Task 1: Build the Test Data
Sources of Test Data/Test Scripts
Testing File Design
Defining Design Goals
Entering Test Data
Applying Test Files Against Programs That Update Master Records
Creating and Using Test Data
Payroll Application Example
Creating Test Data for Stress/Load Testing
Creating Test Scripts
Determining Testing Levels
Developing Test Scripts
Executing Test Scripts
Analyzing the Results
Maintaining Test Scripts
Task 2: Execute Tests
Task 3: Record Test Results
Documenting the Deviation
Documenting the Effect
Documenting the Cause
Check Procedures
Output
Guidelines
Summary
11. Step 5: Analyzing and Reporting Test Results
Overview
Concerns
Workbench
Input
Test Plan and Project Plan
Expected Processing Results
Data Collected during Testing
Test Results Data
Test Transactions, Test Suites, and Test Events
Defects
Efficiency
Storing Data Collected During Testing
Do Procedures
Task 1: Report Software Status
Establishing a Measurement Team
Creating an Inventory of Existing Project Measurements
Developing a Consistent Set of Project Metrics
Defining Process Requirements
Developing and Implementing the Process
Monitoring the Process
Summary Status Report
Project Status Report
Task 2: Report Interim Test Results
Function/Test Matrix
Functional Testing Status Report
Functions Working Timeline Report
Expected Versus Actual Defects Uncovered Timeline Report
Defects Uncovered Versus Corrected Gap Timeline Report
Average Age of Uncorrected Defects by Type Report
Defect Distribution Report
Normalized Defect Distribution Report
Testing Action Report
Interim Test Report
Task 3: Report Final Test Results
Individual Project Test Report
Integration Test Report
System Test Report
Acceptance Test Report
Check Procedures
Output
Guidelines
Summary
12. Step 6: Acceptance and Operational Testing
Overview
Objective
Concerns
Workbench
Input Procedures
Task 1: Acceptance Testing
Defining the Acceptance Criteria
Developing an Acceptance Plan
Executing the Acceptance Plan
Developing Test Cases (Use Cases) Based on How Software Will Be Used
Building a System Boundary Diagram
Defining Use Cases
Developing Test Cases
Reaching an Acceptance Decision
Task 2: Pre-Operational Testing
Testing New Software Installation
Testing the Changed Software Version
Testing the Adequacy of the Restart/Recovery Plan
Verifying the Correct Change Has Been Entered into Production
Verifying Unneeded Versions Have Been Deleted
Monitoring Production
Documenting Problems
Task 3: Post-Operational Testing
Developing and Updating the Test Plan
Developing and Updating the Test Data
Testing the Control Change Process
Identifying and Controlling Change
Documenting Change Needed on Each Data Element
Documenting Changes Needed in Each Program
Conducting Testing
Developing and Updating Training Material
Training Material Inventory Form
Training Plan Work Paper
Preparing Training Material
Conducting Training
Check Procedures
Output
Is the Automated Application Acceptable?
Automated Application Segment Failure Notification
Is the Manual Segment Acceptable?
Training Failure Notification Form
Guidelines
Summary
13. Step 7: Post-Implementation Analysis
Overview
Concerns
Workbench
Input
Do Procedures
Task 1: Establish Assessment Objectives
Task 2: Identify What to Measure
Task 3: Assign Measurement Responsibility
Task 4: Select Evaluation Approach
Task 5: Identify Needed Facts
Task 6: Collect Evaluation Data
Task 7: Assess the Effectiveness of Testing
Using Testing Metrics
Check Procedures
Output
Guidelines
Summary
Four. Incorporating Specialized Testing Responsibilities
14. Software Development Methodologies
How Much Testing Is Enough?
Software Development Methodologies
Overview
Methodology Types
Waterfall Methodology
Prototyping Methodology
Rapid Application Development Methodology
Spiral Methodology
Incremental Methodology
The V Methodology
Software Development Life Cycle
Phase 1: Initiation
Phase 2: Definition
Phase 3: System Design
Phase 4: Programming and Testing
Phase 5: Evaluation and Acceptance
Phase 6: Installation and Operation
Roles and Responsibilities
Defining Requirements
Categories
Attributes
Desired Attributes: A Systems Analyst Perspective
Requirements Measures: A Tester’s Perspective
International Standards
Methodology Maturity
Competencies Required
Staff Experience
Configuration-Management Controls
Basic CM Requirements
Configuration Identification
Configuration Control
Configuration-Status Accounting
Configuration Audits
Planning
Data Distribution and Access
CM Administration
Project Leader’s CM Plan
Work Breakdown Structure
Technical Reviews
Configuration Identification
CI Selection
Document Library
Software Development Library
Configuration Baselines
Initial Release
Software Marking and Labeling
Interface Requirements
Configuration Control
Measuring the Impact of the Software Development Process
Summary
15. Testing Client/Server Systems
Overview
Concerns
Workbench
Input
Do Procedures
Task 1: Assess Readiness
Software Development Process Maturity Levels
The Ad Hoc Process (Level 1)
The Repeatable Process (Level 2)
The Consistent Process (Level 3)
The Measured Process (Level 4)
The Optimized Process (Level 5)
Conducting the Client/Server Readiness Assessment
Preparing a Client/Server Readiness Footprint Chart
Task 2: Assess Key Components
Task 3: Assess Client Needs
Check Procedures
Output
Guidelines
Summary
16. Rapid Application Development Testing
Overview
Objective
Concerns
Testing Iterations
Testing Components
Testing Performance
Recording Test Information
Workbench
Input
Do Procedures
Testing Within Iterative RAD
Spiral Testing
Task 1: Determine Appropriateness of RAD
Task 2: Test Planning Iterations
Task 3: Test Subsequent Planning Iterations
Task 4: Test the Final Planning Iteration
Check Procedures
Output
Guidelines
Summary
17. Testing Internal Controls
Overview
Internal Controls
Control Objectives
Preventive Controls
Source-Data Authorization
Data Input
Source-Data Preparation
Turnaround Documents
Prenumbered Forms
Input Validation
File Auto-Updating
Processing Controls
Detective Controls
Data Transmission
Control Register
Control Totals
Documenting and Testing
Output Checks
Corrective Controls
Error Detection and Resubmission
Audit Trails
Cost/Benefit Analysis
Assessing Internal Controls
Task 1: Understand the System Being Tested
Task 2: Identify Risks
Task 3: Review Application Controls
Task 4: Test Application Controls
Testing Without Computer Processing
Testing with Computer Processing
Test-Data Approach
Mini-Company Approach
Transaction Flow Testing
Objectives of Internal Accounting Controls
Systems Control Objectives
Financial Planning and Control Objectives
Cycle Control Objectives
Results of Testing
Task 5: Document Control Strengths and Weaknesses
Quality Control Checklist
Summary
18. Testing COTS and Contracted Software
Overview
COTS Software Advantages, Disadvantages, and Risks
COTS Versus Contracted Software
COTS Advantages
COTS Disadvantages
Implementation Risks
Testing COTS Software
Testing Contracted Software
Objective
Concerns
Workbench
Input
Do Procedures
Task 1: Test Business Fit
Step 1: Testing Needs Specification
Step 2: Testing CSFs
Task 2: Test Operational Fit
Step 1: Test Compatibility
Step 2: Integrate the Software into Existing Work Flows
Step 3: Demonstrate the Software in Action
Task 3: Test People Fit
Task 4: Acceptance-Test the Software Process
Step 1: Create Functional Test Conditions
Step 2: Create Structural Test Conditions
Modifying the Testing Process for Contracted Software
Check Procedures
Output
Guidelines
Summary
19. Testing in a Multiplatform Environment
Overview
Objective
Concerns
Background on Testing in a Multiplatform Environment
Workbench
Input
Do Procedures
Task 1: Define Platform Configuration Concerns
Task 2: List Needed Platform Configurations
Task 3: Assess Test Room Configurations
Task 4: List Structural Components Affected by the Platform(s)
Task 5: List Interfaces the Platform Affects
Task 6: Execute the Tests
Check Procedures
Output
Guidelines
Summary
20. Testing Software System Security
Overview
Objective
Concerns
Workbench
Input
Where Vulnerabilities Occur
Functional Vulnerabilities
Vulnerable Areas
Accidental Versus Intentional Losses
Do Procedures
Task 1: Establish a Security Baseline
Why Baselines Are Necessary
Creating Baselines
Establish the Team
Set Requirements and Objectives
Design Data Collection Methods
Train Participants
Collect Data
Analyze and Report Security Status
Using Baselines
Task 2: Build a Penetration-Point Matrix
Controlling People by Controlling Activities
Selecting Security Activities
Interface Activities
Development Activities
Operations Activities
Controlling Business Transactions
Characteristics of Security Penetration
Building a Penetration-Point Matrix
Task 3: Analyze the Results of Security Testing
Evaluating the Adequacy of Security
Check Procedures
Output
Guidelines
Summary
21. Testing a Data Warehouse
Overview
Concerns
Workbench
Input
Do Procedures
Task 1: Measure the Magnitude of Data Warehouse Concerns
Task 2: Identify Data Warehouse Activity Processes to Test
Organizational Process
Data Documentation Process
System Development Process
Access Control Process
Data Integrity Process
Operations Process
Backup/Recovery Process
Performing Task 2
Task 3: Test the Adequacy of Data Warehouse Activity Processes
Check Procedures
Output
Guidelines
Summary
22. Testing Web-Based Systems
Overview
Concerns
Workbench
Input
Do Procedures
Task 1: Select Web-Based Risks to Include in the Test Plan
Security Concerns
Performance Concerns
Correctness Concerns
Compatibility Concerns
Browser Configuration
Reliability Concerns
Data Integrity Concerns
Usability Concerns
Recoverability Concerns
Task 2: Select Web-Based Tests
Unit or Component
Integration
System
User Acceptance
Performance
Load/Stress
Regression
Usability
Compatibility
Task 3: Select Web-Based Test Tools
Task 4: Test Web-Based Systems
Check Procedures
Output
Guidelines
Summary
Five. Building Agility into the Testing Process
23. Using Agile Methods to Improve Software Testing
The Importance of Agility
Building an Agile Testing Process
Agility Inhibitors
Is Improvement Necessary?
Compressing Time
Challenges
Solutions
Measuring Readiness
The Seven-Step Process
Summary
24. Building Agility into the Testing Process
Step 1: Measure Software Process Variability
Timelines
Process Steps
Workbenches
Time-Compression Workbenches
Reducing Variability
Developing Timelines
Identifying the Workbenches
Measuring the Time for Each Workbench via Many Testing Projects
Defining the Source of Major Variability in Selected Workbenches
Improvement Shopping List
Quality Control Checklist
Conclusion
Step 2: Maximize Best Practices
Tester Agility
Software Testing Relationships
Operational Software
Software Quality Factors
Tradeoffs
Capability Chart
Measuring Effectiveness and Efficiency
Defining Measurement Criteria
Measuring Quality Factors
Defining Efficiency and Effectiveness Criteria
Measuring Effectiveness
Measuring Efficiency
Building Effectiveness and Efficiency Metrics
Identifying Best Practices from Best Projects
Improvement Shopping List
Quality Control Checklist
Conclusion
Step 3: Build on Strength, Minimize Weakness
Effective Testing Processes
Assessing the Process
Developing and Interpreting the Testing Footprint
Poor Testing Processes
Improvement Shopping List
Quality Control Checklist
Conclusion
Step 4: Identify and Address Improvement Barriers
The Stakeholder Perspective
Stakeholder Involvement
Performing Stakeholder Analysis
Red-Flag/Hot-Button Barriers
Staff-Competency Barriers
Administrative/Organizational Barriers
Determining the Root Cause of Barriers/Obstacles
Addressing the Root Cause of Barriers/Obstacles
Quality Control Checklist
Conclusion
Step 5: Identify and Address Cultural and Communication Barriers
Management Cultures
Culture 1: Manage People
Why Organizations Continue with Culture 1
Why Organizations Might Want to Adopt Culture 2
Culture 2: Manage by Process
Why Organizations Continue with Culture 2
Why Organizations Might Want to Adopt Culture 3
Culture 3: Manage Competencies
Why Organizations Continue with Culture 3
Why Organizations Might Want to Adopt Culture 4
Culture 4: Manage by Fact
Why Organizations Continue with Culture 4
Why Organizations Might Want to Adopt Culture 5
Culture 5: Manage Business Innovation
Cultural Barriers
Identifying the Current Management Culture
Identifying the Barriers Posed by the Culture
Determining What Can Be Done in the Current Culture
Determining the Desired Culture for Time Compression
Determining How to Address Culture Barriers
Open and Effective Communication
Lines of Communication
Information/Communication Barriers
Effective Communication
Quality Control Checklist
Conclusion
Step 6: Identify Implementable Improvements
What Is an Implementable?
Identifying Implementables via Time Compression
Prioritizing Implementables
Documenting Approaches
Quality Control Checklist
Conclusion
Step 7: Develop and Execute an Implementation Plan
Planning
Implementing Ideas
Preparing the Work Plan
Checking the Results
Taking Action
Requisite Resources
Quality Control Checklist
Conclusion
Summary