Appendix D. Sample Test Plan

WALLSTREET FINANCIAL TRADING SYSTEM

Delivery 2

Test Plan

(Date)

PREPARED FOR:

FINANCIAL TRADEWINDS CORPORATION

City, State

PREPARED BY:

AUTOMATION SERVICES INCORPORATED (AMSI)

Street Address

City, State

Contents

D.1 Introduction

D.1.1 Purpose

D.1.2 Background

D.1.3 System Overview

D.1.4 Applicable Documents

D.1.5 Master Schedule

D.2 Roles and Responsibilities

D.2.1 Project Organization

D.2.2 Project Roles and Responsibilities

D.2.3 Test Task Structure

D.2.4 Test Team Resources

D.3 Test Program

D.3.1 Scope

D.3.2 Test Approach

D.3.3 Test Strategies

D.3.4 Automated Tools

D.3.5 Qualification Methods

D.3.6 Test Requirements

D.3.7 Test Design

D.3.8 Test Development

D.4 Test Environment

D.4.1 Test Environment Configuration

D.4.2 Test Data

D.5 Test Execution

D.5.1 Test Program Reporting

D.5.2 Test Program Metrics

D.5.3 Defect Tracking

D.5.4 Configuration Management

D.6 Detailed Test Schedule

Appendixes

D.A Test Procedure Development Guidelines

D.B Test Verification Summary Matrix

D.C Test Procedures and Test Scripts

D.1 Introduction

D.1.1 Purpose

This test plan will outline and define the strategy and approach taken to perform testing on the WallStreet Financial Trading System (WFTS) project. It is intended for use by WFTS project personnel in understanding and carrying out prescribed test activities and in managing these activities through successful completion. This document defines the details of test responsibilities and activities and describes the tests to be conducted.

This test plan has been developed to fulfill the following objectives:

• To lay out the management and technical effort necessary to support testing throughout the system development life cycle

• To establish a comprehensive test plan that identifies the nature and extent of tests deemed necessary to achieve the testing objectives for the WFTS project, including software and hardware requirements

• To coordinate an orderly schedule of events, identify equipment and organizational requirements, describe test methodologies and strategies to be used, and identify items to be delivered

• To provide a plan that outlines the contents of detailed test procedure scripts and the execution of those test procedure scripts (that is, which testing techniques will be used)

To help standardize the test effort and make it more efficient, test procedure development guidelines are provided in Appendix D.A. These guidelines have been adopted and are being implemented by the AMSI test team for the WFTS project. The test team will take advantage of testing tools to help improve and streamline the testing process. For further detail on the test strategy, see Section D.3.3 of this plan.

Test procedures are identified and tracked using the Dynamic Object-Oriented Requirements Management System (DOORS) requirements management tool. This approach will allow for easy management of test progress status. Once a test is performed, the test procedure status is revised within DOORS to reflect actual test results, such as pass/fail. Appendix D.B provides a test verification summary matrix that is generated using DOORS; it links the test procedures to test requirements so as to measure test coverage. Test procedures and test scripts supporting system acceptance test (SAT) are provided in Appendix D.C.

D.1.2 Background

The WFTS project was initiated in response to management’s recognition of the need for improvement within the service management operations at Financial Tradewinds Corporation (FTC). A mission element needs statement was developed and approved that authorized the establishment of a new system called the WallStreet Financial Trading System (WFTS).

The project consists of several deliveries. Delivery 1 of the WFTS, which was implemented recently, provided system foundation applications. Delivery 2 involves the development of mission and support applications, which will enable FTC to trade securities and various assets on Wall Street more effectively.

The test requirements definition for the WFTS project is driven by detailed requirements/use cases/use case scenarios (see Section D.3.6) and by the evolutionary nature of additional user input. Use case requirements are maintained within the DOORS requirements management tool. Detailed WFTS use case requirements have been established for Delivery 2 and are used to define test requirements and test procedures. Test documentation—test plans, test procedures, and test results—is captured and stored within DOORS. Additionally, PVCS Tracker will be used to manage software problem reports.

D.1.3 System Overview

This section provides an overview of the WFTS and identifies critical and high-risk functions of the system.

System Description

WFTS presently consists of a suite of hardware and software, including nondevelopmental item (NDI)/commercial off-the-shelf (COTS) software and developmental software. WFTS will provide FTC with daily trading and executive decision-making support. Automation Services Incorporated (AMSI) developed WFTS Delivery 1 and is under contract to develop and test Delivery 2. Figure D.1.1 depicts the WFTS Delivery 2 software architecture. Each block represents a software component (configuration item) of the system. Table D.1.1 summarizes the WFTS software components and their estimated COTS composition.

Figure D.1.1. WFTS Delivery 2 Software Architecture

image

Table D.1.1. WFTS Software Components

image

Critical and High-Risk Functions

During system requirements analysis and requirements specification development, the AMSI test team participated in the review of use case analysis results and WFTS joint application development (JAD) sessions. Critical success and high-risk functions of the WFTS system were identified. These functions include those most critical to the mission of the system and those that help mitigate the greatest risk to successful system operation. These functions have been ranked in priority sequence, as shown in Table D.1.2. This understanding of functional importance serves as an input to test team prioritization of test activities.

Table D.1.2. Critical and High-Risk Functions

image

D.1.4 Applicable Documents

Documents that are pertinent to the WFTS Delivery 2 test program are listed in this section.

Project Documentation

• System Requirements Specification, Delivery 2

• Use Case Scenario Document, Delivery 2

• Software Design Document, Delivery 2

• Interface Design Document, Delivery 2

• WFTS Statement of Work (SOW)

• Concept of Operations

• Management Plan

• Software Development Plan

• Security Test Plan, Delivery 1

• Test Plan, Delivery 1

• Test Report, Delivery 1

• Security Certification Test Report, Delivery 1

• Delivery 2 Kick-Off Meeting Presentation Slides

• Security Requirements and Design Review Meeting Materials, Delivery 2

• Security Review Meeting Report

• User Interface Review Presentation Slides, Delivery 2

• System Implementation Plan, Delivery 2

• Security Plan, Delivery 2 (draft)

• Security Test Plan, Delivery 2 (draft)

Standards Documentation

• Automated Test Life-Cycle Methodology (ATLM)

• Test Procedure Design and Development Standards

• IEEE/EIA 12207 Information Technology Software Life-Cycle Process

• AMSI Standards and Procedures (standard process supporting business analysis phase, requirements phase, design phase, development phase, testing phase, and maintenance phase)

• AMSI Code Inspection Process

• AMSI Programming Style Guide

• AMSI GUI Style Guide

• AMSI Usability Style Guide

Tool Documentation

• TeamTest (Test Management Tool) User Manual

• PVCS Tracker Documentation

• Performance Studio Documentation

• DOORS (Requirements Management Tool) User Manual

• PVCS (Configuration Management Tool) User Manual

• SystemArmor Security Guard Software Documentation

• UNIX Operating System Software Documentation

• InsitFul Securities Trade Visibility Software Documentation

D.1.5 Master Schedule

This section addresses the top-level schedule for the WFTS test program. The test program schedule contains the major events, activities, and deliverables involved in the test program. Activities performed by the test team include the design, development, and execution of tests, as well as inspections of project documentation and software products. The test team will also produce test documentation consisting of the items listed in Table D.1.3.

Table D.1.3. Test Documentation

image

The major events, activities, and documentation to be performed or prepared in support of the WFTS test program are outlined in the test program milestone schedule depicted in Figure D.1.2.

Figure D.1.2. Test Program Milestone Schedule

image

D.2 Roles and Responsibilities

Roles and responsibilities of the various groups are defined in this section.

D.2.1 Project Organization

Figure D.2.1 depicts the WFTS project organization. Reporting to the WFTS project manager are four line supervisors: the software development manager, the systems engineering manager, the product assurance manager, and the functional requirements manager. The software development manager is responsible for software and database design and development, as well as unit- and integration-level software tests. The systems engineering manager leads the system architecture design effort and is responsible for new COTS product evaluations. This manager maintains the network that supports the system development and test environments, and is responsible for database administration of the deployed Delivery 1 WFTS system. The product assurance manager is responsible for test, configuration management, and quality assurance activities.

Figure D.2.1. WFTS Project Organization

image

The test manager is responsible for system test and user acceptance test activities supporting the WFTS system. The functional requirements manager is responsible for requirements analysis, system requirements specification, and maintenance of the requirements baseline. Functional analyst personnel also support development and review of detailed design activities.

D.2.2 Project Roles and Responsibilities

D.2.2.1 Project Management

The project manager is responsible for client relations, project deliverables, schedules, and cost accounting. He or she coordinates with the particular line manager with regard to each technical task performed. The staff of project management specialists maintain project plans, schedules, and cost accounting information. Project management is responsible for ensuring that standards and procedures are followed and implemented appropriately.

D.2.2.2 Functional Requirements

The requirements group is responsible for requirements analysis and system requirements specification and for the derivation of subsequent use cases. This group also supports development and review of detailed design activities.

D.2.2.3 Software Development

The software development group is responsible for software development, as well as unit and integration software tests. It must develop software products in accordance with software development standards and conventions as specified in the software development plan (SDP). The software development group also performs unit and integration test phase planning. The results of unit and integration test phase planning are then provided as input to Section D.3 of the test plan.

For software development items, each developer will maintain a systems development folder (SDF) that contains the design documentation, printed copies of lines of code and user screens generated, development status of the item, and test results applicable to the item.

Test support responsibilities of the software development group include those described here.

Software Product Design and Development

When designing and developing any software or database product, the developer will comply with the software development standards and conventions specified in the SDP. Certain SDP provisions are automatically enforceable, such as the use of system development folders and compliance with the procedures associated with the use of the product development reuse library. Testability will be incorporated into the software as defined in the SDP. The third-party controls (widgets) defined for the development of this system must comply with the list of third-party controls that are compatible with the automated testing tool. The test team will be informed of peer reviews and code walkthroughs initiated by the development team.

Development Documentation

The development team will maintain SDFs. Embedded within the lines of programming code will be documentation in the form of comments. The embedded comments facilitate understanding of software structure and define the purpose of software routines. They will trace or correlate to pseudocode so as to facilitate software design traceability from the actual source code to the design document.

Unit Test Phase

Developers will test individual software units with respect to their function and integrity. Software unit program code will be analyzed to ensure that the code corresponds to functional requirements. Tracing tools will be used to minimize code volume and eliminate dead code. Memory leakage tools will be applied, and code coverage tools will be used to verify that all paths have been tested. Unit testing will be performed in accordance with AMSI standards and procedures and witnessed by the system test team.

Integration Test Phase

Integration testing will be conducted to demonstrate the consistency between the software design and its implementation in accordance with AMSI standards and procedures. Its results will be recorded in the SDFs and inspected for software quality assurance. When software modules are ready to support the integration and system test phases, the source code and all files required for proper generation of the executables will be baselined within the software configuration management tool. Each software build will be generated using the source code products maintained within the software configuration management tool. The software development group will perform integration testing and verify completeness according to integration test procedures, with the system test team witnessing these activities.

The software development group is also responsible for database design and development and all data migration and synchronization activities. Additionally, it helps the test group in setting up a test environment. The database group develops the database in accordance with database development standards and conventions as specified in the SDP.

D.2.2.4 Systems Engineering

The systems engineering group is responsible for the development of the system architecture design, integration of COTS products, research of COTS products, and evaluation of COTS products. As part of COTS integration, the systems engineering group will be responsible for the design and development of software modules as well as testing of the integrated COTS products. The systems engineering group will develop and maintain a simulation model of the WFTS using the OPNET simulation tool. The WFTS simulation model will simulate the major functions of the system and provide information on bottlenecks and queue buildups.

The systems engineering group maintains the network and hardware that support the system development and test environments, and is responsible for database and system security administration of the deployed Delivery 1 WFTS system. The group installs and configures COTS products as required to integrate them with the rest of the system. This group defines the necessary parameters for COTS products and sets them so that the products operate in the target environment. Hardware is installed and configured to reflect a typical end-user site. Upon receipt of the new system equipment that is destined for deployment at an end-user site, the appropriate hardware and system software configurations are installed.

D.2.2.5 Product Assurance

The product assurance group implements test, configuration management, and quality assurance activities. The system test team performs various test activities supporting the WFTS system by following the ATLM. It is responsible for system test and user acceptance test activities supporting the WFTS system; it also witnesses the unit and integration test phases described in Section D.3.

The system test team develops the test plan and procedures, and it performs the tests necessary to ensure compliance with functional, performance, and other technical requirements. Test program activities include the maintenance of test automation reuse libraries, planning and execution of tests, and the development of test reports. These responsibilities are detailed below.

Test Procedures Development

Test procedures will be prepared for system-level testing that provide the test engineer with a step-by-step operational guide (test script) for performing each test. They will exercise both system software (COTS and developmental items) and hardware.

Test procedures will include the test procedure title, test description, references to the system specification, prerequisite conditions for the test, test execution steps (script), expected results, data requirements for the test, acceptance criteria, and actual results. Those test procedures to be used for site acceptance testing will be identified as a result of input from end users.
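
For illustration, the fields listed above can be captured as a simple record structure. The Python sketch below is illustrative only; the class and field names are assumptions made for this example and do not represent the DOORS schema or any project tool.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestProcedure:
    """Illustrative record mirroring the fields a system-level test procedure carries."""
    procedure_id: str            # unique ID per the project naming convention
    title: str
    description: str
    spec_references: List[str]   # references to the system specification
    prerequisites: List[str]     # conditions that must hold before execution
    steps: List[str]             # the test execution steps (script)
    expected_results: List[str]
    data_requirements: List[str]
    acceptance_criteria: str
    actual_results: Optional[str] = None   # filled in after execution
    status: str = "Not Run"                # e.g., Pass/Fail after execution
```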

Unit and Integration Test Phase

The system test team will witness unit and integration test activities.

System Test Phase

The system test team is responsible for system testing; the scope of this testing is described in Section D.3. The test team will document results within the requirements management tool and produce progress reports, as detailed in Section D.1.5.

System Acceptance Test (SAT) Phase

The system test team performs user acceptance testing, as described in Section D.3. The test team will document the results within the requirements management tool and produce progress reports as specified in Section D.1.5.

Test Reports

Raw test data and reports will be kept to indicate the specific pass/fail results of all system hardware and software tests. The test team will prepare a test report at the conclusion of system and user acceptance testing, which will include the raw test data, reports, and a test results summary, together with conclusions and recommendations.

Field/Site Acceptance Testing

This step will involve checkout and performance testing to ensure that equipment and software are installed correctly. Test activities will include verification that the system performs in accordance with specifications and is capable of meeting operational requirements. Site acceptance tests will consist of a reduced set of confirmation tests providing a reasonable check that the system is ready for operation.

D.2.3 Test Task Structure

Table D.2.1 indicates the types of test tasks that may be performed by the system test team for the WFTS test program. This task structure represents the work breakdown structure (WBS) that will be used by the test team to support cost accounting activities on the project.

Table D.2.1. Test Program Work Breakdown Structure

image

image

image

image

image

image

D.2.4 Test Team Resources

The composition of the WFTS test team is outlined within the test team profile depicted in Table D.2.2. This table identifies the test team positions on the project together with the names of the personnel who will fill these positions. The duties to be performed by each person are described, and the skills of the individuals filling the positions are documented. The last two columns reflect the years of experience for each test team member with regard to total test program experience as well as years of experience with the designated test management tool for the project.

Table D.2.2. Test Team Profile

image

The WFTS test team includes both full-time resources and personnel who aid in testing on a part-time basis. The phases supported by each test team member and the availability during each test phase are outlined in Table D.2.3.

Table D.2.3. Test Team Personnel Availability

image

The WFTS test team will need to have a working knowledge of several tools for its test program. Table D.2.4 outlines the experience of the test team members with the test management, requirements management, configuration management, and defect tracking tools. The last column indicates the training required for each test team member.

Table D.2.4. Test Team Training Requirements

image

D.3 Test Program

D.3.1 Scope

The WFTS test program is aimed at verifying that the Delivery 2 WFTS system satisfies the requirements/derived use cases and is ready to be deployed in FTC’s production environment. The test program involves the implementation of a number of test strategies across several test phases, including unit, integration, system, user acceptance, and site acceptance testing.

System-level test effort consists of functional testing, performance testing, backup and recoverability testing, security testing, and verification of system availability measures. Separate security testing is applied to ensure that necessary security mechanisms perform as specified. Site acceptance testing will be performed in association with site installation and checkout activities.

Tests will be comprehensive enough to cover the network, hardware, software application, and databases. Software tests will focus on NDI/COTS and developmental software. The unit and integration test phases will involve tests of newly created or modified software as well as COTS products incorporated in the WFTS Delivery 2 development effort, as noted in Table D.1.1. System and user acceptance tests will exercise Delivery 2 development products and perform regression testing of the existing Delivery 1 application software. Thus the complete WFTS system will be reviewed.

D.3.2 Test Approach

When developing the WFTS test approach, the test team reviewed system requirements/derived use cases and use case scenarios; it also studied the system description and critical/high-risk function information described in Section D.1.3. Using this information, the test team performed a test process analysis exercise to identify a test life cycle. In addition, it analyzed the test goals and objectives that could be applied on the WFTS test effort. The results of these analyses appear in Table D.3.1.

Table D.3.1. Test Process Analysis Documentation

image

In addition to identifying test goals and objectives, the test team documented test program parameters, including its assumptions, prerequisites, system acceptance criteria, and risks.

D.3.2.1 Assumptions

The test team developed this plan based on several assumptions concerning the execution of the WFTS project and their associated effect on the test program.

Test Performance

The test team will perform all tests on the WFTS project, with the exception of unit and integration phase tests, which are performed by the system developers and witnessed by the system test group.

Security Testing

System security tests, designed to satisfy the security test requirements outlined within the security test plan, will be executed during system testing and will be incorporated into the test procedure set constituting the system acceptance test (SAT).

Early Involvement

The test team will be involved with the WFTS application development effort from the beginning of the project, consistent with the ATLM. Early involvement includes the review of requirement statements and use cases/use case scenarios and the performance of inspections and walkthroughs.

Systems Engineering Environment

The suite of automated tools and the test environment configuration outlined within this plan are based upon existing systems engineering environment plans outlined within the WFTS management plan and the software development plan. Changes in the systems engineering environment will require subsequent and potentially significant changes to this plan.

Test Team Composition

The test team will include three business area functional analysts. These analysts will be applied to the system test effort according to their functional area expertise. While on loan to the test group, these analysts will report to the test manager regarding test tasks and be committed to the test effort. They will support the test effort for the phases and percentages of their time noted in Section D.2.4.

Test Limitations

Given the resource limitations of the test program and the limitless number of test paths and possible input values, the test effort has been designed to focus attention on the most critical and high-risk functions of the system. Defect tracking and its associated verification effort likewise focus on assessing these functions and meeting acceptance criteria, so as to determine when the application under test (AUT) is ready to go into production.

Project Schedule

Test resources defined within the test plan are based upon the current WFTS project schedule and requirement baseline. Changes to this baseline will require subsequent changes to this plan.

D.3.2.2 Test Prerequisites

The WFTS test program schedule depicted in Figure D.1.2 includes the conduct of a system walkthrough. This walkthrough involves a demonstration that system test procedures are ready to support user acceptance testing.

The conduct of this walkthrough and subsequent performance of SAT requires that certain prerequisites be in place. These prerequisites may include activities, events, documentation, and products. The prerequisites for the WFTS test program execution are as follows:

• The full test environment configuration is in place, operational, and under CM control.

• The test data environment has been established and baselined.

• All detailed unit and integration test requirements have been successfully exercised as part of the unit and integration test phases.

• Materials supporting test-by-inspection and certification methods are on hand. Materials representing evidence of test-by-analysis are on hand.

• The system test procedure execution schedule is in place.

• Automated test procedure reuse analysis has been conducted.

• A modularity-relationship model has been created.

• System test procedures have been developed in accordance with standards.

• The WFTS system baseline software has been installed in the test environment and is operational.

D.3.2.3 System Acceptance Criteria

The WFTS test program within the AMSI test environment concludes with the satisfaction of the following criteria. In accordance with the test schedule depicted in Figure D.1.2, two site acceptance tests are performed following completion of these criteria.

• SAT has been performed.

• Priority 1–3 software problem reports reported during SAT and priority 2–3 software problem reports that existed prior to SAT have been resolved. The test group has verified the system corrections implemented to resolve these defects.

• A follow-up SAT has been conducted, when required, to review test procedures associated with outstanding priority 1–3 software problem reports. Successful closure of these software problem reports has been demonstrated.

• A final test report has been developed by the test team and approved by FTC.

D.3.2.4 Risks

Risks to the test program (see Table D.3.2) have been identified, assessed for their potential effects, and then mitigated with a strategy for overcoming the risk should it be realized.

Table D.3.2. Test Program Risks

image

D.3.3 Test Strategies

Drawing on the defined test goals and objectives and using the ATLM as a baseline, the test team defined the test strategies that will be applied to support the WFTS test program. The test team will utilize both defect prevention and defect removal technologies as shown in Table D.3.3.

Table D.3.3. Test Strategies and Techniques

image

The AMSI test team will execute the SAT. It will develop test threads to exercise the requirements specified in the detailed requirements/use case documents. The test procedures will specify how a test engineer should execute the test by defining the input requirements and the anticipated results. The detail of this information is controlled through the DOORS test tool and is available on-line. The DOORS database serves as the repository for system requirements and test requirements.

The DOORS requirements management tool is used for managing all systems requirements, including business, functional, and design requirements. It is also used for capturing test requirements and test procedures, thus allowing for simple management of the testing process. Using the DOORS scripting language and the associated .dxl files, the test team can automatically create a traceability matrix that will measure the coverage progress of test procedures per test requirements. In turn, test procedures will be derived from the detailed business requirements and use cases and stored in the DOORS database.
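
To illustrate the coverage measurement described above, the following sketch computes requirement-to-procedure coverage from exported traceability links. It is written in Python rather than DXL purely for illustration; the function, the requirement and procedure identifiers, and the data structures are assumptions and do not reflect the actual DOORS schema or the project's .dxl scripts.

```python
from collections import defaultdict

def coverage_report(test_requirements, links):
    """Report which test requirements are covered by at least one test procedure.

    test_requirements: iterable of test requirement IDs
    links: iterable of (requirement_id, procedure_id) pairs exported from the
           requirements management tool
    """
    procs_per_req = defaultdict(list)
    for req_id, proc_id in links:
        procs_per_req[req_id].append(proc_id)

    covered = [r for r in test_requirements if procs_per_req[r]]
    uncovered = [r for r in test_requirements if not procs_per_req[r]]
    pct = 100.0 * len(covered) / len(test_requirements) if test_requirements else 0.0
    return {"covered": covered, "uncovered": uncovered, "coverage_pct": pct}

# Hypothetical IDs: three test requirements, two of which trace to procedures.
report = coverage_report(
    ["TR-001", "TR-002", "TR-003"],
    [("TR-001", "TP-101"), ("TR-002", "TP-102")],
)
print(f"Coverage: {report['coverage_pct']:.0f}%, uncovered: {report['uncovered']}")
```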

The highest-risk functionality has been identified, and the test effort will focus on this functionality. Reuse analysis will be conducted of existing test procedures to avoid rework of automated test procedures available from previous testing efforts. If the automated test tool is not compatible with some of the functionality and no feasible automation work-around solutions can be found, tests will be executed manually.

A modularity model will be created that depicts the relationships among the test procedures. Test procedures will be broken down and assigned to the various test engineers, based on the requirements category and the test engineer’s business knowledge and expertise. Progress will be monitored and test procedure walkthroughs will be conducted to verify the accuracy of test procedures and to discover any discrepancies with the business requirement.

The WFTS system will be modeled for scalability using the simulation modeling tool OPNET. This model will simulate the major functions of the WFTS system and provide information about bottlenecks and queue buildups. Inputs to OPNET include arrival rates of the various transactions, the sizes of the transactions, and the processing times at the various stages of the process flow. After the model is built, it must be validated against the test data obtained from the performance testing process. Once this validation is complete, the model can be used to examine what-if scenarios and to predict performance under varying conditions.
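
As a simplified illustration of the what-if analysis this model supports, the following Python sketch approximates a single processing stage as a first-come-first-served queue fed by transaction arrivals. OPNET models are considerably more detailed; the arrival rates, service rate, and duration below are invented figures for illustration only.

```python
import random

def simulate_stage(arrival_rate, service_rate, duration, seed=42):
    """Simulate one first-come-first-served processing stage (single server).

    arrival_rate  - mean transaction arrivals per second
    service_rate  - mean transactions processed per second
    duration      - simulated time span in seconds
    Returns (mean wait, worst wait) in seconds, a crude signal of queue buildup.
    """
    rng = random.Random(seed)
    t = 0.0            # time of the current arrival
    server_free = 0.0  # time at which the server next becomes idle
    waits = []
    while t < duration:
        t += rng.expovariate(arrival_rate)                    # next arrival
        start = max(t, server_free)                           # wait if the server is busy
        waits.append(start - t)
        server_free = start + rng.expovariate(service_rate)   # service time
    if not waits:
        return 0.0, 0.0
    return sum(waits) / len(waits), max(waits)

# What-if: transaction volume grows from 5 to 9 per second against a stage that
# can process roughly 10 per second (illustrative numbers only).
for rate in (5, 7, 9):
    mean_wait, worst_wait = simulate_stage(rate, 10, duration=3600)
    print(f"arrivals={rate}/s  mean wait={mean_wait:.2f}s  worst wait={worst_wait:.2f}s")
```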

D.3.4 Automated Tools

The test team for the WFTS project will use the automated test tools listed in Table D.3.4. The development team uses the PureCoverage and Purify tools during unit testing. During system acceptance testing, the test team will use TestStudio. The application will be analyzed for functionality that lends itself to automation. This strategy will streamline the process of creating and testing certain redundant transactions. Test scripts will be developed following the test procedure development guidelines defined in Appendix D.A.

Table D.3.4. Automated Test Tools

image

If software problems are detected, the team will generate defect reports. Software problem reports will be reported to system developers through PVCS Tracker. The DOORS database supports the FTC repository for system requirements, test requirements, and related software problem reports.

TestStudio will be used as the GUI automated test tool. DOORS will serve as the requirements management tool. Performance Studio will be used for performance and stress testing. TestStudio Test Procedure (Case) Generator will be used to create a baseline of test procedures.

D.3.5 Qualification Methods

For each test requirement, a testability indicator/qualification method will be used. The following qualification methods will be employed in test procedure steps to verify that requirements have been met:

Inspection. Inspection verifies conformance to requirements by visual examination, review of descriptive documentation, and comparison of the actual characteristics with predetermined criteria.

Demonstration. Demonstration verifies conformance to requirements by exercising a sample of observable functional operations. This method is appropriate for demonstrating the successful integration, high-level functionality, and connectivity provided by NDI and COTS software. NDI and COTS products are certified by vendors to have been developed and tested in accordance with software development and quality processes.

Tests. Testing verifies conformance to requirements by exercising observable functional operations. This method is generally more extensive than that used in demonstrations and is appropriate for requirements fulfilled by developmental items.

Manual Tests. Manual tests will be performed when automated tests are not feasible.

Automated Tests. When automation analysis outcome is positive, the test procedures will be automated.

Analysis. Analysis verifies conformance to requirements by technical evaluation, processing, review, or study of accumulated data.

Certification. Certification verifies conformance to requirements by examination of vendor (or supplier) documentation attesting that the product was developed and tested in accordance with the vendor’s internal standards.

D.3.6 Test Requirements

Test requirements have been derived from requirements/use cases/use case scenarios developed for the application. In the requirements traceability matrix maintained within the DOORS database, system requirements are mapped to test requirements. The test team worked with the project manager and development team to prioritize system requirements for testing purposes. The test team entered the priority values within DOORS, as shown in the test verification summary matrix depicted in Appendix D.B.

D.3.7 Test Design

D.3.7.1 Test Program Model

Armed with a definition of test requirements and an understanding of the test techniques that are well suited to the WFTS test program, the test team developed the test program model, which depicts the scope of the test program. The model includes test techniques that will be employed at the development test and system test levels as well as the applicable static test strategies, as shown in Figure D.3.1.

Figure D.3.1. Test Program Model

image

D.3.7.2 Test Architecture

Having defined a test program model, the test team next constructed a test architecture for the WFTS project. The test architecture depicts the structure of the test program, defining the way that test procedures will be organized in the test effort. Figure D.3.2 depicts the test architecture for the WFTS project, where development-level tests are design-based and system-level tests are technique-based.

Figure D.3.2. Sample Test Architecture

image

The design components shown in Figure D.3.2 were retrieved by the test team from the project’s software architecture. Five components are being tested at the development level: System Management (SM-06), Security Guard (SG-07), Distributed Computing (DC-08), Support Applications (SA-09), and Active Trade Visibility (TV-10). For each of these design components, the test techniques that will be applied are noted.

D.3.7.3 Test Procedure Definition

A preliminary step in the test design process involves the development of the test procedure definition, which aids in test development and helps to bound the test effort. The test procedure definition identifies the suite of test procedures that must be developed and executed for the test effort. The design exercise involves the organization of test procedures into logical groups and the allocation of test procedure number series for each set of tests required.

Table D.3.5 depicts a sample test procedure definition for development-level tests. Column 1 of this table identifies the series of test procedure numbers allotted for testing of the particular design component using the particular technique. Column 2 lists the software or hardware design components to be tested. The design components referenced are retrieved from the test architecture. The test technique is listed in column 3, and the number of test procedures involved in each set of tests (row) is estimated in column 4.

Table D.3.5. Test Procedure Definition (Development Test Level)

image

Table D.3.6 depicts a sample test procedure definition for system-level tests. Column 1 of this table identifies the series of test procedures allotted for each particular test technique. Column 2 lists the test technique.

Table D.3.6. Test Procedure Definition (System Test Level)

image

Columns 3 through 5 provide information to specify the number of test procedures involved at the system test level. The number of design units or functional threads required for the tests is given in column 3. Four functional threads are planned to support stress and performance testing. Note that usability tests will be conducted as part of functional testing; as a result, no additional test procedures are needed for this test technique. The number of system requirements involved in the tests is identified in column 4, and the number of test requirements is given in column 5.

The last column estimates the number of test procedures required for each test technique listed. For functional and security testing, there may be one test procedure for every test requirement. For stress and performance testing, four threads are planned that will need to be altered for each test procedure to examine different system requirements.
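
The arithmetic behind the estimates in the last column can be sketched as follows. The per-technique rules follow the description above, while the requirement counts in the example are placeholders rather than the actual values from Table D.3.6.

```python
def estimate_procedures(technique, requirement_count, threads=4):
    """Estimate the number of test procedures for one system-level test technique."""
    if technique in ("functional", "security"):
        # roughly one test procedure per test requirement
        return requirement_count
    if technique in ("stress", "performance"):
        # the four planned functional threads, altered per procedure
        return threads
    if technique == "usability":
        # conducted as part of functional testing, so no additional procedures
        return 0
    raise ValueError(f"no estimation rule defined for {technique!r}")

# Placeholder requirement counts, not the real Table D.3.6 values:
for tech, reqs in [("functional", 120), ("security", 25),
                   ("stress", 0), ("performance", 0), ("usability", 0)]:
    print(tech, estimate_procedures(tech, reqs))
```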

D.3.7.4 Test Procedure Naming Convention

With the test procedure definition in place for both the development and system levels, the test team adopted a test procedure naming convention to uniquely identify the test procedures on the project. Table D.3.7 provides the test procedure naming scheme for the WFTS project.

Table D.3.7. Test Procedure Naming Convention

image

With the various tests defined, the test team identified the test procedures that warrant automation and those that can be performed most efficiently via manual methods. Table D.3.8 depicts a portion of a traceability matrix that is maintained using DOORS, which breaks down each test procedure required for system-level testing. Each test procedure in Table D.3.8 is cross-referenced to several other elements, such as design component and test technique. The last column identifies whether the test will be performed using an automated test tool (A) or manually (M).

Table D.3.8. Automated versus Manual Tests

image

D.3.8 Test Development

Tests are automated based on the automation analysis outcome of the test design phase, as shown in Table D.3.8. They are developed in accordance with the test procedure execution schedule and the modularity-relationship model. Test development must be consistent with the test development guidelines provided in Appendix D.A. Additionally, test procedures will be developed using the automatic test procedure generation feature of the TestStudio test tool.

The test team prepared a test development architecture, depicted in Figure D.3.3, that provides a clear picture of the test development activities (building blocks) necessary to create test procedures. The test development architecture illustrates the major activities to be performed as part of test development.

Figure D.3.3. Test Development Architecture

image

To conduct its test development activities efficiently, the test team performed an analysis to identify the potential for reuse of existing test procedures and scripts within the AMSI automation infrastructure (reuse library). The results of this reuse analysis are maintained using the DOORS tool and are depicted in Table D.3.9.

Table D.3.9. Automation Reuse Analysis

image

D.4 Test Environment

D.4.1 Test Environment Configuration

The test environment mirrors the production environment. This section describes the hardware and software configurations that compose the system test environment. The hardware must be sufficient to ensure complete functionality of the software. Also, it should support performance analysis aimed at demonstrating field performance. Information concerning the test environment pertinent to the application, database, application server, and network is provided below.

Application

Visual Basic 5.0

Iona’s Orbix V2.3

Microsoft’s Internet Information Server

Neonet V3.1

MQ Series V2.0

Windows NT V4.0 service pack 3

Application Server

Dual-processor PC, 200MHz Pentium processors

256MB Memory

4–6GB hard disk, CD-ROM drive

2 Syngoma 503E SNA boards

Microsoft SNA Server 3.0

Digital DCE 1.1C with Eco patch

Encina 2.5 with patches

Windows NT 4.0 with service pack 3

Database

Sybase 11 Server V11.x.1 application server

Microsoft’s SNA Server V4.0

Digital DCE Client and Server with Eco patch V1.1c

Encina V2.5 with patches

Workstation

Windows NT V4.0 service pack 3

Iona’s Orbix V2.3

Sybase Configuration

Application: Sybase 11 Open Client CT-Lib V11.1.0

Database: Sybase 11 Server V11.x.1

Sun Solaris for the database server

Network Configuration

Ethernet switched network

Baseline test laboratory equipment for WFTS central site configurations was acquired for development and testing performed in support of the Delivery 1 WFTS system. Delivery 2 requirements involve additional functionality; as a result, the scope of the test effort must be modified accordingly. Two site configurations must be added to the WFTS test lab configuration. The procurement of additional hardware and software resources is reflected in the test equipment list given in Table D.4.1.

Table D.4.1. Test Equipment Purchase List

image

D.4.2 Test Data

Working in conjunction with the database group, the test team will create the test database. The test database will be populated with unclassified production data. The configuration management group will baseline the test environment, including the test database. Additionally, during performance testing, test data will be generated using Rational’s Performance Studio tool. These data will be baselined in the PVCS configuration management tool. To assure adequate testing depth (volume of test database of 10 records versus 10,000 records), the test team will mirror the production-size database during performance testing. To assure adequate testing breadth (variation of data values), it will use data with many variations, again mirroring the production data environment. Test data will use the procedure data definitions, whenever possible.
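
Where production data cannot be copied directly, a small generator can provide the depth and breadth described above. The following Python sketch is illustrative only; it is not the Performance Studio data generation facility, and the file name, field names, and value ranges are invented for this example.

```python
import csv
import random

def generate_trades(path, rows, seed=7):
    """Write a CSV of varied, fictitious trade records for test database loading.

    rows controls testing depth (e.g., 10 versus 10,000 records); the randomized
    fields provide breadth (variation of data values).
    """
    rng = random.Random(seed)
    symbols = ["FTC", "AMSI", "WFTS", "ACME", "GLOB"]   # fictitious ticker symbols
    sides = ["BUY", "SELL"]
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["trade_id", "symbol", "side", "quantity", "price"])
        for i in range(1, rows + 1):
            writer.writerow([
                i,
                rng.choice(symbols),
                rng.choice(sides),
                rng.randint(1, 10_000),               # lot sizes, small to large
                round(rng.uniform(0.01, 500.0), 2),   # prices across a wide range
            ])

# Depth check: mirror a production-sized table rather than a 10-row sample.
generate_trades("wfts_test_trades.csv", rows=10_000)
```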

Table D.4.2 is a matrix that cross-references test data requirements to each individual test procedure that is planned for system testing.

Table D.4.2. System Test Data Definition

image

D.5 Test Execution

D.5.1 Test Program Reporting

An earned value management system will be used to track test program progress, including cost and schedule measures. Earned value involves tracking of the value of completed work relative to planned costs and actual costs, so as to provide a true measure of cost status and to enable AMSI’s personnel to define effective corrective actions. Four primary steps make up the earned value process:

  1. Identify short tasks (functional test phase).
  2. Schedule each task (task start date and end date).
  3. Assign a budget to each task (task will require 3,100 hours using four test engineers).
  4. Measure the progress of each task (schedule and cost variance).

The primary tasks to be performed by the test team have been identified consistent with the work breakdown structure outlined in Table D.2.1. A detailed test schedule has been prepared identifying each task. For each task, timeframes have been determined and hours and personnel have been allocated. The SAT test execution schedule is detailed in Section D.6.
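
The schedule and cost variance measures referenced in step 4 are derived from planned value, earned value, and actual cost. The sketch below shows that arithmetic; the hour figures are illustrative and do not represent actual WFTS task status.

```python
def earned_value_status(planned_value, earned_value, actual_cost):
    """Return standard earned value measures for one task (in hours or dollars).

    planned_value - budgeted cost of work scheduled to date
    earned_value  - budgeted cost of work actually performed to date
    actual_cost   - actual cost of work performed to date
    """
    schedule_variance = earned_value - planned_value   # negative: behind schedule
    cost_variance = earned_value - actual_cost         # negative: over budget
    spi = earned_value / planned_value if planned_value else 0.0
    cpi = earned_value / actual_cost if actual_cost else 0.0
    return schedule_variance, cost_variance, spi, cpi

# Example: the 3,100-hour functional test task, assumed halfway through its schedule.
sv, cv, spi, cpi = earned_value_status(planned_value=1550, earned_value=1400, actual_cost=1600)
print(f"SV={sv:+}h  CV={cv:+}h  SPI={spi:.2f}  CPI={cpi:.2f}")
```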

After a test procedure has been executed, the test team will undertake evaluation activities to assure that the test outcome was not the result of a false-positive or false-negative condition. The test procedure status is then revised within the requirements management tool to reflect actual test results, such as full, partial, or failed demonstration of compliance with the expected outcome, as defined in the test procedure.

D.5.2 Test Program Metrics

Table D.5.1 shows the test progress metrics that will be collected and reported. The quality assurance group will report on the quality metrics.

Table D.5.1. Test Program Metrics

image

D.5.3 Defect Tracking

To track defects, a defect workflow process has been implemented. Defect workflow training will be conducted for all test engineers. The steps in the defect workflow process are as follows:

  1. When a defect is generated initially, the status is set to “New.” (Note: How to document the defect, what fields need to be filled in, and so on also need to be specified.)
  2. The tester selects the type of defect:

    • Bug

    • Cosmetic

    • Enhancement

    • Omission

  3. The tester then selects the priority of the defect:

    • Critical—fatal error

    • High—needs immediate attention

    • Medium—needs to be resolved as soon as possible but not a showstopper

    • Low—cosmetic error

  4. A designated person (in some companies, the software manager; in other companies, a special board) evaluates the defect, assigns a status, and modifies the type of defect and/or priority, if applicable.

    • The status “Open” is assigned if it is a valid defect.

    • The status “Close” is assigned if it is a duplicate defect or user error. The reason for closing the defect needs to be documented.

    • The status “Deferred” is assigned if the defect will be addressed in a later release.

    • The status “Enhancement” is assigned if the defect is an enhancement requirement.

  5. If the status is determined to be “Open,” the software manager (or other designated person) assigns the defect to the responsible person (developer) and sets the status to “Assigned.”
  6. Once the developer is working on the defect, the status can be set to “Work in Progress.”
  7. After the defect has been fixed, the developer documents the fix in the defect tracking tool and sets the status to “Fixed” if it was fixed, or to “Duplicate” if the defect is a duplicate (specifying the duplicated defect). The status can also be set to “As Designed” if the function executes correctly. At the same time, the developer reassigns the defect to the originator.
  8. Once a new build is received with the implemented fix, the test engineer retests the fix and other possible affected code. If the defect has been corrected with the fix, the test engineer sets the status to “Close.” If the defect has not been corrected with the fix, the test engineer sets the status to “Reopen.”
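
The workflow above can be summarized as a state transition table. The following Python sketch is illustrative only; the states and transitions are taken from the numbered steps, while the representation itself is an assumption and is not how PVCS Tracker implements the workflow.

```python
# Allowed status transitions, as described in the defect workflow steps above.
DEFECT_TRANSITIONS = {
    "New":              {"Open", "Close", "Deferred", "Enhancement"},  # step 4
    "Open":             {"Assigned"},                                  # step 5
    "Assigned":         {"Work in Progress"},                          # step 6
    "Work in Progress": {"Fixed", "Duplicate", "As Designed"},         # step 7
    "Fixed":            {"Close", "Reopen"},                           # step 8
    "Reopen":           {"Assigned"},   # assumption: reopened defects are reassigned
}

def change_status(current, new):
    """Validate a requested defect status change against the workflow."""
    if new not in DEFECT_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current!r} -> {new!r}")
    return new

# Example: a defect that is opened, assigned, fixed, and verified.
status = "New"
for nxt in ("Open", "Assigned", "Work in Progress", "Fixed", "Close"):
    status = change_status(status, nxt)
print("final status:", status)
```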

Defect correction is the responsibility of system developers; defect detection is the responsibility of the AMSI test team. The test leads will manage the testing process, but the defects will fall under the purview of the configuration management group. When a software defect is identified during testing of the application, the tester will notify system developers by entering the defect into the PVCS Tracker tool and filling out the applicable information.

AMSI test engineers will add any attachments, such as a screen print, relevant to the defect. The system developers will correct the problem at their facility and implement the fix in the operational environment after the software has been baselined. This release will be accompanied by notes that detail the defects corrected in this release as well as any other areas that were changed as part of the release. Once the fix is implemented, the test team will perform regression testing of each modified area.

The naming convention for attachments will be the defect ID (yyy) plus Attx (where x = 1, 2, 3, ..., n); for example, the first attachment for defect 123 should be called 123Att1. If additional changes have been made other than those required for previously specified software problem reports, they will be reviewed by the test manager, who will evaluate the need for additional testing. If deemed necessary, the manager will plan additional testing activities. He or she will have the responsibility for tracking defect reports and ensuring that all reports are handled on a timely basis.
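
The attachment naming rule can be expressed as a one-line helper; this sketch simply restates the convention described above and is not part of any project tool.

```python
def attachment_name(defect_id: int, sequence: int) -> str:
    """Build an attachment name from the defect ID and the attachment sequence number.

    Per the convention, the first attachment for defect 123 is "123Att1".
    """
    return f"{defect_id}Att{sequence}"

assert attachment_name(123, 1) == "123Att1"
```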

D.5.4 Configuration Management

The CM department is responsible for all CM activities and will verify that all parties involved are following the defined CM procedures. System developers will provide object code only for all application updates. It is expected that system developers will baseline their code in a CM tool before each test release. The AMSI test team will control the defect reporting process and monitor the delivery of associated program fixes. This approach will allow the test team to verify that all defect conditions have been properly addressed.

D.6 Detailed Test Schedule

A detailed SAT test schedule (portion of schedule) is provided in Table D.6.1.

Table D.6.1. Test Schedule

image

Appendix D.A Test Procedure Development Guidelines

AMSI’s standard test procedure development guidelines for the WFTS project are outlined below. These guidelines are available in the AMSI CM library.

Table D.A.1. Test Development Guidelines

image

image

Appendix D.B Test Verification Summary Matrix

A description of the columns contained within the test verification summary matrix is provided in Table D.B.1, and the actual test verification summary matrix for WFTS Delivery 2 is provided in Table D.B.2. The test verification summary matrix represents an example of the type of requirements traceability matrix that can be generated using DOORS. This matrix links the test procedures to test requirements, enabling the test team to verify the test coverage.

Table D.B.1. Test Verification Summary Matrix Terminology

image

Table D.B.2. Test Verification Summary Matrix

image

Appendix D.C Test Procedures and Test Scripts

Manual test procedures supporting SAT are documented within the DOORS database. Automated test procedures and test scripts supporting SAT are maintained using the TeamTest test tool.
