13 DO-330 and Software Tool Qualification

Acronyms

CNS/ATM communication, navigation, surveillance, and air traffic management
COTS commercial off-the-shelf
FAA Federal Aviation Administration
FAQ frequently asked question
MC/DC modified condition/decision coverage
MISRA Motor Industry Software Reliability Association
PSAC Plan for Software Aspects of Certification
SAS Software Accomplishment Summary
SC-205 Special Committee #205
SLECI Software Life Cycle Environment Configuration Index
TOR Tool Operational Requirement
TQL tool qualification level
TQP Tool Qualification Plan
WG-71 Working Group #71

13.1 Introduction

A software tool is “a computer program or a functional part thereof, used to help develop, transform, test, analyze, produce, or modify another program, its data, or its documentation” [1]. The use of tools in the software life cycle process, as well as in other domains such as systems, programmable hardware, and aeronautical databases, has exploded in recent years. I recently worked on a project that utilized around 50 software tools. It was a huge project with many specialized needs, so it is not typical. However, it does show how the use of tools has increased and will continue to increase. Table 13.1 identifies some common ways that tools are used in the software life cycle and divides them into the following categories: development, verification, and other.

Table 13.1 Examples of Tool Usage in the Software Life Cycle

Development Tools:

  • Requirements capture and management

  • Design

  • Modeling

  • Text editing

  • Compiling

  • Linking

  • Automatic code generation

  • Configuration file generation

  • Debugging

Verification Tools:

  • Static analysis

  • Worst-case execution timing analysis

  • Model verification

  • Coding standards conformity

  • Trace verification

  • Structural coverage analysis

  • Automatic testing

  • Emulation

  • Simulation

  • Automatic test generation

  • Configuration file verification

  • Formal methods

Other Tools:

  • Project management

  • Configuration management

  • Problem reporting

  • Peer review management

Tools cannot replace the human brain; however, they can prevent errors and identify errors that humans might insert or fail to identify. Tools can help engineers do their job better and allow them to concentrate on the more challenging problems that require engineering skill and judgment.

Some engineers are resistant to using tools, while others go completely overboard and use tools for everything. There is a happy medium that must be reached when using tools to successfully develop software that performs its intended function.

In 2004, I was privileged to lead the Software Tools Forum. The forum was cosponsored by Embry-Riddle Aeronautical University and the Federal Aviation Administration (FAA). The purpose of the effort was to assess the state of tools in aviation and to identify issues that needed to be resolved to enable the industry to safely reap the benefits of tools. Approximately 150 participants from industry, government, and academia attended. The objectives of the forum were to (1) share information on software tools used in aviation projects, (2) discuss lessons learned to date regarding software tools used in aviation, (3) discuss the challenges of tools in safety-critical systems, and (4) consider next steps.

At the end of the forum, a brainstorming session was held to identify some of the most significant issues regarding effective use of tools in aviation. The brainstorming session identified six needs categories, which were prioritized by the industry. The first five categories are specific; the last one is a catchall for miscellaneous tool issues. The six categories are listed here in order of priority (items 1–3 were weighted much higher than items 4–6):

  1. Development tool qualification criteria need to be modified.

  2. Criteria for model-based development tools need to be established.

  3. Criteria that enable tool qualification credit to be carried from one program to another need to be developed.

  4. Different approaches to autocode generator usage and qualification need to be developed and documented.

  5. Integration tools pose new challenges that need to be addressed.

  6. A number of miscellaneous tool issues need to be addressed.

Soon after the tools event, the RTCA Special Committee #205 (SC-205) and EUROCAE Working Group #71 (WG-71) joint committee was formed to update DO-178C and provide necessary guidance in a number of technical areas. The recommendations from the Software Tools Forum and other sources were used as input to the committee. The SC-205/WG-71 tool qualification subgroup* took the issues seriously and worked to document guidance that would address them. The results were an update to DO-178C section 12.2 and the production of DO-330, entitled Software Tool Qualification Considerations. During the 6.5 years that it took to finish the SC-205/WG-71 effort, I also reviewed and approved dozens of software tools and their qualification data. The subject of software tool qualification is one near and dear to my heart.

This chapter provides an overview of DO-178C section 12.2 and DO-330 in order to explain when a tool needs to be qualified, what level of qualification is required, and how to qualify a tool. DO-330 is 128 pages long; therefore, only an overview is provided, focusing on the aspects of DO-330 most critical to tool developers and tool users. The differences between DO-178B and the updated guidance are also discussed, since numerous tools in use today were qualified using DO-178B criteria. In addition, some special topics related to tool qualification and some potential pitfalls to avoid when qualifying or using qualified tools are discussed.

13.2 Determining Tool Qualification Need and Level (DO-178C Section 12.2)

Table 13.1 identifies common uses for tools in the software life cycle. Some of those tools may need to be qualified and others may not. This section examines DO-178C section 12.2 to answer the following questions:

  • What is tool qualification?

  • When is it required?

  • To what level must a tool be qualified?

  • How do DO-178B and DO-178C guidance on tool qualification differ?

Tool qualification is the process used to gain certification credit for a software tool whose output is not verified, when the tool eliminates, reduces, or automates processes required by DO-178C. Tool qualification is granted in conjunction with the approval of the software that uses the tool; it is not a stand-alone approval. The tool qualification process provides confidence in the tool functionality; the confidence is at least equivalent to the process(es) being eliminated, reduced, or automated [2]. The rigor required for the tool qualification effort “varies based upon the potential impact that a tool error could have on the system safety and upon the overall use of the tool in the software life cycle process. The higher the risk of a tool error adversely affecting system safety, the higher the rigor required for tool qualification” [2]. Figure 13.1 graphically shows the process to determine if a tool needs to be qualified and to what level.


Figure 13.1 Determining tool qualification need and level.
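The decision flow of Figure 13.1 can be sketched in code. This is my own simplified interpretation of DO-178C section 12.2 for illustration; the function names and boolean inputs are not normative text:

```python
def needs_qualification(eliminates_reduces_or_automates: bool,
                        output_verified: bool) -> bool:
    """Qualification is needed when a tool eliminates, reduces, or
    automates a DO-178C process AND its output is not verified."""
    return eliminates_reduces_or_automates and not output_verified

def tool_criteria(output_is_part_of_software: bool,
                  automates_verification_and_justifies_reduction: bool) -> int:
    """Simplified criteria selection (see Table 13.2):
    1 = output is part of the resulting software (could insert an error),
    2 = automates verification AND justifies reducing other processes,
    3 = could merely fail to detect an error."""
    if output_is_part_of_software:
        return 1
    if automates_verification_and_justifies_reduction:
        return 2
    return 3
```

For example, a structural coverage tool whose results are not independently verified needs qualification (it automates a DO-178C process and its output is unverified) and, since its output is not part of the software and it justifies no other reduction, it falls under Criteria 3.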

DO-178B defined two types of tools: (1) software development tools and (2) software verification tools. Because these two categories were often tied to the life cycle phase rather than the tool impact, DO-178C does not use the terms development tools and verification tools. Instead, DO-178C identifies three criteria which focus on the potential impact the tool could have. Table 13.2 shows the comparison of the DO-178B and DO-178C tool categories.

The qualification process for Criteria 1 tools under DO-178C is very similar to that for development tools in DO-178B. Examples of Criteria 1 tools include autocode generators, configuration file generators, compilers, linkers, requirements management tools, design tools, and modeling tools. Likewise, the qualification process for Criteria 3 tools under DO-178C is about the same as for verification tools in DO-178B. Examples of these tools include test case generators, automated test tools, structural coverage tools, data coupling analysis tools, and static code analyzers. The primary difference between DO-178B and DO-178C is that DO-178C introduces a third category of tools: the Criteria 2 tool. The need for this special category was primarily driven by consideration of future tools that may make more critical decisions than today’s verification tools but do not have as much impact on the resultant software as today’s development tools. Some formal methods tools fall into this category. For example, a single proof tool may be used to automate some of the source code verification steps and to reduce the amount of testing needed. Either of these actions alone would make it a Criteria 3 tool, but the combination pushes it into the Criteria 2 realm. In this situation, the proof tool is used to verify process(es) other than those automated by the tool. Another example of a Criteria 2 tool is a static code analyzer that is used both to replace source code review (a verification step) and to reduce design mechanisms to detect overflow (a development step). This tool performs some verification, but it also reduces the software development process [1]. I tend to think of Criteria 2 tools as super verification tools. With a single tool, you can knock out objectives from multiple DO-178C Annex A tables. Traditional verification tools normally just automate objectives from a single DO-178C Annex A table (e.g., Table A-6 or A-7), but these super verification tools might satisfy objectives from multiple tables (e.g., A-5 through A-7). I do not expect that there will be many Criteria 2 tools; they really should be the exception and not the rule. However, this criterion was included in DO-330 to ensure that appropriate insight into the tool design is visible when a single tool carries out multiple purposes.

Table 13.2 DO-178B Tool Categories Compared to DO-178C Tool Criteria

DO-178C Tool Qualification Criteria and Definitions, with Corresponding DO-178B Tool Categories and Definitions

Criteria 1: A tool whose output is part of the resulting software and thus could insert an error. (DO-178B equivalent: development tool, a tool whose output is part of airborne software and thus can introduce errors.)

Criteria 2: A tool that automates verification process(es) and thus could fail to detect an error, and whose output is used to justify the elimination or reduction of:

  • Verification process(es) other than that automated by the tool, or

  • Development process(es) that could have an impact on the airborne software

Criteria 3: A tool that, within the scope of its intended use, could fail to detect an error.

(DO-178B equivalent for both Criteria 2 and Criteria 3: verification tool, a tool that cannot introduce errors but may fail to detect them.)

Sources: RTCA DO-178C, Software Considerations in Airborne Systems and Equipment Certification, RTCA, Inc., Washington, DC, December 2011; RTCA DO-178B, Software Considerations in Airborne Systems and Equipment Certification, RTCA, Inc., Washington, DC, December 1992.

Based on the three criteria and the level of the software that the tool supports, a tool qualification level (TQL) is assigned. The TQL determines the amount of rigor required during the qualification process. There are a total of five TQLs; TQL-1 requires the most rigor and TQL-5 the least. Table 13.3 shows the TQL for each tool criteria and software level. Table 13.4 shows the correlation between the DO-178B software levels approach and the DO-178C TQLs. The intent of DO-178C and DO-330 is to make the tool qualification criteria clearer than what was included in DO-178B, not to change what is required. Therefore, in most cases, tools qualified under DO-178B will also be acceptable under DO-178C.

Table 13.3 Tool Qualification Level

Software Level Criteria 1 Criteria 2 Criteria 3
A TQL-1 TQL-4 TQL-5
B TQL-2 TQL-4 TQL-5
C TQL-3 TQL-5 TQL-5
D TQL-4 TQL-5 TQL-5

Source: RTCA DO-178C, Software Considerations in Airborne Systems and Equipment Certification, RTCA, Inc., Washington, DC, December 2011. DO-178C Table 12-2 used with permission from RTCA, Inc.
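Table 13.3 is a pure lookup, so it can be captured directly in code. The sketch below (function and table names are illustrative, not from DO-330) pairs the tool criteria with the level of software the tool supports:

```python
# TQL assignment per DO-178C Table 12-2 (Table 13.3 above):
# software level -> {criteria: TQL}
TQL_TABLE = {
    "A": {1: "TQL-1", 2: "TQL-4", 3: "TQL-5"},
    "B": {1: "TQL-2", 2: "TQL-4", 3: "TQL-5"},
    "C": {1: "TQL-3", 2: "TQL-5", 3: "TQL-5"},
    "D": {1: "TQL-4", 2: "TQL-5", 3: "TQL-5"},
}

def tool_qualification_level(software_level: str, criteria: int) -> str:
    """Return the TQL for a tool, given the level (A-D) of the
    software it supports and its tool criteria (1-3)."""
    return TQL_TABLE[software_level.upper()][criteria]
```

For instance, an autocode generator (Criteria 1) used on Level B software is assigned TQL-2, while a coverage analyzer (Criteria 3) on the same software is assigned TQL-5.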

Table 13.4 Correlation between DO-178B and DO-178C Levels

DO-178B Tool Qualification Type DO-178B Software Level DO-178C TQL
Development A TQL-1
Development B TQL-2
Development C TQL-3
Development D TQL-4
Verification All TQL-4a or TQL-5b

Source: RTCA DO-330, Software Tool Qualification Considerations, RTCA, Inc., Washington, DC, December 2011. DO-330 Table D-1 used with permission from RTCA, Inc.

a. TQL-4 applies for Criteria 2 tools used on Level A and B software.

b. TQL-5 applies for Criteria 2 tools used on Level C and D software, as well as Criteria 3 tools.

13.3 Qualifying a Tool (DO-330 Overview)

Once the TQL is established using DO-178C section 12.2, DO-330 provides guidance for qualifying the tool. This section explains why DO-330 is needed and the DO-330 tool qualification process.

13.3.1 Need for DO-330

Over the last few years, the number and types of tools have grown. Furthermore, tools are often developed by third-party vendors who may have little understanding of DO-178C. DO-330 was developed to provide tool-specific guidance for both tool developers and users. The document focuses on tools used in the software life cycle. However, it may also be applied to other domains, such as programmable hardware, databases, and systems. How the document is applied to a given domain is dictated by that domain’s guidance (e.g., DO-178C, DO-254 [3], DO-200A [4]).

The SC-205/WG-71 committee debated quite vigorously whether or not a separate tool qualification document was needed. There are advantages and disadvantages to generating such a document. Table 13.5 summarizes some of the pros and cons. These and others were considered during the committee’s deliberations. In the end, it was decided that the benefits of a stand-alone document outweighed the consequences and that a stand-alone document was the best long-term solution for the aviation industry as a whole.

While developing DO-330, the SC-205/WG-71 committee kept the following goals in mind:

  • Maintain the DO-178B approach for traditional verification tools (as much as possible).

  • Develop an approach that will support emerging tool technologies.

  • Provide an approach that enables reuse of tool qualification credit on multiple projects.

    Table 13.5 Pros and Cons of a Stand-Alone Tool Qualification Document

    Pros for a Stand-Alone Document:

    • For many projects there are actually more lines of code for the tools than there are for the airborne software; therefore, specific guidance is needed.

    • Tools form a separate domain from the airborne software. A stand-alone document allows the tool-specific needs to be addressed head-on rather than in generalities. This prevents misinterpretations.

    • A stand-alone document provides guidance specifically for tool developers, who may have little to no understanding of DO-178C.

    • Since other domains (such as programmable hardware, systems, and aeronautical databases) also use tools, a stand-alone document can benefit more than just the software domain.

    Cons for a Stand-Alone Document:

    • There is considerable redundancy between the tool qualification document (DO-330) and DO-178C.

    • The tool qualification guidance is difficult to apply when tools are developed using nontraditional technologies (such as model-based development, object-oriented technology, or formal methods), since the technology supplements explain how they apply to DO-178C and DO-278A but not to DO-330.

    • Another document leads to challenges when maintaining the guidance, particularly when there is overlap between the document and DO-178C.

    • An extra document may lead to additional work for developers, particularly those who only qualify lower-level tools (verification tools in DO-178B or TQL-5 tools in DO-178C).

  • Identify general tool user and tool developer roles (in order to address integration of a tool into the development environment, to support reuse, and to help with integration of commercial off-the-shelf [COTS] tools).

  • Develop an approach that is clear but flexible for tool developers and users.

  • Develop an objective-based approach (to help with reuse and flexibility).

  • Provide an approach that may be adopted by multiple domains.

13.3.2 DO-330 Tool Qualification Process

Once the need to qualify and the applicable TQL are determined (using DO-178C section 12.2), DO-330 provides the guidance and objectives for qualifying the tool. DO-330 is organized similarly to DO-178C. The tool development goes through a life cycle, just like the airborne software. The tool life cycle processes include planning, requirements, design, verification, configuration management, quality assurance, and qualification liaison. Table 13.6 shows the DO-178C and DO-330 tables of contents side by side, illustrating the parallel organization. DO-178C was used as the foundation for DO-330; the content was then modified to make it tool-specific.

Because DO-178C and DO-330 are similar and DO-178C was examined in depth in Chapters 5 through 12, this section emphasizes 18 primary differences between DO-178C and DO-330.

Difference 1: Different domains. As already mentioned, DO-178C applies to the airborne software domain, whereas DO-330 applies to the tools that may be used in the software life cycle. Additionally, DO-330 may be used to qualify tools used in other domains (such as communication, navigation, surveillance, and air traffic management [CNS/ATM] software, systems, and electronic hardware).

Difference 2: Introductory sections are different. DO-178C (section 2) includes an overview of the systems process and the software levels. DO-330 (sections 2 and 3) explains the purpose and characteristics of tool qualification. The information in DO-330 sections 2 and 3 is similar to what has already been discussed in this chapter.

Difference 3: DO-330 combines the life cycle process and planning sections. DO-178C has separate sections for software life cycle and planning (sections 3 and 4). However, DO-330 combines these subjects into a single section (section 4) since the life cycle plays a significant role in the planning process.

Difference 4: DO-330 adds the word tool to life cycle processes and data. In order to distinguish the tool development process from the airborne software development processes, the term tool is added where appropriate. For example, source code in DO-178C is called tool source code in DO-330.

Table 13.6 Comparison of DO-178C and DO-330 Table of Contents

DO-178C Table of Contents [2] DO-330 Table of Contents [1]
  1. Introduction

  2. System aspects relating to software development

  3. Software life cycle

  4. Software planning process

  5. Software development processes

  6. Software verification processes

  7. Software configuration management process

  8. Software quality assurance process

  9. Certification liaison process

  10. Overview of certification process

  11. Software life cycle data

  12. Additional considerations

Annex A—Process objectives and outputs by software level

Annex B—Acronyms and glossary of terms

Appendix A—Background of DO-178 document

Appendix B—Committee membership

  1. Introduction

  2. Purpose of tool qualification

  3. Characteristics of tool qualification

  4. Tool qualification planning process

  5. Tool development life cycle and processes

  6. Tool verification processes

  7. Tool configuration management process

  8. Tool quality assurance process

  9. Tool qualification liaison process

  10. Tool qualification data

  11. Additional considerations for tool qualification

Annex A—Tool qualification objectives

Annex B—Acronyms and glossary of terms

Appendix A—Membership list

Appendix B—Example of determination of applicable tool qualification levels

Appendix C—Frequently asked questions related to tool qualification for all domains

Appendix D—Frequently asked questions related to tool qualification for airborne software and CNS/ATM software domains

Difference 5: DO-330 enhances the information required in the PSAC and SAS when qualified tools are used. DO-178C sections 11.1 and 11.20 identify the expected contents of the Plan for Software Aspects of Certification (PSAC) and Software Accomplishment Summary (SAS). DO-330 sections 10.1.1 and 10.1.16 identify additional information that should be included in the PSAC and SAS when qualified tools are to be used in the software life cycle processes. Per DO-330 section 10.1.1, the following information should be added to the PSAC for each qualified tool that will be used [1]:

  • Identify the tool and its intended use in the software life cycle process.

  • Explain credit sought for the tool (i.e., explain what processes or objectives it eliminates, reduces, or automates).

  • Justify the maturity of the technology automated by the tool (e.g., if the tool is used to perform formal proofs, one needs to ensure the maturity of the approach in general before automating it).

  • Propose the TQL, along with rationale for why that TQL is adequate.

  • Identify tool developer or source.

  • Explain division of roles and responsibilities for the tool qualification. In particular, explain what the tool user and tool developer are doing and who is satisfying which DO-330 objectives.

  • Explain the Tool Operational Requirements (TORs) development, tool integration, and tool operational validation and verification processes.

  • Identify the intended tool operational environment and confirm that it is representative of the environment used during tool verification.

  • Explain any additional considerations for the tool (e.g., reuse, COTS usage, service history).

  • Reference the Tool Qualification Plan (TQP) where additional information can be found on the tool qualification details (for TQL-1–TQL-4). (For TQL-5, a TQP is not required.)

DO-330 section 10.1.16 also explains additional information that should be included in the SAS when utilizing qualified tools in the software life cycle. Table 13.7 summarizes the additional items for TQL-1–TQL-4 and TQL-5.

Difference 6: DO-330 distinguishes tool operation from tool development. DO-178B did not distinguish these, which led to some confusion and limited reuse. In order to promote tool reuse and clarity, DO-330 provides guidance for tool development and verification (performed by a tool developer) and tool operation (performed by a tool user). However, it gives each project the flexibility to assign roles and responsibilities appropriate for the situation. In general, DO-330 Annex A Tables T-0 and T-10 objectives primarily apply to the tool user/operator, whereas the DO-330 Annex A Tables T-1 through T-9 objectives mainly apply to the tool developer.

Table 13.7 Additional Inclusions in the SAS When Using Qualified Tools

TQL-1–TQL-4 TQL-5
  • Tool identification (the specific part number or other identifier).

  • Details of the credit sought for the tool (explanation of what processes and/or objectives it eliminates, reduces, or automates).

  • Reference to the Tool Accomplishment Summary where additional information can be found on the tool qualification details.

  • Statement confirming that the tool development, verification, and integral processes comply with the tool plans, including the tool user activities.

  • Summary of all open tool problem reports along with an analysis to ensure that the behavior of the tool complies with the TORs.

  • Summary of any variance in tool usage from what was explained in the PSAC, if applicable.

  • Tool identification (the specific part number or other identifier).

  • Details of the credit sought for the tool (explanation of what processes and/or objectives it eliminates, reduces, or automates).

  • Listing of or reference to tool qualification data (including the version information).

Source: RTCA DO-330, Software Tool Qualification Considerations, RTCA, Inc., Washington, DC, December 2011.

Difference 7: DO-330 guidance requires operational requirements in order to document the user’s needs. In DO-178C the system requirements drive the high-level software requirements. However, for a tool, system requirements don’t exist; instead, TORs drive the tool requirements. The TORs identify how the tool is used within the specific software life cycle process. Multiple users of a common tool could use it differently; therefore, the TORs are user specific. The TOR document includes a description of the following [1]:

  • The context of the tool usage in the software life cycle, interfaces with other tools, and integration of the tool output files into the airborne software.

  • The operational environment(s) where the tool will be used.

  • The input to the tool, including file format, language description, etc.

  • The format and content summary of the tool’s output files.

  • Functional tool requirements.

  • Requirements for abnormal activation modes or inconsistency inputs that should be detected by the tool (needed for TQL-1–TQL-4, but not required for TQL-5).

  • Performance requirements (the behavior of the tool output).

  • User information that explains how to properly use the tool.

  • Explanation of how to operate the tool (including selected options, parameters values, command line, etc.).

Difference 8: DO-330’s requirements types differ. DO-178C identifies system requirements, high-level software requirements, and low-level software requirements. DO-330 identifies TORs (same basic level as DO-178C’s system requirements), tool requirements (same intent as DO-178C’s software high-level requirements), and low-level tool requirements (same level of granularity as DO-178C’s low-level software requirements). In some situations three levels of requirements may not be needed. For example, the TORs and tool requirements might be a single set of requirements, or the tool requirements and the low-level tool requirements might be one level of requirements.* Even if levels of requirements are merged, all of the applicable objectives of DO-330 need to be satisfied. As with DO-178C, bidirectional traceability between the various levels of tool requirements is needed.
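DO-330 states the bidirectional traceability objective without prescribing a representation. A minimal sketch of such a check (the dict-based data model and the TOR/TR identifiers are hypothetical, for illustration only):

```python
def trace_gaps(down: dict, up: dict):
    """down: parent requirement ID -> set of child IDs (e.g., TOR -> tool
    requirements); up: child ID -> set of parent IDs. Returns (parents
    with no children, children with no parents); both lists must be
    empty for full bidirectional traceability."""
    orphan_parents = sorted(p for p, kids in down.items() if not kids)
    orphan_children = sorted(c for c, folks in up.items() if not folks)
    return orphan_parents, orphan_children

# Hypothetical example: TOR-2 is not decomposed into any tool
# requirement, and TR-3 has no TOR parent (a derived requirement,
# which would need justification per Difference 10).
down = {"TOR-1": {"TR-1", "TR-2"}, "TOR-2": set()}
up = {"TR-1": {"TOR-1"}, "TR-2": {"TOR-1"}, "TR-3": set()}
```

Running `trace_gaps(down, up)` on this example flags `TOR-2` as untraced downward and `TR-3` as untraced upward.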

Difference 9: DO-330 requires validation of the TORs. In DO-178C, the system requirements that come into the software process are validated (using an ARP4754A-compliant process); therefore, DO-178C doesn’t require requirements validation. However, TORs are not validated by the system process; therefore, they must be validated for correctness and completeness.

Difference 10: Per DO-330, derived requirements for a tool are evaluated for impact on tool functionality and TORs, rather than by the safety assessment process. Because tools do not fly, their impact on the safety assessment process is indirect. Instead, the impact of derived requirements on the software life cycle, expected functionality, and TORs is evaluated. As with DO-178C, all derived requirements need to be justified. Users may not know the details of the tool design; therefore, the justification for derived requirements helps them properly assess the impact on their software life cycle. The justifications should be written with the user in mind.

Difference 11: DO-330 uses the concept of tool operational environment instead of target computer. DO-330 defines the tool operational environment as: “The environment and life cycle process context in which the tool is used. It includes workstation, operating system, and external dependencies, for example interfaces to other tools and manual process(es)” [1]. Tools are normally launched on desktop computers rather than embedded in the avionics. Therefore, the concept of a target computer (which is used in DO-178C) does not fit well in the qualification process. Since tools are often developed to be utilized in multiple operational environments, identification of each operational environment is important.

Difference 12: DO-330 gives more flexibility for structural coverage approaches and focuses on identifying unintended functionality. Structural coverage was probably the most controversial subject during the SC-205/WG-71 deliberations on DO-330. Modified condition/decision coverage (MC/DC) was originally not included in DO-330, since the tools themselves do not fly and historical data indicate that MC/DC has limited value for tools. Decision coverage and statement coverage were included. However, the MC/DC objective was added (in addition to decision coverage and statement coverage objectives) to obtain committee consensus. Per DO-330, statement coverage applies for TQL-3; statement and decision coverage apply for TQL-2; and statement coverage, decision coverage, and MC/DC apply for TQL-1. However, the wording in DO-330 emphasizes that the purpose of structural coverage is to identify unintended functionality; this opens the door for alternatives to MC/DC (as long as those alternatives meet the same intent as MC/DC). Another difference related to structural coverage is that DO-330 does not require source-to-object code traceability analysis for the TQL-1 tool code as is required for DO-178C Level A software.
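To see why MC/DC demands more than decision coverage, consider a decision with three conditions. The sketch below (illustrative only; the decision is an arbitrary example) searches for MC/DC “independence pairs”: test vectors that differ in exactly one condition and flip the decision outcome, demonstrating that the condition independently affects the result:

```python
from itertools import product

def decision(a, b, c):
    # Example decision with three conditions (not from any standard)
    return a and (b or c)

def independence_pairs(cond_index):
    """Return pairs of test vectors that differ only in the condition
    at cond_index and produce different decision outcomes -- the core
    MC/DC requirement for showing a condition's independent effect."""
    pairs = []
    for vec in product([False, True], repeat=3):
        flipped = list(vec)
        flipped[cond_index] = not flipped[cond_index]
        if decision(*vec) != decision(*flipped):
            pairs.append((vec, tuple(flipped)))
    return pairs

# Every condition must have at least one independence pair. A minimal
# MC/DC test set for n conditions needs only about n+1 vectors, versus
# 2**n for exhaustive testing, yet still exposes masked conditions
# that decision coverage alone would miss.
for i, name in enumerate("ABC"):
    assert independence_pairs(i), f"no independence pair for condition {name}"
```

For condition B here, only the pair (T,T,F)/(T,F,F) demonstrates independence; a decision-coverage test set could easily omit both vectors and still pass.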

Difference 13: DO-330 addresses interfaces with external components. External components are “components of the tool software that are outside the control of the developer of the tool. Examples include primitive functions provided by the operating system or compiler run-time library, or functions provided by a COTS or open source software library” [1]. DO-330 includes objectives and activities to carry out the following [1]:

  • Identify interfaces with external components in the design (DO-330 section 5.2.2.2.g).

  • Verify the correctness and completeness of the interfaces (DO-330 section 6.1.3.3.e).

  • Ensure correct integration of interfaces to external components (DO-330 section 6.1.3.5).

  • Ensure that interfaces to external components are exercised through requirements-based testing (DO-330 section 6.1.4.3.2).

Difference 14: DO-330 requires a tool installation report. The tool installation report is generated during the operational integration phase to confirm the tool’s integration into its operational environment. The installation report identifies (1) the configuration of the operational environment, (2) the version of the tool’s executable object code and any supporting configuration files, (3) any external components, and (4) how to execute the tool [1].

Difference 15: DO-330 life cycle data are tool-specific. Most of the tool life cycle data required by DO-330 are similar to what is required by DO-178C for airborne software or DO-278A for CNS/ATM software. However, the titles are tool-specific and the recommended content is focused on the needs of the tool qualification. DO-330 section 10 describes the tool life cycle data and DO-330 Annex A objectives tables identify what data are required for each TQL. DO-330 section 10 divides the data into three categories, each identified in a different subsection. The three categories and the data for each category are shown in Table 13.8.

Difference 16: DO-330 objectives tables differ somewhat from the DO-178C objectives. The objectives tables in DO-330 Annex A are similar to the ones in DO-178C Annex A. However, there are some differences. One notable difference is that the DO-330 Annex A tables are numbered T-x rather than A-x, to distinguish them from DO-178C. Another general difference is that DO-330 has 11 Annex A tables instead of 10, as in DO-178C. The titles of the DO-330 Annex A tables are listed here, followed by a summary of the primary differences between the DO-330 and DO-178C Annex A tables:

  • Table T-0: Tool operational processes

  • Table T-1: Tool planning process

  • Table T-2: Tool development processes

Table 13.8 Tool Qualification Data in DO-330


  • Table T-3: Verification of outputs of tool requirements processes

  • Table T-4: Verification of outputs of tool design process

  • Table T-5: Verification of outputs of tool coding and integration processes

  • Table T-6: Testing of outputs of integration process

  • Table T-7: Verification of outputs of tool testing

  • Table T-8: Tool configuration management process

  • Table T-9: Tool quality assurance process

  • Table T-10: Tool qualification liaison process

Table T-0 is a DO-330 tool-specific table that does not have a DO-178C equivalent. It includes seven objectives for four tool processes to address the tool operation (the user’s perspective). The processes and objectives are summarized in Table 13.9.

DO-330 Tables T-1 through T-10 are similar to DO-178C Tables A-1 through A-10, with the following significant exceptions:

  • DO-330 Table T-2 objective 8 states: “Tool is installed in the tool verification environment(s)” [1]. This objective is unique to tools, since the tool verification environment may differ from the operational environment. Additionally, if there are multiple verification environments, the tool will need to be integrated into each environment during the qualification effort.

  • DO-330 Table T-3 objective 3 states: “Requirements for compatibility with the tool operational environment are defined” [1]. This objective focuses on operational environment compatibility instead of target computer environment, as previously noted in Difference #11.

    Table 13.9 Summary of DO-330 Table T-0 Processes and Objectives

    Table T-0 Process | Table T-0 Objective(s)

    Planning:

    1. “The tool qualification need is established.”

    Tool Operational Requirements development:

    2. “Tool Operational Requirements are defined.”

    Tool operational integration:

    3. “Tool Executable Object Code is installed in the tool operational environment.”

    Tool operational validation and verification:

    4. “Tool Operational Requirements are complete, accurate, verifiable, and consistent.”

    5. “Tool operation complies with the Tool Operational Requirements.”

    6. “Tool Operational Requirements are sufficient and correct.”

    7. “Software life cycle process needs are met by the tool.”

    Source: RTCA DO-330, Software Tool Qualification Considerations, RTCA, Inc., Washington, DC, December 2011.

  • DO-330 Table T-3 objective 4 states: “Tool requirements define the behavior of the tool in response to error conditions” [1]. This objective is unique for tools and focuses on ensuring the tools are robust and respond predictably to errors.

  • DO-330 Table T-3 objective 5 states: “Tool Requirements define user instructions and error messages” [1]. Again, this is a tool-specific objective. It ensures that the tool has considered the user’s perspective.

  • DO-330 Table T-4 objective 11 states: “External component interface is correctly and completely defined” [1]. This tool-specific objective is intended to verify the external component interface that was previously discussed; see Difference #13.

  • DO-330 Table T-7 objective 5 states: “Analysis of requirements-based testing of external components is achieved” [1]. There is no equivalent objective in DO-178C. As noted in Difference #13 earlier, this objective is intended to confirm that the requirements-based testing has exercised the tool’s interfaces to external components.

  • DO-330 Table T-10 (all objectives) focuses on tool qualification rather than on certification.

Difference 17: TQL-5 in DO-330 clarifies what was required for verification tools in DO-178B. DO-330 is intended to require the same amount of rigor for qualification of TQL-5 tools as was required for verification tools in DO-178B. However, the DO-330 criteria clarify the DO-178B expectations in order to ensure compliance and to support tool reuse. DO-330 frequently asked question (FAQ) D.6 summarizes the DO-178B verification tool to DO-330 TQL-5 differences this way:

This document [DO-330] provides more accurate and complete guidance for tools at TQL-5 than DO-178B (and hence DO-278) did for verification tools. The intent is not to ask for more activities or more data (for example, the qualification does not require any data from the tool development process). However, it clarifies the content of the TOR, the compliance of the tool to the airborne (or CNS/ATM) software process needs, and the objectives of other integral processes applicable for TQL-5 [1].*

Some of the clarifications in DO-330 for TQL-5 that were not present for DO-178B’s verification tool criteria are summarized as follows:

  • DO-330 identifies specific objectives for TQL-5 (and all TQLs). This was not the case in DO-178B (i.e., there were no objectives for verification tool qualification in DO-178B).

  • DO-330 provides additional clarification of what is expected in the TORs.

  • DO-330 separates the TORs from the tool requirements.

  • DO-330 adds an integration objective to ensure the qualification occurs within a specific operational environment.

  • DO-330 includes objectives to ensure validation of the TORs.

Difference 18: DO-330 contains FAQs in appendices C and D. FAQs and discussion papers for DO-178C are included in DO-248C; however, tool-related FAQs are included in the DO-330 appendices. Appendix C includes domain-independent FAQs, and Appendix D provides FAQs specific to the DO-178C and DO-278A domains. Table 13.10 briefly describes each FAQ.

13.4 Special Tool Qualification Topics

There are a few special topics related to DO-330 that deserve some additional explanation.

13.4.1 FAA Order 8110.49

Chapter 9 of FAA Order 8110.49 includes some clarification of DO-178B regarding tool qualification. It is anticipated that this section of the Order will be deleted (or significantly modified) since DO-178C and DO-330 provide more comprehensive guidance than DO-178B did for tool qualification. However, there may be some additional guidance added to explain transition from DO-178B to DO-178C tool qualification criteria. Additionally, guidance for how to use DO-331, DO-332, and DO-333 supplements with DO-330 may be needed.

13.4.2 Tool Determinism

DO-178B section 12.2 included the following statement: “Only deterministic tools may be qualified, that is, tools which produce the same output for the same input data when operating in the same environment” [5].

Order 8110.49 clarifies what is meant by deterministic tools. Section 9-6.d of the Order explains that determinism is often interpreted too restrictively for tools. The Order states:

A restrictive interpretation [of determinism] is that the same apparent input necessarily leads to exactly the same output. However, a more accurate interpretation of determinism for tools is that the ability to determine correctness of the output from the tool is established. If it can be shown that all possible variations of the output from some given input are correct under any appropriate verification of that output, then the tool should be considered deterministic for the purposes of tool qualification. This results in a bounded problem. This interpretation of determinism should apply to all tools whose output may vary beyond the control of the user, but where that variation does not adversely affect the intended use (e.g., the functionality) of the output and the case for the correctness of the output is presented. However, this interpretation of determinism does not apply to tools that have an effect on the final executable image embedded into the airborne system. The generation of the final executable image should meet the restrictive interpretation of determinism [6].*

Table 13.10 Summary of DO-330 FAQs (in Appendices C and D)


DO-330 has the same intent as DO-178B and Order 8110.49; however, the words deterministic and determinism are avoided because they carry specific meaning in software engineering that may be overly restrictive for tools which do not fly. Instead, the following statement was made to explain the expectations:

During the qualification effort, the output of all qualified tool functions should be shown to be correct using the objectives of this document [DO-330]. For a tool whose output may vary within expectations, it should be shown that the variation does not adversely affect the intended use of the output and that the correctness of the output can be established. However, for a tool whose output is part of the software and thus could insert an error, it should be demonstrated that qualified tool functions produce the same output for the same input data when operating in the same environment [1].*
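This broader interpretation can be illustrated with a small sketch (the tool function, rule names, and counts below are hypothetical, not drawn from DO-330 or the Order): a report whose line order may vary from run to run can still be verified as correct by checking its content rather than its ordering.

```python
def generate_report(findings):
    """Hypothetical tool function. The input is a set of
    (rule, count) pairs; set traversal order is not guaranteed,
    so the report's line order may vary between runs."""
    return [f"{rule}: {count}" for rule, count in findings]

def verify_report(report, findings):
    """Output verification that is insensitive to line ordering:
    every expected finding appears exactly once, nothing extra.
    Under this check, the tool is 'deterministic' in the broader
    sense: all possible output orderings are shown correct."""
    expected = {f"{rule}: {count}" for rule, count in findings}
    return set(report) == expected and len(report) == len(expected)

findings = {("Rule 10.1", 3), ("Rule 12.4", 0), ("Rule 14.7", 1)}
report = generate_report(findings)
assert verify_report(report, findings)  # correct under any ordering
```

A tool whose output becomes part of the final executable image would not get this latitude; per the quoted guidance, it must meet the restrictive interpretation (same input, same environment, same output).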

13.4.3 Additional Tool Qualification Considerations

DO-330 section 11 includes guidance for several additional considerations that may be encountered when developing or using qualified tools. The additional considerations are briefly summarized here.

Additional Consideration 1: Multifunction tools: The name is self-descriptive: a multifunction tool is a single tool that performs multiple functions. The multiple functions may reside in a single executable file, in multiple executable files (which allows certain functions to be disabled), or in some other arrangement that allows tool functionality to be selected or disabled [1]. The DO-330 section 11.1 guidance explains that multifunction tools should be explained in the plans and that a TQL needs to be established for each function. If there is a mixture of TQLs, the functions either need to be developed to the highest TQL or separated to ensure that lower TQL functions don’t impact higher TQL functions (i.e., protection). If functions will not be used, they need to be disabled, and assurance is needed that the disabling mechanism sufficiently protects the enabled functions from the disabled ones. If unused functions cannot be disabled, they need to be developed to the appropriate TQL (usually the highest TQL of the tool). For tools that have a role in both development and verification processes, the independence of the development and verification functions must be considered (e.g., they might be developed by independent teams).
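One way to picture a disabling mechanism for a multifunction tool is a dispatch table that simply omits disabled functions, so they cannot be invoked at all. Everything in this sketch (function names, TQL assignments, the mechanism itself) is an illustrative assumption, not DO-330 guidance; an actual mechanism and its sufficiency would need to be justified in the plans.

```python
def generate_scheduler_code(spec):
    """Development function (would typically carry a higher TQL)."""
    return f"generated code for {spec}"

def check_structural_coverage(results):
    """Verification function (e.g., TQL-5)."""
    return all(results)

# Only the verification function is enabled in this installation;
# the development function is absent from the dispatch table.
ENABLED_FUNCTIONS = {"check_structural_coverage": check_structural_coverage}

def run_tool(function_name, *args):
    """Entry point: refuses any function not in the enabled set."""
    if function_name not in ENABLED_FUNCTIONS:
        raise PermissionError(
            f"'{function_name}' is disabled in this installation")
    return ENABLED_FUNCTIONS[function_name](*args)
```

With this arrangement, `run_tool("check_structural_coverage", ...)` succeeds, while any attempt to invoke the disabled development function raises an error.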

Additional Consideration 2: Previously qualified tools: DO-330 section 11.2 provides guidance for reusing tools that were previously qualified. It considers reuse in the following scenarios [1]:

  • The tool and its operational environment are unchanged.

  • The tool is unchanged but will be installed in a different operational environment.

  • The tool is changed but will be installed in the same operational environment.

  • The tool is changed and will be installed in a different operational environment.

The potential for reuse is significantly improved when a tool is designed and packaged to be reusable. DO-330 FAQ C.3 provides some suggestions for how to develop and package a tool for maximum reuse. Reusability does usually require more planning, robustness, and testing, but it can save significant resources down the road.

Additional Consideration 3: Qualifying COTS tools: DO-330 section 11.3 provides guidance to successfully qualify a COTS tool. COTS tools at TQL-5 are relatively easy to qualify because no insight into the tool development is needed. However, TQL-1–TQL-4 tools require insight into the tool development itself. Essentially, COTS tools still need to comply with the DO-330 objectives.

DO-330 section 11.3 explains what is typically expected of the tool developer and the tool user when qualifying a COTS tool. It is important to remember that tools are qualified in conjunction with the DO-178C software approval; that is, tool qualification is not a stand-alone approval. DO-330 attempts to make tool qualification as stand-alone as possible, but the actual qualification is only received in the context of a certification project. For COTS tools, DO-330 section 11.3 explains what objectives the tool developer is typically responsible for, as well as some suggestions for how to package the life cycle data so that it can be more easily rolled into the tool user’s project. Suggestions include the development of a developer-TOR, developer-TQP, developer-Tool Configuration Index, and developer-Tool Accomplishment Summary, which the tool user can evaluate and either use directly or reference from their own tool qualification data. The concept is that the tool developer anticipates the user’s needs and proactively develops the tool qualification data to meet those needs.

Additional Consideration 4: Tool service history: DO-330 section 11.4 explains the process if one decides to use service history for a tool. Service history may be feasible for a tool with considerable in-service experience but that has some missing life cycle data. Service history may also be used to raise the TQL of a tool, supplement the tool qualification data, or increase confidence in TOR compliance. As with service history for airborne software (see Chapter 24), making a service history case is quite challenging and will likely be met with skepticism from the certification authority.

Additional Consideration 5: Alternative methods for tool qualification: DO-330 section 11.5 explains that an applicant may propose alternate methods to those described in DO-330. Any alternative must be explained in the plans, thoroughly justified, assessed for impact on the resulting software’s life cycle processes and life cycle data, and coordinated with the certification authority.

13.4.4 Tool Qualification Pitfalls

Having assessed dozens of tools over the years, I’ve observed some common pitfalls. Each project differs and some of these challenges are more prevalent than others. However, they are provided here for your awareness. To be forewarned is to be forearmed.

Pitfall 1: Missing user instructions. Tools are designed to be used, typically by a team that did not develop the tool. However, user instructions are often missing or inadequate. This makes it uncertain whether the tool will be used as intended.

Pitfall 2: Tool version not specified. Sometimes there are multiple versions of a tool, and the version being used must be clearly identified. This is typically documented in the SLECI or Software Configuration Index. It should also be confirmed that the team is actually using the identified version.
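A check like this can be automated; the sketch below assumes a hypothetical tool name, command-line flag, and recorded version (DO-330 does not prescribe any particular mechanism, and real tools report versions in many formats).

```python
import re
import subprocess

# Versions as recorded in the SLECI (tool name and version are
# illustrative assumptions).
SLECI_VERSIONS = {"acme-codegen": "4.2.1"}

def parse_version(version_output):
    """Extract a dotted version number from a tool's version banner."""
    match = re.search(r"\d+\.\d+(?:\.\d+)?", version_output)
    return match.group(0) if match else None

def check_tool_version(tool):
    """Raise if the installed tool does not match the SLECI entry.
    Assumes the tool supports a --version flag."""
    output = subprocess.run([tool, "--version"],
                            capture_output=True, text=True).stdout
    actual = parse_version(output)
    expected = SLECI_VERSIONS[tool]
    if actual != expected:
        raise RuntimeError(
            f"{tool}: SLECI records {expected}, but {actual} is installed")
```

Running such a check in the build environment helps confirm that the team is actually using the identified version rather than whatever happens to be installed.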

Pitfall 3: Configuration data not included in tool qualification package. Many tools require some configuration data. For example, a tool that verifies conformance to Motor Industry Software Reliability Association (MISRA) C standard may only be activated to check a subset of the rules. The specific rules are enabled or disabled in a configuration file. The configuration file needs to be under configuration management and identified in the qualification data.
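As an illustration, here is a sketch of one way to tie the configuration file in use back to the qualification baseline, by comparing its digest against a value recorded in the qualification data (for example, in a Tool Configuration Index). The file names, contents, and the digest mechanism itself are hypothetical assumptions.

```python
import hashlib

def config_digest(path):
    """SHA-256 digest of a configuration file, for comparison against
    the digest recorded in the tool qualification data."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def check_config(path, recorded_digest):
    """Raise if the in-use configuration differs from the qualified
    baseline (i.e., a different set of rules may be enabled)."""
    actual = config_digest(path)
    if actual != recorded_digest:
        raise RuntimeError(
            f"{path}: digest {actual} does not match the qualified baseline")
```

The point of the sketch is simply that the configuration data is part of the qualified configuration: it must be under configuration management and identifiable, not just the tool executable itself.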

Pitfall 4: Need for tool qualification not accurately assessed. Occasionally, when reviewing a team’s data, I come across a tool that was not listed in the PSAC because the team didn’t think it needed to be qualified; however, it turns out that qualification is needed. Late discoveries of this sort lead to unplanned work. As explained in Chapter 5, to avoid this pitfall it is recommended that all tools be listed in the PSAC, along with a justification for why they do or do not need to be qualified. This allows the project and the certification authority to reach agreement early on, rather than late in the project.

Pitfall 5: TORs or tool requirements not verifiable. Some of the worst requirements I’ve encountered were tool requirements. Since TORs must be verified for all TQLs and tool requirements are verified for TQL-1–TQL-4, the requirements must be verifiable. The TORs and tool requirements should have the same quality attributes discussed in Chapters 2 and 6.

Pitfall 6: Incomplete COTS tool qualification data. Several COTS tool vendors offer tool qualification packages. Some of them are great, and some of them are not so great. Caution should be used when investing in a tool qualification package—consider getting some kind of guarantee along with the tool data.

Pitfall 7: Overreliance on tools without adequate understanding of tool functionality. Some software teams, particularly inexperienced teams, rely on the tools without really understanding why they are used, what they do, or how they fit into the overall software life cycle. Tools can be terrific additions to the engineering process. However, they must be understood. As one of my friends often says, “A fool with a tool is still a fool.” Users need to understand what the tool does for them.

Pitfall 8: PSAC and SAS do not explain the tool qualification approach. Oftentimes, the PSAC states that a tool needs to be qualified, but does not go beyond that. For TQL-1–TQL-4 tools, the PSAC should summarize the tool’s use and reference its TQP. For TQL-5 tools, the qualification approach should be explained in the PSAC or in a TQP. Likewise, the SAS frequently provides inadequate details about the tool and its qualification data. See Section 13.3.2, Difference #5, of this chapter for a summary of what tool information should go in the PSAC and SAS.

Pitfall 9: Team spends more time on the tool development than on the software that uses it. Many engineers enjoy developing tools; sometimes they get carried away and focus so much on the tool that they lose sight of the purpose for the tool.

Pitfall 10: The role of the tool(s) in the overall life cycle not explained in the plans. Oftentimes, the descriptions of the development, verification, configuration management, and quality assurance processes fail to mention the tools used in those processes. The plans list the tools in a tool section but do not explain how the tools are used in the life cycle phases (i.e., when the processes themselves are discussed). For example, a requirements management tool may be explained in a tool section but not mentioned in the requirements capture and verification sections of the plans. This makes it difficult to ensure that the tools are properly utilized in the life cycle.

Pitfall 11: Qualification credit not clearly identified. When tools replace, automate, or eliminate DO-178C (and/or the supplements) objectives, it should be clear what objectives they impact. This should be specified in the PSAC, as well as the software development plan, software verification plan, and/or software configuration management plan where the tool’s use is explained. Additionally, for TQL-1–TQL-4 tools the TQP may explain this.

Pitfall 12: Tool operational environment not clearly identified. As mentioned earlier (Section 13.3.2, Differences #11 and #14), it is important to ensure that the operational environment assumed during the qualification is representative of the environment used in operation. This is commonly missed in the SLECI and plans.

Pitfall 13: Tool code not archived. Since tools are part of the software development and verification environment, they should be archived along with other software life cycle data. For some tools that require special hardware, the hardware may also need to be archived.

13.4.5 DO-330 and DO-178C Supplements

The model-based development (DO-331), object-oriented technology (DO-332), and formal methods (DO-333) supplements apply DO-330 the same as DO-178C does. Basically, tool qualification is the same regardless of the technology that utilizes the tool. However, if a tool is implemented using model-based development, object-oriented technology, or formal methods, the technology supplements would likely need to be applied to the tool development and verification.

13.4.6 Using DO-330 for Other Domains

DO-330 was developed to be usable by multiple domains. If another domain (such as aeronautical databases or programmable hardware) chooses to use DO-330, they will need to explain how to determine the TQL for their domain, explain how to adapt software terminology in DO-330 for their domain, and clarify any domain-specific needs. DO-330 Appendix B provides an example of how the CNS/ATM domain implemented the DO-330 tool qualification document. Other domains could use a similar approach, or they could create their own approach.

References

1. RTCA DO-330, Software Tool Qualification Considerations (Washington, DC: RTCA, Inc., December 2011).

2. RTCA DO-178C, Software Considerations in Airborne Systems and Equipment Certification (Washington, DC: RTCA, Inc., December 2011).

3. RTCA/DO-254, Design Assurance Guidance for Airborne Electronic Hardware (Washington, DC: RTCA, Inc., April 2000).

4. RTCA/DO-200A, Standards for Processing Aeronautical Data (Washington, DC: RTCA, Inc., September 1998).

5. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification (Washington, DC: RTCA, Inc., December 1992).

6. Federal Aviation Administration, Software Approval Guidelines, Order 8110.49 (Change 1, September 2011).

*Co-led by yours truly.

*The various levels of tool requirements should be closely coordinated with certification authority since there are divergent opinions on this topic.

*Brackets added for clarity.

*Brackets added for clarification.
