CHAPTER 7:
VERIFY, VALIDATE & AUTHORIZE – CONDUCTING THE AUTHORIZATION

One must verify or expel his doubts, and convert them into the certainty of Yes or No.

Thomas Carlyle, Scottish Essayist, Satirist, and Historian

In this chapter:

Assessing the security controls
Developing the plan of action and milestones
Authorizing the information system operation

The previous phase ended with the implementation of a set of security controls as defined in the system security plan. This phase begins with a review of the initial SSP and the independent assessment of the security controls and ends with a risk-based decision to either authorize or deny the operation of an information system.

Figure 10: C&A process 2

During the implementation activities in the previous phase, evidence and artifacts were collected to support the authorization decision process. These artifacts will be reviewed and tested during this phase to determine if they meet the published compliance standards. In addition, actual testing of the security characteristics of the information system will be conducted.

ASSESS the security controls

In this chapter, you will gain the tools you need to prepare to verify and validate the operational, managerial, and technical controls within your risk or accreditation boundary. You will be introduced to a high-level planning process that can be used in any security control testing environment, and then presented with a methodology for the actual security control testing. Finally, the differences between several commercial and government types of control testing will be illustrated.

As the organization enters this phase, a determination should be made whether the information system is ready to be evaluated and tested. This decision will be made based on the results of the validation testing, also known simply as security testing.

The verification and validation activities should be tailored to the system life cycle activities to ensure that the tests are relevant and provide the required degree of rigor – but do not test in excess of the system’s current life cycle requirements. If a significant period of time has passed since the completion of the activities in the previous phase, or if new individuals are involved in the authorization process, the SSP should be reviewed to determine if the details about the system are still valid.

What is security control testing?

Validation testing, information systems security control testing, and information assurance control testing are essentially synonymous terms for the same action: certifying that the required security controls have been correctly and completely implemented. This testing includes the examination and analysis of both technical and nontechnical security safeguards of IT resources as they have been applied to the information system.

This process, hereafter referred to as security control testing, requires careful planning and a well-trained staff to ensure it is executed correctly. It is arguably one of the most critical steps in the authorization process. Why? There are essentially two reasons:

On a general level, emphasis is being placed by the government on security control testing. We see this emphasis on the testing and validation of security controls expanding into the private sector, with evidence of the implementation of adequate security controls being demanded by federal and DOD agencies, investors, customers and, increasingly, the rest of the Internet community. This evolution is analogous to the implementation of mandatory safety inspections of the cars driven on our highways. It is becoming essential that we demonstrate that the mechanisms that are integrated into the information superhighway are built and operated to security standards.

On a level more specific to the authorization process, security control testing provides the authorizing official (AO) or designated accrediting authority (DAA) with the information needed to make an informed risk-based decision on the authorization for operation or, in some cases, the denial of operation for a system. This includes:

• evidence about the effectiveness of security controls in organizational information systems;

• an indication of the quality of the risk management processes employed within the organization; and

• information about the strengths and weaknesses of information systems supporting critical federal missions and applications in a global environment of sophisticated threats. (NIST SP 800-53A)

There are essentially two types of security control testing: self-assessment, which we will discuss in more detail in Chapter 8, and testing by an expert third party. Expert security control testing specifically addresses security control test methods requiring a high level of technical skill and the use of professional tools and methodologies to determine the security profile of the information system. By contrast, self-assessments are more often characterized by a less technical approach to testing the security controls, but usually involve a more direct and complete knowledge of the target information system(s).

Now that you have an idea about what security control testing really is, let’s tell you what it is not. Security control testing is not about filling out a checklist or passing an audit or even just getting the information you need to complete the authorization documentation. It is all about developing a credible and significant input to the AO’s risk-based decision process.

The primary objectives of security control testing are to:

discover design, implementation, and operational flaws that might affect the confidentiality, integrity, and availability of the information and information systems;

determine the adequacy of security mechanisms, assurances, and other properties to enforce the organization’s security policy; and

assess the degree of consistency between the information system security documentation and its implementation.

What should be tested?

Ideally, security control testing should be performed on all hardware and software components to ensure that all baseline security controls are adequately addressed. However, the scope and depth of security control assessments should always be risk-driven.

The security control assessor begins by reviewing the security controls described in the security plan and the purpose of the assessment. A security control assessment could range from a complete test of all security controls associated with the information system (e.g. during security testing conducted as part of the initial authorization) to a limited assessment of specific security controls in the information system (e.g. during continuous monitoring, post accreditation, or where subsets of the controls in the information system are assessed on an ongoing basis).

When conducting a more limited assessment, the information system owner should coordinate with all of the stakeholders in the security of the information system (e.g. senior agency information security officer, mission/information owners, and/or authorizing official). Together, they should agree which security controls from the security plan should be assessed.

Selection of the security controls may also depend on the continuous monitoring schedule established by the information system owner. All security controls should be reviewed and/or tested at least once during the three-year accreditation cycle. Weaknesses listed on the plan of action and milestones must also be given adequate oversight, and controls with greater volatility should be identified and assessed more frequently.

Who executes security control testing?

The quality of the authorization process is influenced by the quality of the security control testing. Consequently, it is required that the verification and validation of the security controls be performed by an independent, objective – and most importantly – qualified third party. This is particularly important when seeking an initial authorization to operate. Once the authorization has been obtained, self-assessments and internal security control testing may be sufficient.

In this section, we will focus on the more comprehensive, independent, hands-on testing. Self-assessments will be covered in additional detail in Chapter 8, where the focus is placed on maintaining the authorization to operate.

The selection of a qualified security control assessor – whether from an internal team or an independent provider – is a critical decision for organizations seeking authorization to operate an information system. Security control assessment providers should be qualified and able to provide effective and efficient assessments; this will provide sufficiently reliable information to make decisions for authorizing information system operation.

An important note in this time of constrained personnel resources: a provider of other security services to the organization may also provide security control assessment services, as long as there is an adequate segregation of responsibilities and accountabilities. The organization’s management should ensure that different individuals are assigned to provide oversight and other security services separate from the security control assessment. They should also ensure that these personnel are not involved in any authorization-to-operate decision making, and that there is no possibility of their influencing the outcome of the assessment.

Validation testing in federal agencies

Federal agencies may have internal security control test teams who are responsible for the initial testing associated with the authorization process and for the testing required by continuous monitoring. They may also contract for independent testing through one of the many government consulting companies specializing in security controls testing.

Validation testing within DOD

While “highly recommended” for federal agencies, independent testing is a DOD mandate. Each of the DOD services has established an independent testing authority, usually reporting to the CA. In the Air Force, the certifying authority is located within the Headquarters, Air Force Communications Agency (AFCA). The Department of the Army certifying authority is centralized in the Office of the CIO/G-6, Senior Information Assurance Officer. Finally, the function of certifying authority for the Department of the Navy is assigned to the Space and Naval Warfare Systems Command (SPAWAR SYSCOM). As required, DOD also augments their security control test team with external contract support.

It is important to note that the nature of the security control assessment may differ depending on the system’s status in the authorization process and its life cycle status. Prior to authorization, the emphasis is most frequently placed on security control testing by a third-party expert – whether in the federal agencies or in the DOD. After authorization is granted, however, security control assessments are specifically targeted to ensure continuity of the system’s security status. Frequently, the security control tests become the responsibility of the system owner, but they may also be conducted by third-party experts, internal reviewers, or a combination of the two.

Security control test procedures

Security control testing can only meet the above objectives if it is a consistent and standardized process. In order to meet this ideal, both the federal government and the DOD have provided standardized guidance. This guidance is equally applicable for use in conducting the independent test and the self-assessment.

For federal agencies, this guidance can be found in NIST Special Publication 800-53A, Guide for Assessing the Security Controls in Federal Information Systems, July 2008. Below is an example of part of a validation test from this guidance document. NOTE: This is not the entire validation test, only a subset which is used here to provide an example of the comprehensiveness of the security control validation test instructions.

Table 17: Sample validation test

Rather than issuing a written publication, the DOD provides its standardized validation test guidance on the DIACAP Knowledge Service website at https://diacap.iaportal.navy.mil/. Here is an example of a DOD validation test. Unlike NIST SP 800-53A, DOD provides a description of the steps required to prepare for the test, the execution tests, and the expected results.

Table 18: DOD validation test

Security control assessment methods

Whether conducting a validation test for federal agencies or the DOD, the methods are essentially the same: examine, interview, and test. Most security controls require a combination of these methods in order to obtain the most comprehensive results.

Let’s look at these a little more closely.

Examine – “E”

Specific artifacts are usually the target of examination. The examine method is used to facilitate assessor understanding, achieve clarification, or obtain evidence.

This method may be used to observe a managerial or operational situation or possibly review a technical configuration. Examinations consist of checking, inspecting, reviewing, observing, studying, or analyzing one or more assessment objects to facilitate understanding, achieve clarification, or obtain evidence, the results of which are used to support the determination of security control effectiveness.

Typical actions might include the examination or review of information security policies, plans, and procedures; system design documentation and interface specifications; system backup operations, reviewing and analyzing the results of contingency plan exercises or drills; incident response operations or activities; security configuration settings; or technical manuals and user/administrator guides.

For example, you might examine the rules of behavior to ensure users understand their roles and responsibilities for system access. You would ascertain that all users had signed rules of behavior and that they were stored in the training file.

You might also use the examine method for testing system configurations, particularly where automated means are not applicable or possible. For example, you may examine a system configuration to ensure that the audit log is configured correctly. Then you would examine the actual logs to ensure that all actions that needed to be logged in an application were reflected in the logs.
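The two-part examination just described – checking the configuration, then confirming that required events actually appear in the logs – can be sketched as a short script. This is only an illustration: the config format, the `audit_event=` key, the event names, and the helper functions are all invented here, and a real examination would target the actual audit subsystem’s configuration and log formats.

```python
# Illustrative "examine" check: compare what the audit configuration
# claims to record against what actually shows up in the log.
# Config format, event names, and log layout are hypothetical.

REQUIRED_EVENTS = {"LOGIN_SUCCESS", "LOGIN_FAILURE", "PRIVILEGE_CHANGE"}

def configured_events(config_lines):
    """Collect event types the audit configuration claims to record."""
    events = set()
    for line in config_lines:
        line = line.strip()
        if line.startswith("audit_event="):
            events.add(line.split("=", 1)[1])
    return events

def events_seen(log_lines):
    """Collect event types that actually appear in the audit log
    (first whitespace-delimited field of each non-empty line)."""
    return {line.split()[0] for line in log_lines if line.strip()}

def examine_audit_logging(config_lines, log_lines):
    """Return two finding sets: required events missing from the config,
    and configured events that never appear in the collected logs."""
    configured = configured_events(config_lines)
    return REQUIRED_EVENTS - configured, configured - events_seen(log_lines)
```

A configured-but-never-logged event is exactly the kind of finding the second examination step exists to catch: the setting looks correct on paper while the application never emits the record.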

To reduce the level of effort in examining assessment objects, assessors should, to the maximum extent possible, reuse examination results and evidence from previous assessments. This is useful only in those cases when the results are available, when there have been no substantial security-relevant changes to the information system that could invalidate the results, and when the results are determined to still be credible.

There are three levels of examinations:

Generalized: Brief, high-level reviews, observations, or inspections of security controls using a limited body of evidence or documentation. These are typically conducted using functional-level descriptions of specifications, mechanisms, or activities.

Focused: Detailed analyses of security controls using a substantial body of evidence or documentation. These are typically conducted using functional-level descriptions of specifications, mechanisms, or activities and, where appropriate, high-level design information.

Detailed: Thorough analyses of security controls using an extensive body of evidence or documentation. These examinations are usually conducted using functional-level descriptions of specifications, mechanisms, or activities and, where appropriate, high-level design, low-level design, and implementation-related information (e.g. source code).

Interview – “I”

Individuals, or groups of individuals, are usually the target of the interview process. Interviews are usually intended to facilitate understanding, achieve clarification, or lead to the location of evidence, the results of which are used to support a determination of the level of security control implementation and of security control compliance.

Interviews can be conducted with an array of individuals, to include agency heads, chief information officers, senior agency information security officers, authorizing officials, information owners, information system owners, information system security officers, information system security managers, personnel officers, human resource managers, facilities managers, training officers, information system operators, network and system administrators, site managers, physical security officers, and users.

There are essentially three levels of interviews:

Generalized: Broad, high-level discussions with selected organizational personnel on particular topics relating to the security controls being assessed. This type of interview is most often used with a set of general, high-level questions and is intended to obtain a broad, general understanding of the fundamental concepts associated with the security controls.

Focused: Broad, high-level discussions combined with more detailed discussions in specific areas with selected organizational personnel. This type of interview is typically conducted using a set of general, high-level questions together with a set of more detailed questions in specific areas where responses indicate a need for more detailed investigation. Focused interviews are intended to capture the specific understanding of the fundamental concepts associated with the security controls.

Detailed: Broad, high-level discussions and more detailed, probing discussions in specific areas with selected organizational personnel on particular topics relating to the security controls being assessed (including the results of other assessment methods). This type of interview is typically conducted using a set of general, high-level questions together with a set of more detailed, probing questions in specific areas. Detailed interviews are used where there is a need for more detailed investigation or where assessment evidence allows, and are intended to capture the specific understanding of the fundamental concepts and implementation details associated with the security controls.

Test – “T”

Testing is most often focused on the technical or operational controls. Security control testing occurs under specified conditions and compares actual with expected behavior, the results of which are used to support the determination of security control effectiveness.

Testing is the process of actually exercising one or more assessment objects (i.e. activities or mechanisms) under specified conditions to compare actual with expected behavior. Do not confuse a test with an observation. If you are testing the system to ensure it locks out accounts after three invalid attempts, this requires you to actually attempt to log on four times (three times with incorrect credentials and once with the correct credentials). It does not mean you examine the system configuration documentation and note that the system is set to lock out users after three invalid attempts.
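The lockout example lends itself to a repeatable script rather than a paper check. The sketch below models the test against a toy `Account` class, which is only a stand-in for the real logon interface: the class, its threshold, and the string return values are invented for illustration. In practice the four attempts would be made against the live system.

```python
# Hedged sketch of the "test" method for an account lockout control:
# three invalid logon attempts followed by one valid attempt, which
# must be refused if the lockout control is working.

class Account:
    """Toy stand-in for the system under test (hypothetical)."""

    def __init__(self, password, lockout_threshold=3):
        self._password = password
        self._threshold = lockout_threshold
        self._failures = 0
        self.locked = False

    def logon(self, password):
        if self.locked:
            return "LOCKED"
        if password == self._password:
            self._failures = 0
            return "SUCCESS"
        self._failures += 1
        if self._failures >= self._threshold:
            self.locked = True
        return "FAILURE"

def lockout_test(account, valid_password):
    """Exercise the control: three bad attempts, then one good one.
    The control passes only if the valid attempt is refused."""
    for _ in range(3):
        account.logon("not-the-password")
    return account.logon(valid_password) == "LOCKED"
```

Note that the fourth attempt uses the correct credentials on purpose: a system that merely rejects bad passwords, without ever locking the account, fails this test even though every individual rejection looked correct.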

Typical tests include: structural testing of the logical access control and encryption mechanisms; functional testing of the identification/authentication and audit mechanisms; functional testing of the security configuration settings; functional testing of the physical access control devices; penetration testing of the information system and its key components; functional testing of the information system backup operations; and functional testing of the incident response/contingency planning capability.

To reduce the level of effort in testing assessment objects, the assessor should, to the maximum extent possible, reuse test results and evidence from previous security control assessments. This is acceptable when such results are available, there have been no substantive intervening changes to the information system that could invalidate earlier results, and the results are judged to be credible.

There are essentially three depth attributes to testing:

Generalized testing: A test methodology that assumes no knowledge of the internal structure and implementation detail of the assessment object. This type of testing is conducted using a functional specification for mechanisms and a high-level process description for activities. Generalized testing provides a level of understanding of the security control necessary for determining whether the control is implemented and free from obvious errors. Also known as “black box” testing.

Focused testing: Test methodology (also known as “gray box” testing) that assumes some knowledge of the internal structure and implementation detail of the assessment object. This type of testing is conducted using a functional specification and limited system architectural information (e.g. high-level design) for mechanisms and a high-level process description and high-level description of integration into the operational environment for activities. Focused testing provides a level of understanding of the security control necessary for determining whether the control is implemented and free from obvious errors and whether there are increased grounds for confidence that the control is implemented correctly and operating as intended.

Detailed testing: Test methodology (also known as “white box” testing) that assumes explicit and substantial knowledge of the internal structure and implementation detail of the assessment object.

White box testing is performed based on knowledge of how the system has been implemented. It includes the analysis of data flow, control flow, information flow, coding practices, and exception and error handling within the system, to test both intended and unintended software behavior. White box testing can validate whether code has been implemented according to the intended design and whether security functionality has been integrated into the system, and it can uncover exploitable vulnerabilities. White box testing does require access to the source code. Although it can be performed at any time in the life cycle after the code is developed, it is good practice to perform this type of testing during early testing phases.

Detailed testing provides a level of understanding of the security control necessary for determining whether the control is implemented and free from obvious errors and whether there are further increased grounds for confidence that the control is implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the control.

Observation – “O”

In addition to the above formal assessment methods, “random” observation can also play a crucial role in the validation process. Observation is often used in two ways:

The security controls assessor/tester observes the performance of a specific procedure or set of activities.

The security controls assessor/tester observes an unintended security event during the course of the validation testing that affects one or more of the security controls.

As indicated in each of the assessment methods described above, there is also a set of associated attributes, depth and coverage, which help define the expected level of effort for the assessment. These attributes are hierarchical in nature, providing the means to define the rigor and scope of the assessment for the increased assurance needed for higher impact level information systems.

The depth attribute addresses the rigor of and level of detail in the examination, interview, and testing processes. The values for the depth attribute – for examination, interview, and testing alike – are generalized, focused, and detailed.

An additional attribute is “coverage.” The coverage attribute addresses the scope or breadth of the examination, interview, and testing processes, including the number and type of specifications, mechanisms, and activities to be examined or tested and the number and types of individuals to be interviewed. Values for the coverage attribute include representative, specific, and comprehensive.

The appropriate depth and coverage attribute values for a particular assessment method reflect the values needed to achieve the assessment expectations. This is determined by the characteristics of the information system being assessed (including impact level) and the specific security determinations that need to be made.
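One way to picture how the attributes combine is as a (method, depth, coverage) triple assigned per assessment. The mapping below is purely illustrative – the names follow the attribute values defined above, but actual values are set in the assessment plan based on the system’s characteristics, not derived mechanically from impact level:

```python
# Illustrative sketch: depth and coverage attributes escalating with
# impact level for the examine method. The mapping is hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class AssessmentRigor:
    method: str    # "examine", "interview", or "test"
    depth: str     # generalized | focused | detailed
    coverage: str  # representative | specific | comprehensive

# One plausible escalation of rigor as impact level rises (assumption,
# not a prescribed table from the guidance).
RIGOR_BY_IMPACT = {
    "low":      AssessmentRigor("examine", "generalized", "representative"),
    "moderate": AssessmentRigor("examine", "focused", "specific"),
    "high":     AssessmentRigor("examine", "detailed", "comprehensive"),
}
```

Because both attributes are hierarchical, recording them per control in this way makes it easy to verify that every control on a high-impact system was assessed at no less than the rigor the plan calls for.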

Executing the security controls assessment

Successful security control assessments or testing can be broken down into the following steps.

Figure 11: Approach to security control planning

Plan the security controls assessment

The importance of planning as it relates to preparing for an IA control test (sometimes referred to as IA control validation, security test and evaluation (ST&E), or security controls assessment (SCA)) cannot be stressed enough. IA control testing can become a very time-consuming, cumbersome and, all too often, extremely expensive process for an organization – yet it is too critical to ignore. If the time is taken to plan each IA control test, the cost of testing can be more efficiently managed, the testing will be more effective, and the results will be more beneficial to the organization.

1. Determine the security controls to be tested: Chapter 6 defined the steps for security categorization of your information system and the resulting assignment of the security controls essential to protect your information and information system. You also identified any supplementary or compensatory controls required by specific information system security requirements, such as privacy requirements related to medical information.

This establishes your security controls baseline and the list of all controls that must be verified during the security controls assessment process. But not all of these security controls must be tested!

In order to determine the final set of security controls that must be tested, the security controls tester should identify:

Common and inherited security controls

Non-applicable security controls

Security controls that cannot be tested (e.g. as a result of the life cycle status).

The security controls assessor should note which IA controls (or parts thereof) can be designated as common or inherited. The common or inherited controls may have been previously implemented and tested as part of the organization’s enterprise-wide information security program, or there may be a separate plan to assess the inherited controls. Common or inherited security controls can apply to multiple information systems within an organization and the protection measures provided by those controls are inherited by the individual systems under assessment.

The organization must ensure that the impact level associated with the identified common or inherited controls and the level of rigor and intensity of the security control assessments are equivalent to the impact level of the information system(s) inheriting the security controls. In general, the impact level of the common or inherited controls should be the same or higher than the information system being tested.

These are the controls that you do not have to test again, because they are provided as part of the larger system type or environment. For example, if you have several systems in an approved facility (enclave, GSS, SCIF), you will not need to test the access control on the computer room door for each system. Maximizing the use of common controls will reduce costs, make testing more effective, improve the consistency of the testing and results, and reinforce reciprocity.

Once identified, these security controls can be marked as “compliant” based on the existing test results – without requiring additional testing. If the assessment results are not available for the common or inherited security controls, the assessment cannot be considered complete until the assessment results for the common or inherited controls are either made available or these security controls are included in the test plan.

It is important to coordinate this process with appropriate organizational officials (e.g. chief information officer, senior agency information security officer, authorizing official, information system owner). They may be able to obtain the results of common and inherited security control assessments or (if the inheritable security controls have not been assessed or are due to be reassessed) to make the necessary arrangements to include these controls in the current validation.

Hybrid controls may also be identified. These are security controls that are partially common or inherited controls and partially the responsibility of the information system. CP-2, contingency planning, is an example of a hybrid control. There will generally be a master contingency plan developed by the organization for all organizational information systems. However, individual information system owners may adjust, tailor, or supplement the contingency plan with system-specific aspects of the plan. For each hybrid security control, system-specific assessment procedures must be executed and the results retained along with the results from common control assessments. This way, all aspects of the security control will be assessed.

There may also be security controls that are simply not applicable to the information system being tested. For example, a standalone local area network may not require testing of the security controls for interconnectivity to the Internet. If your system does not have voice over IP (VoIP) – take it off the test list. These security controls can be marked as not applicable (NA) and may not require validation at this time, or at all. Again, this should be coordinated with appropriate organizational officials (e.g. chief information officer, senior agency information security officer, authorizing official, information system owner). NOTE: In the DOD, security controls marked as NA must be indicated as such in the plan of action and milestones (POA&M).

Next, the IA controls which cannot be tested – based on the system configuration, life cycle status, or type – should be identified as not requiring validation at this time or at all. As with security controls identified as NA, these must be indicated on the POA&M.

Finally, conduct a gap analysis to determine the delta between the security controls that are common or inherited, non-applicable, or untestable, and those that remain valid for the information system. These remaining security controls will form the baseline for the validation test.
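The gap analysis in this step reduces to set arithmetic over control identifiers. A minimal sketch, assuming invented example sets (the identifiers follow the NIST SP 800-53 naming style, but which controls fall into each category is made up here for illustration):

```python
# Step 1 gap analysis as set subtraction: start from the full security
# control baseline and remove controls that are inherited, not
# applicable, or cannot be tested. The membership of each set below is
# invented for illustration.

baseline = {"AC-2", "AC-7", "AU-2", "CP-2", "IA-2", "PE-3", "SC-7"}

inherited = {"PE-3"}        # e.g. physical access provided by the facility
not_applicable = {"SC-7"}   # e.g. standalone LAN, no Internet boundary
untestable = {"CP-2"}       # e.g. life cycle status precludes testing

# The controls remaining after the gap analysis form the validation baseline.
to_test = baseline - inherited - not_applicable - untestable
```

The excluded sets are not discarded: as noted above, the inherited controls need existing assessment results on file, and (in the DOD) the NA and untestable controls must still appear on the POA&M.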

2. Select the security test procedures: NIST Special Publication 800-53A for federal information systems and the DIACAP Knowledge Service for the DOD provide comprehensive descriptions of test procedures for each security control in the respective baseline. As a starting point, security control assessors should refer to these provided baseline procedures for each of the security controls remaining after completion of the gap analysis in Step 1 above.

Using these procedures as a starting point, the security control assessors can select the appropriate procedural steps within the procedure. The number and complexity of test procedures may increase based on the impact level determined for the information system. The higher the impact level, the greater the rigor and intensity of the implementation and validation process must be. There is an increased level of assurance required in the effectiveness of the security controls being assessed.

It should be noted during the tailoring process of the initial security control baseline that supplementary security controls may have been added and/or the organization may have decided to employ compensating controls. The assessment procedures and associated procedural steps should be adjusted accordingly to reflect these changes.

Organizations should also be aware that certain changes to the security controls (i.e. adding and/or deleting controls or control enhancements from the security plan) may also affect other controls assigned to the information system. This will, in turn, affect the selection of test procedures and procedural steps required to assess the effectiveness of those controls and control enhancements.

In these situations, security control assessors should use the test framework provided by NIST or found in the DIACAP Knowledge Service to develop tailored test procedures for those supplementary or compensatory security controls. The additional assessment procedures should be integrated into the final security controls test plan.

In addition to the development of new or tailored test procedures, the procedures in the NIST guidance or in the DIACAP Knowledge Service may be extended or adapted to address platform-specific or organization-specific dependencies. If these modifications are made, they must also be noted in the final test plan.

This situation arises most often in the test procedures associated with the technical security controls (e.g. access control, identification and authentication, etc.). Detailed test scripts may need to be developed for the specific operating system, network component, middleware, or application employed within the information system to adequately validate certain characteristics of a particular security control. These test scripts can be considered extensions of the NIST and/or DOD-provided test procedures.

During the actual testing of an information system, the test procedures may be applied numerous times to a variety of assessment objects within a particular family of security controls. To save time, reduce costs, and maximize the usefulness of the test results, security control assessors should review the selected test procedures and look for opportunities to combine or consolidate procedural steps. For example, the security control assessors may consolidate interviews with key organizational officials to cover a variety of security-related topics.

It is important to realize that the procedures provided by NIST and DOD may not be complete and certainly may not address all of the possible configurations of an information system. Consequently, the security controls assessor should be sufficiently trained and experienced in order to be able to identify those situations where the provided test procedures are not adequate. An assessor should not blindly follow the procedure without taking time to ensure that the procedure is relevant for the information system and configuration being tested.

For example, organizations often integrate new, emerging technologies that have not yet been integrated into the existing test procedures. These may include new types of applications, the addition of cross domain technologies, and the use of emerging wireless capabilities.

Once the final list of security controls to be tested and the test procedures have been identified, selected – and possibly tailored – you should develop a written test plan.

3. Prepare the security control test tools and gather information: In order to be as time and resource efficient as possible, the security controls assessor should prepare the necessary test tools and gather as much of the required information as possible prior to initiating the actual validation tests.

Testing can be executed manually or by using automated test tools. Manual methods are used most often with people and processes. Automated tools can be very useful in conducting tests of technical security controls.

There is no lack of automated tools and checklists for use in conducting security control tests and evaluations. These tools can facilitate the testing process, but security controls testers should also realize the limitations of these tools. Most security controls assessors will use multiple tools, including commercial and open source tools. Tools generally fall into the following categories:

Information-gathering tools and techniques

Scanning tools

Enumeration tools

Wireless tools

Password auditing tools

Vulnerability scanning tools

Automated exploit tools.

The main reason for selecting a range of tools and methods is that each tool may only detect some percentage of existing vulnerabilities and, at the same time, potentially generate a number of false positives. Most tools excel in certain types of scenarios; however, a single tool may find less than 50% of known vulnerabilities in a typical information system, at least based on these authors’ experience.
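To illustrate why assessors combine tools, the following sketch (tool names and finding IDs are placeholders, not real scanners or CVEs) merges results from several hypothetical scanners and flags single-tool findings for manual review as possible false positives:

```python
# Illustrative sketch: merge findings from several tools and note how many
# tools corroborate each one. Findings reported by only a single tool are
# worth manual review as possible false positives.
from collections import defaultdict

scan_results = {
    "tool_a": {"CVE-0001", "CVE-0002", "CVE-0003"},
    "tool_b": {"CVE-0002", "CVE-0004"},
    "tool_c": {"CVE-0002", "CVE-0003"},
}

corroboration = defaultdict(list)
for tool, findings in scan_results.items():
    for finding in findings:
        corroboration[finding].append(tool)

confirmed = sorted(f for f, tools in corroboration.items() if len(tools) > 1)
review = sorted(f for f, tools in corroboration.items() if len(tools) == 1)
print("corroborated:", confirmed)   # seen by two or more tools
print("manual review:", review)     # single-tool findings: possible false positives
```

A single-tool finding is not necessarily false, of course; the point is only that corroboration across tools raises confidence and focuses the assessor's manual verification effort.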

Vulnerability assessment products (also known as scanners) are used to support rigorous examinations of information systems in order to determine weaknesses that might allow exploitation by a threat agent. These products generally execute a vulnerability analysis in the following steps:

First, passive, host-based mechanisms inspect system configuration files for unwise settings, system password files for weak passwords, and other system objects for security policy violations.

Next, these checks are followed, in most cases, by an active, network-based assessment, which reenacts common intrusion scripts, recording system responses to the scripts.

The results of vulnerability assessment tools present only a snapshot of the information system security status at a point in time. Also, these tools cannot reliably detect an attack in progress, although they may be able to determine that an attack is possible, and perhaps even identify that an attack has occurred.

Some tools may allow experienced users to augment the baseline product by adding tailored scripts or exploits. This can aid in increasing the number of vulnerabilities found, as well as reducing the false positives. Some organizations may have unrealistic expectations about the capabilities of automated tools. They are not assessment “silver bullets” – so keep in mind that they:

do not compensate for weak identification and authentication mechanisms;

cannot conduct investigations of attacks without human intervention;

will not be able to assess the contents of your organizational security policy;

are not intended to compensate for weaknesses in network protocols;

cannot compensate for problems in the quality or integrity of information in the system; and

are not always current with modern network hardware and features.

As technology progresses and with increased emphasis from customers, such as the federal government, the sophistication of security test tools continues to improve. In the meantime, security control testers should focus on identifying the most critical requirements for the selected commercial tools and especially look for flexibility in extending the product as they prepare for their security control evaluations.

NIST and DOD have collaborated on a suite of tools and checklists to assist in vulnerability management and in the evaluation of the security controls. A list of these tools can be found at the Security Content Automation Protocol (SCAP) site, which is part of the National Vulnerability Database (NVD).

Figure 12: SCAP website

Federal agencies, as well as other organizations, can automate much of their technical security control test activities by using SCAP checklists and associated tools. The SCAP checklists have FISMA compliance mappings embedded within the checklists. SCAP-compatible tools, which are listed on the website, are often designed to automatically generate NIST SP 800-53 compatible security control assessment and compliance evidence. Each security control test or check is mapped to the appropriate high level NIST SP 800-53 security controls and, where appropriate, to the assessment procedures found in NIST SP 800-53A. In addition, the SCAP-compatible tools and checklists also contain mappings to other high level policies (e.g. ISO27000 series, DODI 8500.2) and SCAP tools may also output those compliance mappings.
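The rollup idea behind these compliance mappings can be sketched as follows; the check IDs and control mappings below are illustrative assumptions, not taken from a real SCAP checklist:

```python
# Simplified sketch of SCAP-style compliance mapping: each technical check
# carries a mapping to a high level NIST SP 800-53 control, so individual
# check results can be rolled up into control-level assessment evidence.
checks = [
    {"id": "check-pw-length", "control": "IA-5", "result": "pass"},
    {"id": "check-pw-history", "control": "IA-5", "result": "fail"},
    {"id": "check-audit-log", "control": "AU-2", "result": "pass"},
]

def rollup(checks):
    """A control is satisfied only if every check mapped to it passed."""
    status = {}
    for check in checks:
        ok = check["result"] == "pass"
        status[check["control"]] = status.get(check["control"], True) and ok
    return status

print(rollup(checks))  # {'IA-5': False, 'AU-2': True}
```

Real SCAP-compatible tools perform this mapping automatically, as described above, and may also emit mappings to other policy frameworks.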

Table 19: Things to look for in an assessment tool

The ability to keep the assessment tool up-to-date with the latest vulnerabilities easily, efficiently, and economically – environments change constantly and the assessment tool must be able to adjust as well.

A secure interface with critical database software, such as Oracle, SQL Server, or MS Access, to further analyze security data and/or tie vulnerability assessment into the back office processes.

Integration with commonly used reporting tools to provide the ability to save, export, print, or email security reports.

Immediate availability of pre-formatted reports that can be customized to fit the needs of the organization.

The capability to intelligently assess security across multiple platforms (UNIX, NT, NetWare, and OpenVMS) and correlate that information into comprehensive report formats.

A requirement for only minimal resources, such as network bandwidth – tools that execute on servers through agents typically require less bandwidth, as only the report needs to be transmitted over the network.

Develop and/or select test scripts: Most of the controls you will test on a system are not technical; most of the controls that support an information system fall into the operations and management categories. Fortunately, extensive guidance is usually available on how to validate these controls and (sometimes) on what evidence is required for each control.

If you are testing an information system for a federal agency, you are required to follow NIST guidelines. You will find test procedures in NIST SP 800-53A for most of the operational, managerial, and even technical controls. If you are testing a DOD system under DIACAP, you will be using the DIACAP validation procedures provided on the DIACAP Knowledge Service. If you are testing a National Security System (NSS), then you will use the CNSS 1253A test procedures. For certification purposes, even after you have completed testing of all the baseline controls, you may still be required to perform system component and network assessments depending on the architecture of the information system.

Once you know all the components you are going to test on the system, you should develop/assign a test script (automated tool or manual check) to each component. For example, if you are testing a Windows Server® 2003 machine with IIS 7.0 and SQL Server 2005, then you will need a test script for each of the named components. If the components connect through the network, you will also want to include a network vulnerability scanner and possibly an application and database scanner.
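A minimal sketch of this script assignment, using the example components above (the script names are placeholders, not real DISA script names):

```python
# Hypothetical mapping of system components to assigned test scripts.
# Script and scan names below are illustrative placeholders only.
components = {
    "Windows Server 2003": ["os_srr_script"],
    "IIS 7.0": ["web_server_checklist"],
    "SQL Server 2005": ["database_srr_script"],
}

# Components that communicate over the network also get a network-level scan,
# as described above. Here every component is assumed to be networked.
networked = {"Windows Server 2003", "IIS 7.0", "SQL Server 2005"}
for name in components:
    if name in networked:
        components[name].append("network_vulnerability_scan")

# Every component should now have at least one assigned script.
assert all(components.values())
print(components)
```

Keeping this assignment explicit in the test plan makes it easy to confirm that no component will be left untested.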

If you are testing information systems within federal agencies or DOD, you will only be authorized to use certain standards, tools, and scripts for your testing. Be sure to check out the list of authorized tools as you are in the planning process.

You should always include the list of tools you are going to use to test the systems in the test plan, which should be pre-approved by the AO or DAA. Some examples of authorized/provided testing tools include:

Defense Information Systems Agency (DISA) Security Technical Implementation Guides (STIGs), and NSA Systems and Network Analysis Center (SNAC) guides: configuration standards for DOD IT, IA and IA-enabled devices/systems.

A security checklist (sometimes referred to as a lockdown guide, hardening guide, or benchmark configuration) is essentially a document that contains instructions or procedures to verify compliance to a baseline level of security.

Security readiness review scripts (SRRs) test products for STIG compliance. SRR scripts are available for all operating systems and databases that have STIGs, and web servers using IIS. The SRR scripts are unlicensed tools developed by the Field Security Office (FSO) and the use of these tools on products is completely at the user’s own risk.

All DISA STIGS, checklists, and other test resources can be found at: http://iase.disa.mil/stigs/index.html.

All SRR scripts can be found at: http://iase.disa.mil/stigs/SRR/index.html.

Gather essential information and artifacts: Next to planning for the security controls testing, identifying, collecting and reviewing artifacts73 may be the most important step in the authorization process. Before you can properly conduct security control testing, you will need to have a very good understanding of the information system or component you are going to test. Based on the type of test you plan on conducting, it is important to compile all the information you can about the information system.

A good way of knowing what you will need to gather is to co-develop a checklist of the basic documents (or security artifacts) necessary to understand the mission of the organization, the compliance environment (to include the security categorization), and the specifics about the system or component you are going to test. At a minimum, you will need:

A comprehensive system description. At a minimum the system description should include:

• full system name and current version;

• a system identifier (OMB-300 or agency/component registration number);

• system or security categorization;

• data types (processed, stored, transmitted, and managed);

• general description and system purpose;

• system environment.

73 The terms artifact and evidence are often used interchangeably. For the purposes of our text, we will use artifact.

System contact information, with at least:

• system owner/data owner (with related contact information);

• technical POC;

• information security POC; and,

• the system development life cycle (SDLC) or system life cycle (SLC) stage.

A data flow diagram: A data flow diagram provides an architectural view of the systems, how they interact, and a graphical representation of how data flows from component to component of the system, illustrating the software and system functions.

A network diagram: A network diagram includes all system network devices with all related interconnections. The network diagram should provide information not only about the network (routers, firewalls, etc.), but should include information about communications security, such as VPN and link encryption.

Software inventory: A listing of all the software components that support the system, including:

• full software name with current version;

• patch level if applicable;

• simple description of how the software is used as it relates to the overall information system.

Hardware/firmware inventory, including:

• patch level if applicable;

• simple description of how the hardware/firmware is used as it relates to the overall information system.

All documents related to the certification and authorization of the system. These include but are not limited to:

• previous test results (to include annual assessments and audits);

• system categorization documents (FIPS 199, CNSS 1199, NIST 800-60);

• for DOD: system identification profile, DIACAP implementation plan, security artifacts;

• the systems security plan (SSP), to include common controls and system specifics;

• risk assessment (RA);

• configuration management plan (CMP), change control board charter, and related meeting notes;

• business impact assessment (BIA), continuity of operations plan (COOP), disaster recovery plan, and/or contingency plan;

• privacy impact assessment (PIA);

• e-authentication risk assessment;

• system standing operating procedures (SOP);

• plan of actions and milestones (POA&M);

• policies, procedures, guidelines, and any other documentation that support the security control compliance related to the system.

If you are conducting a test for the government in accordance with NIST standards, a typical system description includes:

• the name of the information system;

• a unique identifier for the information system (OMB-300, or agency/component registration number);

• the status of the information system with respect to the system development life cycle (phase);

• the name and location of the organization responsible for the information system;

• contact information for the information system owner or other individuals knowledgeable about the information system;

• contact information for the individual(s) responsible for the security of the information system;

• the purpose, functions, and capabilities of the information system;

• the types of information processed, stored, and transmitted by the information system;

• the boundary of the information system for operational authorization (or security authorization);

• the functional requirements of the information system;

• the applicable laws, directives, policies, regulations, or standards affecting the security of the information and the information system;

• the individuals who use and support the information system (including their organizational affiliations, access rights, privileges, and citizenship, if applicable);

• the architecture of the information system;

• hardware and firmware devices (including wireless);

• system and applications software (including mobile code);

• hardware, software, and system interfaces (internal and external);

• information flows (i.e. inputs and outputs);

• the network topology;

• network connection rules for communicating with external information systems;

• interconnected information systems and unique identifiers for those systems;

• encryption techniques used for information processing, transmission, and storage;

• public key infrastructures, certificate authorities, and certificate practice statements;

• the physical environment in which the information system operates; and

• web protocols and distributed, collaborative computing environments (processes, and applications) (NIST SP 800-37).

Understand the organization and mission: To develop a comprehensive test plan that tests the system and each of its components against the control objectives, and to provide the AO with risk-based information specific to their organization, you must understand as much as possible about the risk associated with the organization. That, in turn, requires a strong understanding of the organization and its related mission.

You must know how the organization works and how the system or facility you are going to test supports the organization and mission. You can gather much about an organization from public websites, news releases, and information provided by the system owners or stakeholders.

If the organization’s information security program is fairly mature, critical assets that support the organization and mission will be identified and documented as such. Understanding the organization and mission will also help to provide information on the scope of the test. For instance, if you have been given very limited time or resources to conduct an assessment of the organizational assets – you should choose the assets that are the most critical in the boundary you are testing first.

Understand the compliance environment: Understanding the compliance environment can mean several things. It includes understanding which of the accreditation methods will be used, such as NIST, NIAP, DIACAP, DCID, or CNSS.

It also requires you to understand any other type of compliance requirements, such as HIPAA, Sarbanes Oxley, FISCAM, PCI, or even agency specific requirements. This will allow you to identify the baseline and tailored controls that will need to be tested if they are not explicitly outlined in the system security plan (SSP).

You should also understand any supplemental and compensating controls that may be required for the system you are going to test. For example, you may plan on testing a DOD system and the military component is the US Army. So you will not only be responsible for testing for the DOD baseline IA controls, but also the US Army IA controls (AR 25-2/380-5). If the system contains personally identifiable information (PII) and medical records, you may also be required to test OMB privacy controls and HIPAA controls. The system security plan should clearly identify the baseline, tailored, supplemental, compensating, or additional requirements, but if it does not you will still be required to include them in the test plan.

Understand the system and related components: In order to test an entire system, you must first decompose the system into its various components and sub-components. Most systems consist of several components and each component must meet the security control baseline and tailored baseline set forward by the organization.

For example, if you had a web-based system, you would most likely have a web server, application server, and a database – each system would also have hardware components, such as servers, telecommunications/network infrastructure and possibly external storage. Next, on each one of the hardware components, you would have an operating system (OS) and possibly several supporting applications.

By identifying every component a system consists of, you can more easily ensure that you are planning to test every control related to the system or facility. For example, you may have to test the identification and authentication (I&A) for the OS, the application, the web server, and the database in order to demonstrate overall compliance of the I&A control for an individual information system.

Decomposing the system into various components and sub-components also allows you to more easily choose the test scripts and tools you will use for the testing.
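The coverage idea behind this decomposition can be sketched as a simple check: a control such as I&A is compliant for the system only when it is demonstrated on every component. The component names and results below are illustrative:

```python
# Sketch of a control-coverage check after decomposing a system into its
# components. Overall compliance of a control (here, I&A) requires a
# passing result on every component.
components = ["operating_system", "web_server", "application", "database"]
ia_results = {
    "operating_system": True,
    "web_server": True,
    "application": False,   # hypothetical I&A finding on the application tier
    "database": True,
}

def control_compliant(components, results):
    """Overall compliance requires a passing result on every component."""
    return all(results.get(component, False) for component in components)

print(control_compliant(components, ia_results))  # False: one component failed
```

A missing result is treated as a failure here, which mirrors the planning goal above: no component of the system should escape testing.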

Finalize the test team: Now that you understand the system, the organization, and have performed all the necessary preliminary descoping of controls, it is time to select team members and finalize the test team. Although tests may be limited by time and budget, it is important to put together a team that will provide the most control coverage and understanding of the controls.

This will require a mix of management, compliance, and technical staff. At a minimum you will want to have a team leader that has conducted several similar tests in the past. In addition, just as you tailor specific test scripts for each component of the system testing, you may need to select the personnel that relate to specific test objectives.

NOTE: If you are conducting testing for the DOD, your assessors/testers will be required to meet the educational and certification requirements in DOD 8570.01-M. If you are testing for any other organization, ensure you have undertaken due diligence in finding out what specific requirements the organization has for assessors/testers.

It has been our experience that a good testing team needs more than just strong technical backgrounds; it also needs great verbal communication and interpersonal skills. Remember, a tester in the federal government is not just supposed to find weaknesses and vulnerabilities, but also to provide insight into how those weaknesses and vulnerabilities can be mitigated and remediated.

4. Finalize the test plan: Security controls assessors have flexibility in organizing a security assessment plan that meets the needs of the organization and that provides the best opportunity for obtaining the necessary evidence to determine security control effectiveness. To save time, reduce assessment costs, and maximize the usefulness of assessment results, security control procedures (or parts of procedures) can be combined or consolidated whenever possible or practicable. For example, you can plan to consolidate interviews with key organizational officials dealing with a variety of security-related topics.

Assessors may have other opportunities for significant consolidations and cost savings by organizing groups of related policies and procedures that could be examined as a unified entity in the plan. Obtaining and examining configuration settings from similar hardware and software components within the information system is another example that can provide significant assessment efficiencies.

Also, consider the sequence in which security controls are assessed in the planning process. The results of testing some security controls before others may provide information that facilitates the understanding and assessment of other controls. For example, security controls such as CM-2 (baseline configuration), CM-8 (information system component inventory), PL-2 (system security plan), RA-2 (security categorization), and RA-3 (risk assessment) produce general descriptions of the information system. Assessing these security controls early in the process may provide a basic understanding of the information system that can help in assessing other security controls.

Planning will not only save time, money, and frustration, and increase performance; it will also help to ensure your staff fully understand and can support the scope and goals of the test. This will allow you to ensure the test provides full coverage of the information system controls and that time is not wasted testing security controls that are either out of scope or non-applicable. As a final note, planning for the security controls testing and actually executing the test plan are not always sequential in nature. So, although the security control testing steps are presented in a sequential manner, several objectives can be handled simultaneously.

Based on the planning steps you have conducted thus far, you should have sufficient data to prepare the test plan. The test plan needs to answer the following questions:

What type of test is being conducted (e.g. independent validation/certification/annual assessment/audit, etc.)?

What are the key objectives?

What is the scope of the test (this should correlate directly to the accreditation boundary)?

What security or IA controls will be covered?

Who will be conducting the test? What are the team members’ roles and responsibilities?

Where will the test(s) be conducted?

How long will the test be? What are the key dates for testing, interviews, kick-off, and out-brief meetings?

What is the testing team’s understanding of the rules of engagement?

What test scripts and tools will be used (automated and manual)?

What assumptions does the test team have?

Is there any miscellaneous information related to the test, such as terms and conditions of the test, who will receive the test report, how the team will deal with false positives, and customer review period?

Once the security assessment plan is completed, the plan is reviewed and approved in writing by appropriate organizational officials. This helps to ensure that the plan is complete, consistent with the security objectives of the organization and the organization’s assessment of risk, and cost-effectively uses the resources set aside for the assessment.

Federal agencies often have their own mandatory format for writing the test plan. We have expanded on the content of the test plan in Chapter 10 and included examples on the CD accompanying this text. One is a written test plan based on the NIST SP 800-53 standards. Another is an example of a security requirements traceability matrix (SRTM).

The SRTM, which can be a subset of the overall requirements traceability matrix (RTM) for the information system, can be used to formally document the specific information system security design requirements and to ensure that these are met. The SRTM document should be formatted so that it can be easily evolved into later serving as the security control test plan.
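A minimal sketch of an SRTM row layout follows, assuming the columns shown; a real SRTM would follow the agency's mandated format and would typically carry more fields:

```python
# Illustrative SRTM skeleton: each security requirement traces to a control,
# a test method, and (after testing) a result. Column names are assumptions.
import csv
import io

rows = [
    {"requirement": "Enforce unique user identifiers", "control": "IA-2",
     "method": "test", "result": ""},
    {"requirement": "Review audit logs weekly", "control": "AU-6",
     "method": "interview", "result": ""},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["requirement", "control", "method", "result"])
writer.writeheader()
writer.writerows(rows)
srtm_csv = buffer.getvalue()
print(srtm_csv)
```

Leaving the result column empty at planning time lets the same document evolve into the security control test plan and, later, the record of test outcomes.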

Execute the security controls test

After completion of the test plan, identification and preparation of the test tools and methods, and development of any additional test scripts, the actual test is scheduled and testing assignments are made.

Figure 13: Approach to security control testing

5. Execute the assessment: It is often useful to conduct a dry run of the test on a test bed or virtual information system/network prior to actually putting “hands on” the operational information systems. A dry run verifies that all test objectives, test procedures, and test scripts are appropriate for the system being tested.

The security control assessment team may also include the system owner, ISSO, system and/or network administrators in the dry-run test to assist with refining the test plan. If any of the information contained in the plan is inaccurate or if the proposed tools/methods are not appropriate for the target information system(s), the plan can be corrected prior to formally executing it.

The actual execution phase of the security controls testing can last a couple of days to several weeks – depending upon the size, complexity, and life cycle of the information system. However, the security control assessment team should try to minimize the time on site with the organization as much as possible.

Hold a kick-off meeting: Upon arriving at the test location, it is often useful for the security controls assessor to provide the organization’s leadership with an overview of the proposed test plan. The control testing kick-off meeting plays an important role in ensuring that the stakeholders of the system being tested have a good understanding of the entire test plan.

This includes understanding what is going to be required of them and their staff, the dates/times of the testing, and the components that will be tested; it also allows the rules of engagement to be formally vetted and approved. The kick-off meeting is also a good opportunity to obtain answers to any open questions and gather any documentation that you have not received up to this point.

It is always good to go into the kick-off meeting with a project management plan of all of the testing components, dates, who is assigned from the testing team and who is the primary POC from the stakeholders being tested. If you still have a few blanks in your project management plan (PMP), the meeting provides an opportunity to fill those in with the stakeholders in the same room.

Also, ensure you clarify any operational issues regarding the time required for the assessment and the assets being tested. Explain whether your test will be obtrusive or non-obtrusive, the tools you will be running, and the average time each test takes, and provide the management stakeholders with the contact information of the test management and team leaders should they need to contact anyone during the test.

We always find it works well to include the above information in a presentation where team members can take notes and interact with the system stakeholders. This also helps put “names to faces,” which allows for a smoother interview process and configuration-related testing. A sample entrance briefing is included on the CD accompanying this book.

Execute the testing: The security control assessment team and the assigned system personnel will execute the tests in accordance with the test procedures described in the plan. System administrators or other technical personnel should ideally be present during the execution of the actual test. They serve the dual purpose of witnessing the tests and providing subject matter expertise on the target system(s). In fact, multiple technical personnel may be required to execute and/or assist with the test. For example, a team member with training and experience in the UNIX operating system might be needed for specific test procedures or one with Oracle database administration training and experience might be necessary for database testing.

Prior to executing the test plan, the security control assessor and the assessment team should work with system personnel to ensure the following:

All system components scheduled for testing are available and operational.

Required organizational personnel are available to participate in the process.

During our careers, we have performed hundreds, possibly thousands of security control tests. One thing has always been consistent across all of our experiences with security control testing: the results obtained from the test often depended on the assessor or tester and not necessarily on the actual requirements. This means that results could vary greatly depending on the individual(s) executing the tests.

Recently, Waylon had a chance to work with many of the brilliant people at NIST and a group of very talented security professionals collaborating in the development of NIST SP 800-53A. Now, instead of a tester spending hours or even days describing a unique step-by-step process on how you should execute your testing, NIST and its team of professionals have already spent thousands of hours refining a process that will allow us to provide a consistent, effective – and hopefully successful – control test.

However, to most effectively use NIST SP 800-53A, you should understand some of the terminology used throughout the publication:

Assessment objects: identify the specific items being assessed and include specifications, mechanisms, activities, and individuals.

Specifications: the document-based artifacts (e.g. policies, procedures, plans, system security requirements, functional specifications, and architectural designs) associated with an information system.

Mechanisms: the specific hardware, software, or firmware safeguards and countermeasures employed within an information system.

Activities: the specific protection-related pursuits or actions supporting an information system that involve people (e.g. conducting system backup operations, monitoring network traffic, or exercising a contingency plan).

Individuals: people, or groups of people, applying the specifications, mechanisms, or activities described above.

Assessment methods: define the nature of the assessor actions and include examine, interview, and test.
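The relationship between assessment methods and assessment objects can be sketched as a simple lookup; the groupings below follow the definitions in the text, and the structure itself is an illustration, not an official NIST SP 800-53A schema.

```python
# Hypothetical sketch of the NIST SP 800-53A vocabulary.
# Groupings follow the definitions given in the text above.
ASSESSMENT_OBJECTS = {
    "specifications": "document-based artifacts (policies, procedures, plans)",
    "mechanisms": "hardware, software, or firmware safeguards",
    "activities": "protection-related actions involving people",
    "individuals": "people applying specifications, mechanisms, or activities",
}

# Which objects each assessor action is typically applied to.
ASSESSMENT_METHODS = {
    "examine": ["specifications", "mechanisms", "activities"],
    "interview": ["individuals"],
    "test": ["mechanisms", "activities"],
}

def objects_for_method(method):
    """Return the assessment objects a given method is applied to."""
    return ASSESSMENT_METHODS[method]
```

For example, `objects_for_method("interview")` returns only `["individuals"]`, reflecting that interviews target people rather than documents or mechanisms.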

As the test is being executed, the security control assessment team should document the initial results for each test executed. Although some preliminary analysis can be conducted, it is important to take the time to carefully review actual against expected results in order to correctly determine the compliance status of the security control.

After the identified system components have been tested and initial results documented, an informal exit brief should be given to the organization’s leadership and designated representatives. (There is a sample exit briefing on the CD accompanying this text.) This informal exit brief is intended to provide the organization, especially the system owner and other system personnel, with preliminary findings (e.g. “not met” test objectives) resulting from the test execution.

Some assessment teams may even choose to ensure that functional area results are shared, integrated and incorporated into the subsequent assessment action strategy on a daily basis. The primary means for this information sharing will be a daily “hotwash” meeting. The hotwash is usually held at the end of each work day and attended by all team members and representatives from the assessed organization. The group will review the results obtained during the day and discuss plans and strategies for the following day.

Early disclosure of the findings, particularly any critical findings, gives the organization the opportunity to take immediate corrective action. In fact, many corrective actions can (and in some cases should) be implemented in real time during the security control tests, before the report is finalized. This allows corrected findings to be documented in the final security control test report.

Analyze, document, and report the results in the security assessment report (SAR)

At the conclusion of the security controls assessment, the assessment team should analyze the results, document the findings, and prepare a security assessment report (SAR).

Analyze the results: One of the most difficult tasks associated with performing a security control assessment is analyzing the results. After you compile all of the results from all of the tests (manual and automated), you will often find that some of the actual results do not match up completely with the stated expected results.

Basically, it is highly likely that you will have some false positives. These will need to be evaluated and weeded out of your test results before putting them into the final report.

For example, the assessors may have interviewed the system administrator about the system access policy and – based on the interview – the control appeared to be compliant; however, your automated test scripts found several issues in the configuration of the system's access policy. This is where your team's experience and raw ability come into play. An experienced tester will note the discrepancy and be able to evaluate its applicability and validity.

Results of the control assessment may ultimately influence the content of the security plan and the plan of action and milestones (POA&M). As a result, it is important for the information system owner to review the findings of the assessor with the designated organizational officials (e.g. authorizing official, chief information officer, senior agency information security officer, mission/information owners) and determine the appropriate steps required to correct weaknesses and deficiencies identified during the assessment.

Before applying the actual determination of compliant, non-compliant or not applicable to the security control, it may be useful to initially use tags of “satisfied” and “other than satisfied” for the security controls assessment results. This provides visibility for organizational officials of specific weaknesses and deficiencies in the information system and facilitates a disciplined and structured approach to reviewing risk mitigation in accordance with organizational priorities.

For example, the information system owner, in consultation with designated organizational officials, may decide that certain assessment findings marked as “other than satisfied” are of an inconsequential nature and present no significant risk to the organization. Alternatively, the system owner and organizational officials may decide others are significant, requiring immediate remediation actions.

In all cases, the organization reviews each finding of “other than satisfied” and applies its judgment with regard to the severity or seriousness of the finding (i.e. the potential adverse effect on the organization’s operations and assets, individuals, other organizations, or the nation), and whether the finding is significant enough to be worthy of further investigation or remedial action.

Senior leadership involvement in the mitigation process may be necessary in order to ensure that the organization’s resources are effectively allocated based on organizational priorities. Resources should be provided first to the information systems that support the most critical and sensitive missions for the organization or for correcting the deficiencies that pose the greatest degree of risk.

Ultimately, the assessment findings and any subsequent mitigation actions initiated by the information system owner, in collaboration with designated organizational officials, trigger updates to the risk assessment and the security plan. Therefore, the key documents used by the authorizing official to determine the security status of the information system (i.e. security plan with updated risk assessment, security assessment report, and plan of actions and milestones) are updated to reflect the results of the security control assessment.

The security assessment report (SAR): The SAR includes all of the information from the assessor (in the form of assessment findings) necessary to determine the effectiveness of the controls employed in the information system and the organization's overall security effectiveness determination. The SAR is an important factor in an authorizing official's determination of risk to organizational operations (i.e. mission, functions, image, or reputation), organizational assets, individuals, other organizations, and the nation. This final step is as important as – if not more important than – the actual execution of the assessment.

The report should follow guidelines, such as those in NIST Special Publication 800-53A, or utilize a customer-specified report format. It should be written clearly and comprehensively – and remember, the target audience is management. At a minimum, the SAR should include:

Security control number or IA control number: The security control number refers to the families of security controls as described in the baseline controls being used.

Finding number: A sequential number assigned to each test finding that warranted a recommendation.

Risk level: Level of risk if the finding is not mitigated. This is usually documented as low, moderate, or high.

Requirement: The security control from NIST SP 800-53, DODI 8500.2, CNSS 1253, or another applicable source.

Recommendation: The recommended mitigation.
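One way to keep these minimum SAR fields consistent across findings is to capture each finding as a structured record. The field names below mirror the list above; the format itself is illustrative, not a mandated SAR schema.

```python
from dataclasses import dataclass

# Illustrative record for one SAR entry; fields mirror the minimum
# SAR contents listed in the text. Sample values are hypothetical.
@dataclass
class SarFinding:
    control_number: str      # security control / IA control number, e.g. "AC-2"
    finding_number: int      # sequential number of the finding
    risk_level: str          # "low", "moderate", or "high"
    requirement: str         # source of the control (NIST SP 800-53, etc.)
    recommendation: str      # the recommended mitigation

finding = SarFinding(
    control_number="AC-2",
    finding_number=1,
    risk_level="moderate",
    requirement="NIST SP 800-53",
    recommendation="Disable inactive accounts after 90 days",
)
```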

You can find examples of security control test report templates on the CD accompanying this text. The SAR is used to present the following data to the certifying authority and the authorizing official:

Results of the security controls assessment (i.e. a determination of the extent to which the security controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system); and

Recommendations for correcting or mitigating deficiencies in the security controls and reducing or eliminating identified vulnerabilities.

The final report should reflect the fact that you conducted an assessment – not an audit. It should focus on describing the findings and provide recommendations for remediation. It should NOT point out wrongdoing or attribute failures to any specific individual. The report, however, should clearly state the outcomes of the security control tests.

There are four possible outcomes to the test for each individual security control:

Compliant (C) – the result of the testing demonstrates that the security control fully meets the expected results.

Non-compliant (NC) – the result of the testing demonstrates that the security control does not meet the expected results.

Not applicable (NA) – the security control is not applicable to the information system and/or its current configuration.

Not tested (NT) – the security control cannot be tested at this time – or at all.
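Summarizing these four outcome codes across all tested controls is a common first step in preparing the report; a minimal tally sketch (the control identifiers are hypothetical sample data):

```python
from collections import Counter

# The four outcome codes defined in the text.
OUTCOMES = ("C", "NC", "NA", "NT")

def summarize(results):
    """Count test outcomes, rejecting any code outside the four defined."""
    counts = Counter()
    for control, outcome in results:
        if outcome not in OUTCOMES:
            raise ValueError(f"unknown outcome {outcome!r} for {control}")
        counts[outcome] += 1
    return counts

# Hypothetical sample results for three controls.
counts = summarize([("AC-2", "C"), ("AU-6", "NC"), ("PE-3", "NA")])
```

A tally like this also makes it easy to spot, at a glance, how many non-compliant findings will flow into the POA&M.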

The results of all executed security control tests will be formally documented. These results are also summarized in a written test report that is provided to the requesting organization upon test completion. Further details on the preparation of the test report are provided in Chapter 10 and a template for a report format is on the accompanying CD.

The purpose of a test report is to not only document the security control test results, but also to provide results of analysis of the testing, i.e. what failed and why. The test report may also make recommendations for further testing or engineering efforts.

DEVELOP the plan of action and milestones (POA&M)

At its most basic, the plan of action and milestones (POA&M), also referred to as a corrective action plan, is a management tool that outlines identified information system security program and information system weaknesses, along with the actions necessary to mitigate them. For many, the term POA&M strikes fear and loathing into their hearts. However, the POA&M is actually a positive resource for planning and monitoring corrective actions. It defines roles and responsibilities for solving problems; assists in identifying security funding requirements; tracks and prioritizes resources; and informs decision makers.

A POA&M also provides an organization with the capability to highlight progress and demonstrate improvements in the quality of its information security program. It can also serve as a management tool and a point of comparison for assessing the organization's overall maturity against the security status of other federal agencies.

Importance of the POA&M – $$$$

The Office of Management and Budget (OMB) uses the POA&M to assess the state of the federal government’s information system security and to assist OMB in providing oversight of the federal government’s IT investments. According to OMB direction, agencies are required to link their POA&Ms to the IT budgeting process.

The official word is that OMB penalizes agencies that do not adequately define and implement a plan to mitigate information system security weaknesses. Agency IT programs could be placed at risk and lose critical IT and security funding. In the case of major IT investments, OMB requires the POA&Ms to be cross-referenced through answers to an Exhibit 300 and Exhibit 53.74

Table 20: Type of information from Exhibit 300 and Exhibit 53 included on a POA&M

The security costs column captures the amount required for security based on the following question: “What is the total dollar amount allocated to IT security for this investment?” The response to this question in this section of the Exhibit 300 should align with resources required to remediate the weaknesses reported in the POA&M for that information system.

74 Exhibit 300, or the capital asset plan and business case, is a format used by OMB to make quantitative decisions about budgetary resources consistent with the Administration's program priorities, and qualitative assessments about whether the agency's programming processes are consistent with OMB policy and guidance. Exhibit 300 also supports compliance with the requirements of the Clinger-Cohen Act of 1996. Exhibit 53 summarizes federal agency IT spending on all major and non-major IT investments, and is provided to OMB annually as part of the budgeting process.

75 Exhibit 300 is submitted for every major system or IT investment. See definitions for the criteria for consideration as a major system.

76 The unique project identifier (UPI), a 23-digit number, must be created for all IT projects. The UPI is based, in part, on the sub-functions found in the business reference model. It will be used on the specific Exhibit 300 for the investment as well as the Exhibit 53.

How the POA&M fits into the information system security evaluation

Congress uses the agency’s own Inspector General’s evaluation of its POA&M process in determining the congressional security report card. In fulfilling its oversight role, Congress obtains information about an agency’s information system security activities and its FISMA compliance.

Agencies release to Congress, as requested, the following information from their POA&Ms: (1) type of weakness; (2) key milestones; (3) any milestone changes; (4) source of identification of the weakness; and (5) the status of the weakness.

In addition, having a sound POA&M process is essential to achieving a high score on the President’s Management Agenda (PMA). The PMA is a strategy for improving the management and performance of the federal government.

Benefits of the POA&M process

Although most agencies view POA&Ms as a necessary evil, there is actually benefit to be achieved from the process of developing and tracking POA&Ms. A good POA&M process has other benefits, such as:

Producing valuable trending and analysis: The POA&M can be used as a historical data source for management on the costs, effort, and time to mitigate system security weaknesses. The type of weaknesses can be tracked, as well as the rate of their recurrence. The POA&M provides the ability to conduct analyses by system, program, or across the entire agency.

Supporting IT business cases: A comprehensive POA&M that contains accurate and reliable financial estimates can provide traceability, as well as the justification for additional security funds.

Maintaining institutional knowledge: A well-executed and maintained POA&M prevents reliance on specific individuals as “the single point of failure” to retain and communicate historical information relevant to an information system or an entire program.

Facilitating effective communication: The POA&M can serve to facilitate communication and coordination among agency personnel on information system security issues, such as the chief information officer (CIO), information system security officer (ISSO), budget personnel, or program officials.

As a result, the POA&M can provide agencies with tighter control over their IS security program and increase the efficiency of the agency’s IS security management.

The POA&M process of weakness remediation

Weakness remediation is a cyclical process with the steps depicted in the following figure. What follows is a brief description of this process. The security controls assessment process was completed in the preceding step.

Note that an assessment is not the only means of identifying system security weaknesses that should eventually wind up on the POA&M. Other means of identifying system security weakness include:

IG inspections

Government Accountability Office (GAO) audits

Self-assessments

Independent assessments

Penetration tests.

Figure 14: Weakness remediation cycle

The weaknesses are analyzed to determine whether they pose a risk. In some cases, a specific corrective action may not be necessary: analysis may reveal that the weakness can be considered an acceptable risk. In that case, management should sign off that the risk is acceptable, and the weakness does not have to be included in the POA&M.

Even so, a record of the risk acceptance must be documented, particularly since risks can change over time. The agency should record the rationale for accepting the risk, and the weakness should be reviewed periodically for changes in the acceptable risk level.
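A risk-acceptance record of the kind described above can be sketched as a simple structure; the field names are assumptions for illustration, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical risk-acceptance record capturing the elements the text
# calls for: a documented rationale, a management sign-off, and a
# scheduled periodic review (since risks can change over time).
@dataclass
class RiskAcceptance:
    weakness_id: str
    rationale: str
    accepted_by: str   # the manager signing off on the acceptance
    next_review: date  # date the acceptance should be revisited

acceptance = RiskAcceptance(
    weakness_id="W-014",
    rationale="Compensating physical controls are in place",
    accepted_by="J. Smith, System Owner",
    next_review=date(2025, 1, 15),
)
```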

If analysis of the weakness indicates that action is required to mitigate risk, the agency should identify both the required corrective actions and the required resources. There are often multiple ways to mitigate a weakness, so the various methods should be analyzed for appropriateness in resolving the weakness fully and viewed for long-term implications. At this time, costs for each corrective action plan option must be estimated and analyzed to determine short-term and long-term solution capabilities.

The agency can usually obtain resources for weakness remediation through the following methods:

Using current resources marked for security management of the system or program.

Reallocating existing funds or personnel.

Simply requesting additional funding.

If new or additional funding is needed, it is imperative to take advantage of the capital planning process to request the necessary funds. Integrating IT security costs into the overall capital planning process ensures that security is included in the agency’s enterprise architecture, supports business operations, and is funded within each information system over its life cycle.

At its most basic level, weakness prioritization focuses on two essential criteria: system categorization and weakness risk level. System categorization should already have been determined as part of the system's risk assessment based on Federal Information Processing Standards (FIPS) Publication 199, Standards for Security Categorization of Federal Information and Information Systems. According to FIPS 199, systems are categorized as high, moderate, or low, according to confidentiality, integrity, and availability criteria.

The second criterion for basic weakness prioritization is the potential impact on the organization if the weakness is left unresolved. The potential impact of each identified weakness will be listed in the POA&M as high, medium, or low.
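The two prioritization criteria can be combined into a simple ranking. The numeric weights below are illustrative only; they are not drawn from FIPS 199 or any other standard.

```python
# Map the categorization/impact labels used in the text to simple weights.
# (Both "moderate" and "medium" appear in the text, so accept both.)
LEVELS = {"low": 1, "moderate": 2, "medium": 2, "high": 3}

def priority(system_categorization, weakness_impact):
    """Combine FIPS 199 system categorization with weakness impact.

    Higher score = remediate sooner. The multiplicative weighting is an
    illustrative choice, not a mandated formula.
    """
    return LEVELS[system_categorization] * LEVELS[weakness_impact]

# A high-categorization system with a high-impact weakness outranks
# a low-categorization system with a medium-impact weakness.
assert priority("high", "high") > priority("low", "medium")
```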

The estimated date of completion for the mitigation or removal of each weakness should be based on realistic timelines for resources to be obtained and the associated steps to be completed. Be sure to consider that while it may take 30 days to remediate a specific weakness individually, it may be impossible to address all of these weaknesses concurrently, particularly if they require the same resources. The completion date should ideally be based on the outcome of prioritization decisions and resource availability.

The POA&M provides structure and consistency in the presentation of information about system security weaknesses and the mitigation plan. As a result, you can use the POA&M as a means of documentation, as well as a means to track completion of mitigating actions.

The information in the POA&M should be maintained continuously. Both DOD and the federal government require a quarterly status report to communicate overall progress in identifying and mitigating weaknesses. The table below shows the information required for the quarterly report.

Table 21: Quarterly POA&M report

POA&M updated information | Programs | Systems
Total # of weaknesses identified at the start of the quarter. |   |  
# of weaknesses for which corrective action was completed on time, including testing (by the end of the quarter). |   |  
# of weaknesses for which corrective action is ongoing and on track for completion as originally scheduled. |   |  
# of weaknesses for which corrective action has been delayed, with a brief explanation for the delay. |   |  
# of new weaknesses discovered following the last POA&M update, with a brief description of how the security weakness(es) were identified (e.g. IG inspection, audit, self-assessment). |   |  

Finally, FISMA guidance for DOD and federal agencies directs that a completed status will only be assigned when a weakness has been fully mitigated and compliance has been validated. Therefore, it is imperative to incorporate follow-on testing of the affected security control(s) into the weakness mitigation process.
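The quarterly counts reported above can be derived from the POA&M entries themselves. In this minimal sketch, the field names and status values are assumptions for illustration, not a required POA&M data format.

```python
# Hypothetical POA&M entries; "status" values are illustrative.
entries = [
    {"id": "W-01", "status": "completed"},  # validated as fully mitigated
    {"id": "W-02", "status": "ongoing"},    # on track per original schedule
    {"id": "W-03", "status": "delayed"},    # requires a delay explanation
]

def quarterly_counts(poam):
    """Tally the counts the quarterly POA&M report asks for."""
    return {
        "total": len(poam),
        "completed": sum(e["status"] == "completed" for e in poam),
        "on_track": sum(e["status"] == "ongoing" for e in poam),
        "delayed": sum(e["status"] == "delayed" for e in poam),
    }

report = quarterly_counts(entries)
```

Note that, per the FISMA guidance above, an entry should only be marked "completed" once the weakness is fully mitigated and compliance has been validated by follow-on testing.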

A typical POA&M and instructions on its completion are provided in Chapter 10 and a template is available on the accompanying CD. NOTE: Before completing your POA&M, you should check the most current guidance from OMB. This can be found at http://www.whitehouse.gov/omb.

Summary

If properly used, the POA&M’s benefits can be significant and far-reaching. For federal agencies, the POA&M can provide a comprehensive reference that can be used to support ongoing efforts to address programmatic and system-specific vulnerabilities. The POA&M can be an essential management tool for the oversight and mitigation of security weaknesses.

To function as an effective management tool, the POA&M must be continually and diligently updated. Changes in the operating environment, levels of acceptable risk, and the availability of resources occur on a frequent basis. An effective and successful POA&M process will identify and respond to each one of these changes in a concise and complete fashion.

A mature POA&M program requires that the knowledge and efforts be sustainable over time and independent of any one person or personnel function. As a central repository, the POA&M eliminates the reliance on one resource and secures institutional knowledge. Although the POA&M initially represents a resource- and time-intensive effort, if properly maintained, it has the potential to be a valuable tool for management and will improve the overall IT security posture.

AUTHORIZE the operation of the information system

In order to initiate the final step of this phase, the system owner must submit a security authorization package to the authorizing official. The security authorization package documents the results of the security control testing and provides the authorizing official (AO) with the essential information needed to make a credible, risk-based decision on whether to authorize operation of the information system.

The security authorization package

Unless specifically designated otherwise, the system owner is usually responsible for the assembly, compilation, and submission of the security authorization package for the authorizing official. Of course, the system owner cannot do this alone. The system owner receives inputs from the ISSO, certifying agent, security controls assessor, and others during the preparation of the security authorization package.

Federal laws and regulations mandate that certain documentation must be included in the authorization package. The minimum documentation includes:

The system security plan (SSP)

The SSP, usually prepared by the system owner, provides an overview of the security requirements for the information system and describes the security controls in place or planned for meeting those requirements. System security plans can be developed in accordance with NIST SP 800-18.

The objective is to use the SSP as an evolving, but binding, agreement on the level of security required from initiation of the system development or as changes to a system are made. After authorization, the SSP becomes the baseline security document. The uses of the SSP during the initial authorization phases and during the post authorization phase are shown below.

Figure 15: SSP use during and post authorization

Although the term system security plan is commonly used for the basic security overview documentation, other agencies or environments might use other terminology. DOD currently requires a system identification profile (SIP) and a DIACAP implementation plan (DIP). You may also see a requirement for a system security authorization agreement (SSAA), although the use of this document in DOD became obsolete with the signing of the DODI 8510.01, DOD Information Assurance C&A Process (DIACAP). The SSP may also include supporting documentation:

A technical architecture document, which can be prepared as a supplement to the system security plan for information systems to provide more detail on the system description, its environment, and interconnectivity.

A hardware and software inventory, which may also be included as part of the configuration management process.

Assessment summary report

The assessment summary report summarizes the results of the security controls assessment and is usually prepared by the security control assessor. This summary provides an overview of the degree to which the controls have been implemented correctly, are operating as intended, and producing the desired outcome with respect to meeting the specified security requirements. The summary report provides the authorizing official with a synopsis of the findings, which are documented in detail in the security assessment report. The security assessment report should also contain recommended corrective actions for weaknesses or deficiencies identified in the security controls.

A plan of action and milestones (POA&M)

The POA&M identifies tasks that need to be accomplished to mitigate risks to an information system. The POA&M is initiated by the certifying agent for use by the system owner, and it details resources required to accomplish the elements of the plan, any milestones in meeting the tasks, and scheduled completion dates for the milestones.

The certification statement

The certification statement is prepared by the certifying agent to provide information to the designated approving authority to permit an informed decision regarding the secure operation of the system. The statement provides a summary of the results of certification testing; highlights certification activities; records the degree to which security controls are correctly implemented and effective; identifies the level of risk to system assets and to the agency’s operations and personnel; states the level of compliance with statutory and regulatory requirements; and documents the certification level of the system.
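Before submission, the system owner can verify that the package contains the minimum mandated documentation listed above. This completeness check is a hypothetical sketch; the document names come from the text, but the checking mechanism is an illustration.

```python
# The minimum documentation the text says must be in the package.
REQUIRED_DOCS = {
    "system security plan",
    "assessment summary report",
    "plan of action and milestones",
    "certification statement",
}

def missing_documents(package):
    """Return which mandated documents are absent from a submitted package."""
    return REQUIRED_DOCS - {doc.lower() for doc in package}

# A complete package yields an empty set of missing documents.
complete = ["System Security Plan", "Assessment Summary Report",
            "Plan of Action and Milestones", "Certification Statement"]
assert not missing_documents(complete)
```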

The security authorization may also contain additional documents or references, including key security-related documents for the information system such as the privacy impact assessment (PIA), incident response plan, configuration management (CM) plan, security configuration checklists, rules of behavior, and any system interconnection agreements. Additional documents that are part of the evidence required in the authorization process, but may not necessarily be provided to the authorizing official as part of the authorization decision process, include:77

The risk assessment prepared by the system owner and approved by the authorizing official, which documents risks to the information systems by identifying system assets, evaluating threats to these assets, and vulnerabilities to safeguards protecting system assets. The risk assessment evaluates the effectiveness and applicability of the minimum security baseline control set and recommends adjustments to minimum safeguards according to system-specific risks. The risk assessment will follow a standard methodology approved by NIST (see NIST SP 800-30).

The contingency plan documents management policy and procedures that are designed to maintain or restore business operations supported by the system, potentially at an alternate location, in the event of emergencies, system failures, or disaster. However, a contingency plan is not required when the availability of system resources is covered by a contingency plan for another system (i.e. general support system). Contingency plans will be developed in accordance with the standard methodology approved by NIST (see NIST SP 800-34).

The security test and evaluation (ST&E) plan and results report, prepared by the certifying agent and approved by the system owner and authorizing official, documents the plan for certifying the system and provides the results of the assessment of security controls in the system to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome. The ST&E plan is generally submitted and approved prior to the beginning of certification testing.

77 Samples of these and other documents relevant to the security authorization process are provided on the accompanying CD.

Importance of the certifying authority and the certification statement

The certifying authority78 (CA) may be an individual, group, or organization responsible for conducting a security certification. This certification is based on a comprehensive assessment by the security controls assessor of the management, operational, and technical security controls in an information system to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system.

The CA also provides recommended corrective actions to reduce or eliminate vulnerabilities in the information system. Prior to initiating the security assessment activities that are part of the certification process, the CA may also provide an independent assessment of the system security plan (SSP) to ensure the plan provides a set of security controls for the information system adequate to meet all applicable security requirements.

To preserve the impartial and unbiased nature of the security certification, the CA should be in a position independent from the persons directly responsible for the development and deployment of the information system or the day-to-day operation of the system.

The CA should also be independent of the organization responsible for correcting security deficiencies identified during the security controls testing process. The CA's independence is an important factor in assessing the credibility of the security assessment results and ensuring the authorizing official receives the most objective information possible in order to make an informed, risk-based accreditation decision. The CA should work closely with the risk executive (function) on assessing the relative risk to the information system posed by any non-compliant security controls.

78 Also called certifying authority, certification agent, and certifying agent.

The certification determination and the associated certification statement to the authorizing official represent the certifying authority's validation of the system's compliance with the security controls, the identification and assessment of the risks associated with operating the system, and the balance of those risks against the cost to correct or mitigate the security weaknesses.

The certification determination is based on the actual security control test results. The impact associated with a security control in a non-compliant status, the expected exposure time (i.e. the projected life of the system release or configuration minus the time to correct or mitigate the security control weakness), and the cost to correct or mitigate (e.g. dollars, functionality reductions) are considered.
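The expected exposure time in the parenthetical above reduces to simple arithmetic. The sketch below is illustrative only: the function name and the month values are assumptions, not drawn from any standard, and the formula simply follows the definition as stated in the text.

```python
def expected_exposure(projected_life_months: int, time_to_correct_months: int) -> int:
    """Expected exposure time as defined in the text: the projected life of
    the system release or configuration minus the time to correct or mitigate
    the weakness, floored at zero when the fix outlasts the release."""
    return max(projected_life_months - time_to_correct_months, 0)

# Illustrative numbers (not from the text): a 36-month release whose
# weakness takes 6 months to correct yields a 30-month exposure figure.
print(expected_exposure(36, 6))  # 30
```

In practice this figure is only one input; the impact of the non-compliant control and the cost to correct are weighed alongside it.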

Certification considers:

The security posture of the information system itself – the overall reliability and viability of the information system, plus the acceptability of the implementation and performance of the security control mechanisms or safeguards inherent in the system itself.

How the information system behaves in the larger information environment. In other words, does the information system introduce vulnerabilities into the environment, does the system correctly and securely interact with information environment management and control services, and is its visibility to situational awareness and information system defense services adequate?

The security authorization decision

The security authorization decision is made by the authorizing official and represents a balance of mission or business need, protection of personal privacy, protection of the information being processed, and protection of the information environment, which by extension protects other missions or business functions reliant upon the shared information environment.

The authorization package forms the basis for the authorizing official’s decision. The CA’s recommendation is often heavily weighted in the authorizing official’s considerations.

In addition to the CA’s recommendation and the package content, the authorizing official should consider many factors when deciding if the risk to agency operations, agency assets, or individuals of operating an information system is acceptable. Balancing security considerations with mission and operational needs is paramount to achieving an acceptable accreditation decision. The authorizing official renders an accreditation decision for the information system after reviewing all of the relevant information and, where appropriate, consulting with key agency officials.

The authorization statement documents the security authorization decision from the authorizing official. The authorization statement usually contains the following information:

security accreditation decision;

supporting rationale for the decision;

summarization of corrective actions required;

identification of residual risks;

limitations to operations; and

terms and conditions for the authorization.
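The contents listed above can be captured as a simple record. The sketch below is purely illustrative: the class and field names are the author's own shorthand for the listed items, not a mandated or standard format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AuthorizationStatement:
    """Illustrative record of the statement contents listed above."""
    decision: str                                              # e.g. "ATO", "IATO", or "DATO"
    rationale: str                                             # supporting rationale for the decision
    corrective_actions: List[str] = field(default_factory=list)
    residual_risks: List[str] = field(default_factory=list)
    operational_limitations: List[str] = field(default_factory=list)
    terms_and_conditions: List[str] = field(default_factory=list)

# Hypothetical example for illustration only.
stmt = AuthorizationStatement(
    decision="IATO",
    rationale="Significant weaknesses remain, but mission need requires operation.",
    terms_and_conditions=["Re-test the affected controls within 90 days"],
)
print(stmt.decision)  # IATO
```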

There are two very basic “authorization to operate” decisions: the authorization statement indicates to the information system owner whether the system is either authorized to operate or is not authorized to operate. The supporting rationale provides the information system owner with the justification for the authorizing official’s decision. The terms and conditions for the authorization provide a description of any limitations or restrictions placed on the operation of the information system that must be adhered to by the information system owner. The authorization statement is attached to the original package and returned to the information system owner.

Authorization to operate (ATO)

An ATO decision is issued by the authorizing official if, after a review of the authorization package, the risk of operating the information system is determined to be acceptable. An ATO usually means that the information system has no significant restrictions or limitations on its operation. It does not imply that no weaknesses were identified during the security controls assessment process. In fact, we have not seen an information system that had NO weaknesses.

An ATO does mean that, wherever it is cost effective to do so, the organization should take specific actions to reduce or eliminate identified weaknesses. An ATO generally lasts for three years before re-authorization is required – barring any significant security-related events79 that might require a re-authorization.

In summary, an ATO (full accreditation) can be granted under the following conditions:

The certification package is complete.

No corrective actions are required.

Residual risks are acceptable to the DAA.

Interim authorization to operate (IATO)

If the authorizing official determines that the weaknesses associated with operating an information system are not acceptable, but there is a mission requirement for the information system to be placed into operation or to continue operating, the authorizing official may choose to issue an IATO. An IATO is acceptable only when the identified security weaknesses resulting from the security controls testing are significant, but further analysis indicates that they can reasonably be addressed in a timely manner.

79 The definition of “significant security-related events” may differ across organizations. It is critical for your organization to define what this means for you. Examples include changes in location, changes in information sensitivity level, changes in the operating system, and/or a major security incident.

An interim authorization allows the information system to operate under specific terms and conditions. It acknowledges that the operation of the information system may pose increased risk to the organization for a specified period of time. The IATO will contain the authorizing official’s terms and conditions for operation of the information system.

An IATO accreditation decision is intended to manage system security weaknesses. It is not intended to be a device for signaling an evolutionary approach to information system security, nor should the IATO be treated as a routine preliminary step toward full authorization to operate. Information systems will continue to change throughout their lifetime, so an organization should not reserve the ATO for information systems in which no change is planned or foreseen. Such thinking can lead to abuse of the real purpose of the IATO and present an inaccurate portrayal of an information system’s security posture.

To summarize, an IATO may be employed in the following situations:

A new system is in an advanced test phase and may be deployed prior to final design and test of operational capability.

A survey has concluded that there are no apparent security problems that would allow unauthorized persons to access data in a system, but there has been insufficient time or resources for rigorous hardware and software testing.

The configuration of an operational system has been altered. Initial security evaluation by appropriate personnel does not reveal any severe problems, but a full evaluation has had scheduling delays.

A system that will be fielded at multiple sites has been evaluated in a test environment. Full accreditation will occur when it is finalized for deployment.

Denial of authorization to operate (DATO)

The authorizing official always has the option to specifically deny an information system the authorization to operate.80 A DATO is issued in those cases where the authorizing official considers the risk of operating the information system to be unacceptable. In this case, the information system should not be placed into operation. If it is currently operating, all activity should be halted.

A DATO usually indicates that there are significant security control weaknesses. The authorizing official or designated representative may set a suspense date for implementing corrective measures to address the security weaknesses in the information system and for reapplying for authorization to operate.
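The broad conditions behind the three decisions discussed above can be sketched as a decision function. This is a simplification for illustration only: the boolean inputs stand in for what is, in reality, the authorizing official's risk-based human judgment, and the function name is an assumption.

```python
def authorization_decision(package_complete: bool,
                           corrective_actions_required: bool,
                           residual_risk_acceptable: bool,
                           mission_need: bool,
                           weaknesses_fixable_in_time: bool) -> str:
    """Map the broad conditions named in the text to a decision.
    A real decision weighs many more factors; this only encodes
    the summary conditions for ATO, IATO, and DATO."""
    if package_complete and not corrective_actions_required and residual_risk_acceptable:
        return "ATO"    # full authorization to operate
    if mission_need and weaknesses_fixable_in_time:
        return "IATO"   # interim authorization under terms and conditions
    return "DATO"       # risk unacceptable: deny authorization to operate

print(authorization_decision(True, False, True, False, False))  # ATO
print(authorization_decision(True, True, True, True, True))     # IATO
```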

The DOD allows for an additional authorization category – the interim authority to test.

Interim authority to test (IATT)

The IATT authorization decision is a special case that allows an information system – potentially still in developmental status – to conduct essential tests in an operational information environment or with live data for a specified period of time. An IATT may not be used to avoid ATO or IATO validation activity and certification determination requirements for authorizing a system to operate.

80 The current NIST SP 800-37 and DODI 8510.01 specifically call out the DATO as an authorization option. The proposed revision of NIST SP 800-37 only discusses ATO and IATO, assuming that the absence of either is an indication of an implicit DATO.

Accreditation decision letter

The authorization decision letter transmits the authorizing official’s accreditation decision. The AO attaches the certification documentation to the authorization letter and transmits it to the system owner.

Upon receipt of the authorization statement and the associated package, the system owner accepts the terms and conditions of the authorization. The system owner maintains the original authorization statement and package on file. The authorizing official and the IAM/ISSM should also retain copies of the security authorization decision letter. The contents of the security certification and authorization-related documentation (especially information dealing with information system vulnerabilities) will be marked and protected appropriately in accordance with agency policy, and will be retained in accordance with the agency’s record retention policy.

Milestones from the verify, validate and authorize activities

Before proceeding on to the next phase, the operate and maintain activities, let’s take a final look at what you should have achieved in this phase:

You determined your security control assessment team.

The security control test plan was developed and executed.

The results of the security controls testing were collected and analyzed.

Security control weaknesses were identified.

A plan of action and milestones was developed.

The certifying authority provided a recommendation for authorization.

The SSP, a summary of the security controls tests, the POA&M, and the CA recommendation were provided as a package to the authorizing official.

The authorizing official made an authorization decision and provided this decision to the organization.

Further reading

Gregg, Michael and Kim, David. Inside Network Security Assessment: Guarding Your IT Infrastructure. Sams Publishing: November 2005.

Rogers, Russ et al. Network Security Evaluation: Using the NSA IEM. Syngress Publications: July 2005.

References

National Institute of Standards and Technology (NIST) Special Publication 800-18, Guideline for Developing Security Plans for Federal Information Systems, February 2006.

National Institute of Standards and Technology (NIST) Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems, March 2004.

National Institute of Standards and Technology (NIST) Special Publication 800-42, Guideline on Network Security Testing, October 2003.

National Institute of Standards and Technology (NIST) Special Publication 800-53, Revision 2, Recommended Security Controls for Federal Information Systems, December 2007.

National Institute of Standards and Technology (NIST) Special Publication 800-53A, Guide for Assessing the Security Controls in Federal Information Systems, July 2008.

National Institute of Standards and Technology (NIST) Special Publication 800-59, Guideline for Identifying an Information System as a National Security System, August 2003.

Office of Management and Budget (OMB) Circular A-11, Preparation, Submission and Execution of the Budget, (Revised June 26, 2008). Available at http://www.whitehouse.gov/omb/circulars/a11/current_year/a11_toc.html.

Office of Management and Budget (OMB), Memorandum (M)-02-01, Guidance for Preparing and Submitting Security Plans of Action and Milestones, October 17, 2001. Available at http://www.whitehouse.gov/omb/memoranda/m02-01.html.

Office of Management and Budget, Circular A-130, Appendix III, Transmittal Memorandum #4, Management of Federal Information Resources, November 2000.

Public Law 107-347 [H.R. 2458], The E-Government Act of 2002 Title III, of this Act is the Federal Information Security Management Act of 2002 (FISMA), December 17, 2002.
