
SCAP Benchmark for Cisco Router Security Configuration Compliance

The 10th International Conference for Internet Technology and Secured Transactions (ICITST-2015)

Chit Nyi Nyi Hlyne, Pavol Zavarsky, Sergey Butakov
Department of Information Systems Security and Assurance Management
Concordia University of Edmonton, Edmonton T5B 4E4, Alberta, Canada
[email protected], {pavol.zavarsky, sergey.butakov}@concordia.ab.ca

Abstract—Information security management is time-consuming and error-prone. Apart from day-to-day operations, organizations need to comply with industrial regulations and government directives. Thus, organizations are looking for security tools to automate security management tasks and daily operations. The Security Content Automation Protocol (SCAP) is a suite of specifications that help to automate security management tasks such as vulnerability measurement and policy compliance evaluation. A SCAP benchmark provides detailed guidance on setting the security configuration of network devices, operating systems, and applications, and organizations can use it to perform automated configuration compliance assessment on those systems. This paper discusses the SCAP benchmark components and the development of a SCAP benchmark for automating Cisco router security configuration compliance.

Keywords—SCAP benchmark; XCCDF; OVAL; security compliance; security automation

I. INTRODUCTION

Managing and maintaining information infrastructure is a challenging task for organizations. Business applications and operating systems are becoming more sophisticated and demand more computing power. As a result, organizations are expanding their infrastructure with physical hardware as well as virtualization. The underlying network infrastructure therefore becomes more critical and must handle a growing volume of network traffic. Growing requirements for information confidentiality, availability, and integrity drive the need for improved management of information security on enterprise networks. Beyond this technical complexity, industry regulations, government directives, and standardization bodies bring additional regulatory requirements, and meeting them leads organizations to invest in security tools and human resources. Deviation from recommended security baseline configuration settings increases the chance of a network device being compromised. Hence, securing the many variants of software and hardware components, and their configurations, is a demanding task for an IT security department, especially in changing and evolving large-scale networks. The manual check of configuration on network devices involves an enormous amount of human effort. Additionally, reviewing configurations manually on network devices is error-prone and often inconsistent. Information security management is labor-intensive, taxing, and thus expensive.

Organizations are looking for automation tools to streamline security tasks. Different security applications use different naming conventions and different reporting formats. Therefore, IT security professionals struggle to understand and interpret reports from various tools while searching for a standardized tool for managing security operations. The Security Content Automation Protocol (SCAP) can help to automate security management tasks and represent reports in a standardized way. “SCAP benchmark is a checklist that provide detailed low-level guidance on setting the security configuration of operating systems, applications, and network devices” [1]. In this paper, the terms SCAP benchmark and SCAP checklist are used interchangeably.

The benchmark developed in this research follows the recommendations of the Security Technical Implementation Guide (STIG) [2], the National Security Agency (NSA) router security configuration guide [3], and the Cisco IOS benchmark from the Center for Internet Security (CIS) [4]. These benchmarks are manual guides for organizations; no automated SCAP benchmark was publicly available at the time of writing this paper. The motivation for this research is to automate these manual security guides using SCAP 1.2. The SCAP benchmark developed in this research enables organizations to perform automated security configuration assessments on Cisco routers: a SCAP-validated tool compares a router's configuration with the benchmark. Automated assessment of security configuration on Cisco routers shortens the time to diagnose deviations from the desired security configuration and improves consistency.
The security administrator can advise remediation of non-compliant configurations on routers based on the assessment report. The examples illustrated in the paper are excerpts from this SCAP benchmark development. A SCAP benchmark consists of (1) data streams, (2) eXtensible Configuration Checklist Description Format (XCCDF) [5], (3) Open Vulnerability and Assessment Language (OVAL) [6], and (4) Common Platform Enumeration (CPE) [7] components.

The remaining sections of the paper are organized as follows. Section II discusses related work. Section III presents the methodology, tools used, and testing. Section IV provides background on the SCAP specifications, SCAP benchmark components, and the benchmark development. Section V presents the conclusion and future work.

II. RELATED WORK

In [11], challenges in maintaining IT security in day-to-day operations are addressed. It is shown that MITRE's Making Security Measurable effort identifies an object of interest uniquely and provides a method to define the desired configuration of target systems. MITRE also uses XCCDF, OVAL, and the Open Checklist Interactive Language (OCIL) to develop automated security checklists, vulnerability checks, and security configuration guidelines. NIST, in conjunction with MITRE, initiated SCAP, which constitutes seven specifications: (i) XCCDF, (ii) OVAL, (iii) OCIL, (iv) CPE, (v) CCE, (vi) Common Vulnerabilities and Exposures (CVE), and (vii) Common Vulnerability Scoring System (CVSS).

The authors in [8] used a SCAP-based tool to read security checklist results from computers or devices and translate them into logical objects. The logical objects can be combined to create a global model that represents the entire network, and can further be composed to perform a global analysis using more advanced security analytic tools such as ConfigChecker.

In [9], the authors performed an analysis of automation possibilities in information security management. They considered the potential of using (i) security ontologies in risk management, (ii) hardware and software systems for the automatic operation of individual security controls, and (iii) SCAP for automatically checking compliance and security configurations. The analysis was performed to assist security managers in identifying systems where greater efficiency in the information security management process can be achieved. The authors of [9] also studied the potential for automating security checks of software and hardware. However, no single tool supports the entire range of potentially automatable controls, so a combination of different tools needs to be used to achieve security automation within an organization. Therefore, interoperability is essential to support communication between the various tools. According to [9], the assessment of about 30% of security controls can be automated. Technical compliance checking is one of the areas of security assurance that can be automated; security assessments that involve human intervention and decision-making cannot be fully automated.

Cisco, the world's largest manufacturer of enterprise network devices, introduced OVAL content for Cisco devices and explained in [10] how OVAL helps with security automation. OVAL content for Cisco Internetwork Operating System (IOS) can be downloaded from the Cisco Security Advisories website. Moreover, Cisco provides guidance on how OVAL content for Cisco IOS can be used to check Cisco devices for vulnerabilities, and demonstrated vulnerability scanning and report generation with a SCAP-validated tool, jOval.

Developing a robust and comprehensive security checklist requires the highest level of expertise in the particular technology. To avoid potential inconsistencies in SCAP specifications, the authors in [12] suggested an ontology-based approach as a means of providing a uniform vocabulary for specifying SCAP data and its relationships. Table I summarizes the problems in OVAL, CPE, and Common Configuration Enumeration (CCE) artifacts.

TABLE I. SUMMARY OF INCONSISTENCY PROBLEMS IN SCAP SPECIFICATIONS

Implicit Rationale: The implicit rationale may lead to inconsistencies in the artifact definition. An OVAL definition explicitly describes how a particular vulnerability may be executed, but does not explicitly define why the vulnerability exists.

Implicit Intra-definition Relationships: This problem occurs when the relationships between SCAP definitions are implicit within the same specification. For example, two related vulnerabilities might be defined in two definitions within the CPE specification.

Implicit Inter-definition Relationships: The different kinds of SCAP artifacts tend to operate in isolation instead of using the CPE asset identifier explicitly. This problem occurs when OVAL, CVE, and CCE definitions implicitly encode the semantics of asset dependencies in their definitions. For instance, the vulnerability of a router is identified in OVAL, but there is no definition for that router in the CPE definition.

Unclear Relationship Reuse: The reuse of relationships across different SCAP definitions is not clear. For example, one OVAL definition identifies Red Hat Linux iptables as vulnerable, and another OVAL definition reuses that definition to identify that a VMware ESX server has an “ipv4iptable” vulnerability. In fact, the Linux kernel has been used to build both Red Hat and VMware ESX.

Unclear Definition Hierarchy: A SCAP definition is typically developed in isolation and focuses on a single asset, threat, vulnerability, or countermeasure, so it becomes difficult to inter-relate implicit information across multiple SCAP definitions.

Definition Name Ambiguity: An additional name can be added to SCAP identifiers even if one already exists, so ambiguities can arise in SCAP names/identifiers. For example, there are three entries for the Cisco ubr7200 router: ubr7200, Cisco_ubr7200, and Cable_Router_ubr7200.

Although several studies have been carried out on SCAP and the possibility of automating security controls, additional research is required. Thus, this study concentrates on SCAP benchmark development for automated Cisco router security configuration assessment.


III. METHODOLOGY

This section outlines the methodology, checklist tiers, and tools used in the benchmark development.

A. Methodology

NIST's National Checklist Program (NCP) [13] is used as a reference for the benchmark development. NIST defines four checklist tiers. Tier I and Tier II checklists are prose and are not automated. A Tier III checklist can be automated using SCAP-validated tools. For a Tier IV checklist, the developer needs to submit the checklist to NIST for screening; afterward, NIST seeks public review and feedback and then publishes the checklist. The benchmark/checklist we developed falls under Tier III, as the checklist can be automated. Table II summarizes the requirements for the checklist tiers.

TABLE II. CHECKLIST TIER REQUIREMENT SUMMARY [13]

Tier I
  Machine readable: No
  Automated format: N/A
  Reference to security compliance framework: Optional

Tier II
  Machine readable: Yes
  Automated format: Non-standard (proprietary, product-specific, etc.)
  Reference to security compliance framework: Optional

Tier III
  Machine readable: Yes
  Automated format: Complete SCAP-expressed checklist that can be processed by SCAP-validated tools and runs cleanly using the SCAP content validation tool.
  Reference to security compliance framework: Optional

Tier IV
  Machine readable: Yes
  Automated format: Complete SCAP-expressed checklist that can be executed by SCAP-validated tools; has been validated by NIST or a NIST-accredited laboratory; and maps low-level security settings to high-level security requirements.
  Reference to security compliance framework: Required; must be vetted with at least one governance organization authoritative for the security compliance framework. Must demonstrate mapping capability from low-level enumerations (CCE) to high-level categorization (e.g., SP 800-53 controls).

Fig. 1 shows the steps involved in the benchmark development. The targeted environment for the benchmark is a managed environment [13] of medium to large organizations. The environment typically contains large organizational systems with defined suites of hardware and software configurations, usually consisting of centrally managed IT products.

Figure 1. Adopted NCP Checklist Development Steps


NSA's router security configuration guide [3], DISA's STIGs [2], and CIS's Cisco IOS benchmark [4] were consulted as security baselines. The authors also reviewed Cisco IOS OVAL content. Eighteen rules were developed in this benchmark to verify Cisco router security configuration; these rules are justified as essential for router security.

B. Tools and testing

jOval is deployed as the SCAP-validated tool, and the XML content was developed with Notepad++. The benchmark was tested on Cisco IOS 12.4 and 15.2 routers in the testing lab. The routers have a basic configuration with some security settings implemented. Verifying router configuration manually for security compliance is a tedious job and prone to human error. It might take about ten minutes to log in to a router and verify its configuration manually, and verification takes longer if a router has a complex configuration. If an organization has many routers, this compliance check becomes a challenge for administrators and security auditors. jOval can verify the configuration on routers against the SCAP benchmark in an automated way; it takes less than a minute to check the configuration of three routers. Fig. 2 displays the jOval report of an assessment that checks 18 rules against the configurations of three routers. The flat unweighted scoring model is used in the assessment, and the PASS threshold is set to 12: at least 12 rules must pass, and the report shows which devices passed or failed after the assessment finishes.

One of the routers failed the assessment, as shown in Fig. 2. Organizations can set the PASS threshold in accordance with their security policy.

Figure 2. Snippet of jOval report

IV. SCAP BENCHMARK

This section introduces the SCAP specifications and details the SCAP benchmark components.

A. SCAP Specifications

“The Security Content Automation Protocol (SCAP) is a suite of specifications that standardize the format and nomenclature by which software flaw and security configuration information are communicated, both to machines and humans. SCAP is a multi-purpose framework of specifications that support automated configuration, vulnerability and patch checking, technical control compliance activities, and security measurement. The SCAP content is used to compare system characteristics and settings against a SCAP-conformant checklist in an automated fashion” [1]. The SCAP content may include the XCCDF benchmark element that expresses the checklist for compliance checking. Each XCCDF benchmark rule references an OVAL definition; the OVAL component holds the definitions of the compliance checks used by the checklist.

Each data stream is a collection of links to the components it references; each logical link encapsulates the information required to allow the content consumer to connect the components together within the data stream [1]. Fig. 3 illustrates how data streams point to different components.

B. SCAP benchmark components

(1) SCAP data stream and data stream collections

A SCAP data stream collection comprises SCAP data streams and SCAP source components, each consisting of data expressed using one or more of the SCAP specifications. The data stream section contains one or more source data streams, each of which references the source components in the Components section that compose the data stream [1].

Figure 3. Data stream collections

Fig. 4 illustrates a data stream that links to a CPE dictionary, an XCCDF component, and an OVAL component. The XCCDF component links to the OVAL component, and the CPE component links to the same OVAL component.

Figure 4. Sample of a data stream
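To make this structure concrete, the following is a minimal, hypothetical source data stream skeleton following the SP 800-126 conventions; all ids, hrefs, and timestamps are placeholders rather than the identifiers used in the actual benchmark:

```xml
<!-- Hypothetical skeleton of a SCAP 1.2 source data stream; ids, hrefs,
     and timestamps are placeholders -->
<ds:data-stream-collection
    xmlns:ds="http://scap.nist.gov/schema/scap/source/1.2"
    xmlns:xlink="http://www.w3.org/1999/xlink"
    id="scap_com.example_collection_cisco-router"
    schematron-version="1.2">
  <ds:data-stream id="scap_com.example_datastream_cisco-router"
      scap-version="1.2" use-case="CONFIGURATION">
    <!-- each component-ref links a section of the data stream
         to one of the components declared below -->
    <ds:dictionaries>
      <ds:component-ref id="scap_com.example_cref_cpe"
          xlink:href="#scap_com.example_comp_cpe-dict"/>
    </ds:dictionaries>
    <ds:checklists>
      <ds:component-ref id="scap_com.example_cref_xccdf"
          xlink:href="#scap_com.example_comp_xccdf"/>
    </ds:checklists>
    <ds:checks>
      <ds:component-ref id="scap_com.example_cref_oval"
          xlink:href="#scap_com.example_comp_oval"/>
    </ds:checks>
  </ds:data-stream>
  <!-- the referenced source components -->
  <ds:component id="scap_com.example_comp_xccdf"
      timestamp="2015-06-01T00:00:00"><!-- XCCDF Benchmark --></ds:component>
  <ds:component id="scap_com.example_comp_oval"
      timestamp="2015-06-01T00:00:00"><!-- OVAL definitions --></ds:component>
  <ds:component id="scap_com.example_comp_cpe-dict"
      timestamp="2015-06-01T00:00:00"><!-- CPE dictionary --></ds:component>
</ds:data-stream-collection>
```

As in Fig. 4, the XCCDF and CPE components both resolve their checks to the same OVAL component.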



TABLE III. XCCDF ELEMENT IDENTIFIER FORMAT CONVENTIONS [5]

Benchmark: xccdf_namespace_benchmark_name
Profile: xccdf_namespace_profile_name
Group: xccdf_namespace_group_name
Rule: xccdf_namespace_rule_name
Value: xccdf_namespace_value_name
TestResult: xccdf_namespace_testresult_name
Tailoring: xccdf_namespace_tailoring_name



(2) XCCDF Component

XCCDF is one of the SCAP specifications for writing security benchmarks/checklists. The checklist description format is intended to replace traditional security guidance documents. XCCDF enables security professionals and system auditors to create standardized security checklists, which in turn helps to improve system security by ensuring more consistent and accurate security practices. XCCDF permits greater automation of security testing and configuration assessment, and makes it possible to augment the textual representation of configuration advice with underlying configuration checking engines. The XCCDF data model consists of Benchmark, Item (Group, Rule, Value), Profile, Test Result, and Tailoring elements. Table III depicts the XCCDF element identifier format conventions. The namespace is a reverse domain name of the author's organization (for example, com.abc).

(a) XCCDF Benchmark element

An XCCDF benchmark is a container that holds the checklist's profiles, groups, rules, and values. The “platform” attribute of a “Rule” indicates the platforms to which the “Rule” applies. CPE identifiers can be used at all levels (benchmark, group, or rule) to control applicability to certain system platforms. The platform to be checked, as shown in Fig. 5, is the Cisco IOS version 15 family.

Figure 5. Sample of XCCDF Benchmark element
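A minimal sketch of such a Benchmark element, assuming a hypothetical com.example namespace and the Cisco IOS 15 CPE name, might look like this:

```xml
<!-- Hypothetical XCCDF 1.2 Benchmark skeleton; id and content are placeholders -->
<xccdf:Benchmark xmlns:xccdf="http://checklists.nist.gov/xccdf/1.2"
    id="xccdf_com.example_benchmark_cisco-router" xml:lang="en">
  <xccdf:status date="2015-06-01">draft</xccdf:status>
  <xccdf:title>Cisco Router Security Configuration Benchmark</xccdf:title>
  <xccdf:description>Checks router settings against a security baseline.</xccdf:description>
  <!-- applies the whole benchmark to the Cisco IOS 15 family -->
  <xccdf:platform idref="cpe:/o:cisco:ios:15"/>
  <xccdf:version>1.0</xccdf:version>
  <!-- Profile, Group, and Rule elements follow here -->
</xccdf:Benchmark>
```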

(b) XCCDF Profile element

A profile contains information about how groups, rules, and values should be used. A profile can be created for different security requirements, such as a high-impact or low-impact system. The select element is a child of Profile; it determines which Group and Rule elements are selected for processing when the Profile is in effect. In Fig. 6, the group “xccdf_group_1” is selected for processing.

Figure 6. Sample of XCCDF Profile element
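For illustration, a Profile that selects a group could be sketched as follows (the id prefix is a placeholder namespace, not the paper's actual identifier):

```xml
<!-- Hypothetical Profile; the select element enables a group for processing -->
<xccdf:Profile id="xccdf_com.example_profile_baseline">
  <xccdf:title>Baseline router security profile</xccdf:title>
  <xccdf:select idref="xccdf_com.example_group_1" selected="true"/>
</xccdf:Profile>
```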

(c) XCCDF Group element

An XCCDF group contains title and description elements together with the rules it organizes; a Group may also contain another group. The “id” attribute identifies the Group. Fig. 7 shows a sample of an XCCDF group element.

Figure 7. Sample of XCCDF Group element
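A sketch of a Group element consistent with this description (the id and text are placeholders):

```xml
<!-- Hypothetical Group; may nest further Group or Rule elements -->
<xccdf:Group id="xccdf_com.example_group_1">
  <xccdf:title>Device access security</xccdf:title>
  <xccdf:description>Rules that verify access-related router settings.</xccdf:description>
  <!-- Rule elements (or nested Groups) go here -->
</xccdf:Group>
```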

(i) XCCDF Rule element

An essential unit of an XCCDF checklist is a “Rule”, which describes a certain check as part of a benchmark. A “Rule” can refer to an OVAL definition for a technical check. The “Rule” includes an “id” attribute, a severity, and a weight; the severity defines the severity level, and the weight is the relative scoring value of the “Rule”.

Figure 8. Sample of XCCDF Rule element

(ii) XCCDF Check element

The “Check” element contains a “system” attribute, a Uniform Resource Identifier (URI) that tells the compliance checking tool to use OVAL to interpret or execute the “Check”, as shown in Fig. 9. The “href” attribute refers to the OVAL data location “Config_oval” within the benchmark document.

Figure 9. Sample of XCCDF Check element
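Putting the Rule and Check elements together, a hedged sketch might look like this; the rule id and OVAL definition name are placeholders, while the href “Config_oval” follows the text above:

```xml
<!-- Hypothetical Rule with its Check; the system URI selects the OVAL checking engine -->
<xccdf:Rule id="xccdf_com.example_rule_enable-secret" selected="true"
    severity="high" weight="1.0">
  <xccdf:title>An enable secret password must be configured</xccdf:title>
  <xccdf:check system="http://oval.mitre.org/XMLSchema/oval-definitions-5">
    <!-- href points at the OVAL component "Config_oval" within the benchmark document -->
    <xccdf:check-content-ref href="Config_oval" name="oval:com.example:def:1"/>
  </xccdf:check>
</xccdf:Rule>
```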

(3) OVAL Component

OVAL is an XML-based language, and OVAL definitions are machine readable. The definition schema describes the data expressing a particular machine state, while the results schema is used for reporting the results of an assessment; after the definitions are evaluated, the results are presented. Fig. 10 shows the Cisco IOS OVAL definition schema and the OVAL schema.

Figure 10. Sample of OVAL Component

(i) OVAL Definition element

OVAL includes definition, test, object, and state elements. The crucial element of OVAL is a definition, and each definition must have an “id” associated with it. An OVAL definition contains criteria that check whether a system is in a particular state. The definition has metadata that provides information about what is being checked and how the definition should be used; the essential elements of the metadata are a title and a description. The criteria element outlines what is being tested. Criteria contain individual criterion statements, each referencing a single test, and join them together with AND or OR operators. If the operator is AND, every test referenced by the criteria must be true for the criteria to evaluate to true. If the operator is OR, at least one referenced test must be true for the criteria to return true. The actual test is referenced and identified by the “test_ref” attribute. In Fig. 11, the test that will be performed is “oval:cue.oval:tst:1”.

Figure 11. Sample of OVAL Definition element
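A definition of this shape, with a single negated criterion, can be sketched as follows (the ids are placeholders, not the paper's actual identifiers):

```xml
<!-- Hypothetical compliance definition; the criterion references one test -->
<definition xmlns="http://oval.mitre.org/XMLSchema/oval-definitions-5"
    id="oval:com.example:def:1" version="1" class="compliance">
  <metadata>
    <title>Enable secret is configured</title>
    <description>Checks the running configuration for an enable secret line.</description>
  </metadata>
  <criteria operator="AND">
    <!-- negate="true" inverts the referenced test's result during analysis -->
    <criterion test_ref="oval:com.example:tst:1" negate="true"
        comment="No insecure configuration line is present"/>
  </criteria>
</definition>
```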

The optional “negate” attribute implies that the result of the criteria as a whole should be negated during analysis. In the example above, the result of the test is negated; thus, the result of the criteria will be “TRUE” if the outcome of the test is “FALSE”. In compliance checking with a benchmark, if the OVAL test result is “TRUE”, the XCCDF result will be “PASS”.

(ii) OVAL Test element




A “Test” is an OVAL element used to check the value of specified attributes related to a given object. The “Test” combines a reference to a particular object and a reference to the state to check, and is identified by an ID. The “check” attribute determines the relationship between the “Object” and the “State”. In Fig. 12, the “check” attribute is set to “at least one”, which means at least one of the items collected for the OVAL object element must satisfy the values defined in the “State”.

Figure 12. Sample of OVAL Test element
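Using the OVAL ios component schema, such a test could be sketched as follows (placeholder ids):

```xml
<!-- Hypothetical line_test from the OVAL ios schema; check="at least one" means
     at least one collected item must match the referenced state -->
<line_test xmlns="http://oval.mitre.org/XMLSchema/oval-definitions-5#ios"
    id="oval:com.example:tst:1" version="1"
    check="at least one" comment="Check a running-configuration line">
  <object object_ref="oval:com.example:obj:1"/>
  <state state_ref="oval:com.example:ste:1"/>
</line_test>
```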

(iii) OVAL Object element

An OVAL test checks one or more objects. The “Object” elements represent the actual 'things' on the system being evaluated. In Fig. 13, the running configuration on a Cisco router will be checked against the declaration defined in the “State” element.

Figure 13. Sample of OVAL Object element
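A sketch of the corresponding line_object, which collects the output of a show subcommand (the id is a placeholder):

```xml
<!-- Hypothetical line_object; collects the output of "show running-config" -->
<line_object xmlns="http://oval.mitre.org/XMLSchema/oval-definitions-5#ios"
    id="oval:com.example:obj:1" version="1">
  <show_subcommand>show running-config</show_subcommand>
</line_object>
```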

(iv) OVAL State element

After defining the OVAL object, the next step is to express the “State” of that object for the test to be considered TRUE. In Fig. 14, a regular expression is used to check for a certain pattern in a router's running configuration. If any line of the router's configuration satisfies the regular expression defined in “config_line”, the test result becomes “TRUE”; otherwise it becomes “FALSE”.
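Assuming the check looks for an enable secret line (the id and pattern are illustrative placeholders, not the paper's actual rule), the state could be sketched as:

```xml
<!-- Hypothetical line_state; operation="pattern match" applies the regex
     to lines of the running configuration -->
<line_state xmlns="http://oval.mitre.org/XMLSchema/oval-definitions-5#ios"
    id="oval:com.example:ste:1" version="1">
  <show_subcommand>show running-config</show_subcommand>
  <config_line datatype="string" operation="pattern match">^enable secret .+$</config_line>
</line_state>
```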

Figure 14. Sample of OVAL State element

(4) CPE Component

CPE is a critical component for verifying which operating system is running on a device. Before the OVAL test is processed, the version of Cisco IOS on the router is checked. If the version is not defined in the CPE component, the OVAL test will not run. For example, if the Cisco IOS version 15 family is defined in CPE and the SCAP benchmark runs against routers with the Cisco IOS version 12 family installed, the jOval report shows a “not applicable” message and the OVAL test is not carried out. In Fig. 15, a test is performed on a router to check whether Cisco IOS 15.x is installed.

Figure 15. Sample of CPE Component
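A hedged sketch of a CPE dictionary entry that ties the IOS 15 platform name to an OVAL inventory definition (the definition id is a placeholder):

```xml
<!-- Hypothetical CPE dictionary item; the check element names the OVAL inventory
     definition that verifies Cisco IOS 15.x is actually installed -->
<cpe-list xmlns="http://cpe.mitre.org/dictionary/2.0">
  <cpe-item name="cpe:/o:cisco:ios:15">
    <title xml:lang="en-US">Cisco IOS 15</title>
    <check system="http://oval.mitre.org/XMLSchema/oval-definitions-5"
        href="Config_oval">oval:com.example.cpe:def:1</check>
  </cpe-item>
</cpe-list>
```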

V. CONCLUSIONS AND FUTURE WORK

The discussion and results outlined in this paper can help organizations automate security controls, understand the benefits of automated security controls, and develop SCAP benchmarks. The benchmark described in this paper enables organizations to automate Cisco router security compliance assessment and helps organizations save a significant amount of time on compliance assessment. Using an automated SCAP benchmark also reduces the likelihood of human error compared to a manual check. Moreover, the benchmark can be used as an auditing tool by IT auditors to check Cisco router security configuration. Additionally, a comprehensive vulnerability and compliance assessment checklist for Cisco routers can be developed by integrating this benchmark with the Cisco OVAL vulnerability checklist. In fact, the benefit of an automated SCAP benchmark is that, once developed, it can immediately be used to assess a large number of devices. One can add tests to or remove tests from the SCAP benchmark based on the organization's security policy, and the benchmark could be improved further by adding remediation commands to fix non-compliant configurations. The examples in this paper outline a skeleton of the SCAP benchmark components; the full version of the benchmark is available upon request.


REFERENCES

[1] "NIST SP 800-126 Revision 2: The Technical Specification for the Security Content Automation Protocol (SCAP)," 2011. [Online]. Available: http://csrc.nist.gov/publications/nistpubs/800-126-rev2/SP800-126r2.pdf. [Accessed 12 Jan 2015].

[2] "Security Technical Implementation Guides (STIGs)." [Online]. Available: http://iase.disa.mil/stigs/Pages/index.aspx. [Accessed 16 Jan 2015].

[3] National Security Agency, "Router Security Configuration Guide." [Online]. Available: https://www.nsa.gov/ia/_files/routers/C4-040R-02.pdf. [Accessed 23 Feb 2015].

[4] The Center for Internet Security, "Cisco IOS Security Configuration Benchmark," 2012. [Online]. Available: https://benchmarks.cisecurity.org/tools2/cisco/CIS_Cisco_IOS_Benchmark_v3.0.1.pdf. [Accessed 7 Feb 2015].

[5] NIST, "Specification for the Extensible Configuration Checklist Description Format (XCCDF) Version 1.2," Sept 2011. [Online]. Available: http://csrc.nist.gov/publications/nistir/ir7275-rev4/NISTIR-7275r4.pdf.

[6] MITRE, "Open Vulnerability and Assessment Language (OVAL)." [Online]. Available: https://oval.mitre.org/index.html.

[7] NIST, "Common Platform Enumeration (CPE)." [Online]. Available: http://scap.nist.gov/specifications/cpe/.

[8] M. N. Alsaleh and E. Al-Shaer, "SCAP based configuration analytics for comprehensive compliance checking," in Configuration Analytics and Automation (SAFECONFIG), 2011 4th Symposium on, pp. 1-8, Arlington, VA, Oct. 31-Nov. 1, 2011.

[9] R. Montesino and S. Fenz, "Automation Possibilities in Information Security Management," Sept 2011. [Online]. Available: https://www.sba-research.org/wp-content/uploads/publications/PID1947709.pdf.

[10] Cisco, "Security Automation Using OVAL," 2014. [Online]. Available: http://www.cisco.com/web/about/security/intelligence/oval_scty_automation.html. [Accessed 18 Jan 2015].

[11] G. Koschorreck, "Automated Audit of Compliance and Security Controls," in IT Security Incident Management and IT Forensics (IMF), 2011 Sixth International Conference on, pp. 137-148, Stuttgart, 10-12 May 2011.

[12] W. M. Fitzgerald and S. N. Foley, "Avoiding Inconsistencies in the Security Content Automation Protocol," 2013. [Online]. Available: http://www.cs.ucc.ie/~simon/pubs/safeconfig2013.pdf.

[13] "NIST SP 800-70 Revision 2: Guidelines for Checklist Users and Developers," 2011. [Online]. Available: http://csrc.nist.gov/publications/nistpubs/800-70-rev2/SP800-70-rev2.pdf. [Accessed 12 March 2015].

978-1-908320-52/0/$31.00 ©2015 IEEE
