The 10th International Conference for Internet Technology and Secured Transactions (ICITST-2015)

Automated Security Configuration Checklist for a Cisco IPsec VPN Router using SCAP 1.2

Gabriel Biedima Peterside, Pavol Zavarsky, Sergey Butakov
Information Systems Security Management, Concordia University of Edmonton, Edmonton, Alberta, Canada
[email protected], {pavol.zavarsky,sergey.butakov}@concordia.ab.ca

Abstract—For large enterprises running many different operating systems, applications, and multi-vendor devices, the task of reviewing the security state of a broad range of devices and business areas in order to either comply with security requirements from regulations or detect risks such as misconfigured devices, out-of-date software, etc., is time-consuming, error-prone, and expensive. Although humans are important in the security assessment process, they are unable to keep up with the task and may introduce inconsistencies that could further make organizations vulnerable to security breaches. Security automation provides a solution to these challenges. In this paper, a common security automation protocol, the Security Content Automation Protocol (SCAP) version 1.2, was leveraged to develop an automated secure configuration checklist which can be used by security professionals to rapidly and consistently audit network edge devices, such as a Cisco IPsec VPN router, to ensure secure configuration per the baseline.

Keywords—SCAP; Security Automation; IPsec VPN

I. INTRODUCTION

Managing the security of information systems, especially for large enterprises with many different systems from multiple vendors, different applications, and different flavors of operating systems with different mechanisms for secure configuration management and patching, is challenging for organizations and security professionals. The emergence of virtualization technology, for example, further increases the problem, as security professionals must now ensure the security of virtual machines, the guest operating systems running on them, and guest applications, in addition to the physical device. The process of reviewing security posture in such large enterprises is time-consuming, error-prone, taxing, and often results in inconsistencies, and consequently new risks, since humans have to carry out such tasks manually. Coupled with the fact that attackers may exploit unpatched vulnerabilities at any time, time is also of the essence in ensuring a good security posture. These challenges may be solved by automating the process of verifying the security state of information systems based on defined security policies or baselines – a process known as Security Automation. Security Automation is the use of standardized specifications and protocols to perform common security functions such as patch management, inventory management, vulnerability assessment, compliance checking, etc. [1]. It is a set of technologies and processes designed to automatically handle routine tasks, detect and remediate vulnerabilities, expedite response to known threats, deliver essential information when needed, and allow information security professionals to focus on hard problems [2].

978-1-908320-52/0/$31.00 ©2015 IEEE

One common security automation protocol used by U.S. federal agencies, and also supported by major players such as Microsoft and Cisco Systems, is the Security Content Automation Protocol (SCAP). SCAP, developed by NIST, is a suite of specifications that standardize the format and nomenclature by which software flaw and security configuration information is communicated, both to machines and to humans [3]. This work was motivated by the fact that the checklists/Security Technical Implementation Guides (STIGs) for network edge devices currently available to the public via the NIST National Checklist Program, the NSA's Security Configuration Guides repository, the Center for Internet Security, and the Defense Information Systems Agency (DISA) are not automated, making it burdensome for large organizations to quickly determine their security posture in order to identify risks such as insecurely configured devices. Also, under the Federal Information Security Management Act (FISMA) of 2002, Title III of the E-Government Act (Public Law 107-347), Federal organizations must report annually to Congress and to the Office of Management and Budget (OMB) on the adequacy and effectiveness of their information security policies, procedures, and practices [4]. The SCAP 1.2 specifications Extensible Configuration Checklist Description Format (XCCDF) and Open Vulnerability and Assessment Language (OVAL) were used in the automated checklist developed for verifying the configuration of a Cisco router configured for IPsec VPN against the baseline (i.e., configuration compliance checking). This enabled rapid security assessment using SCAP-Validated products, and demonstration of compliance as and when required.
The automated checklist was developed following an approach similar to traditional software development, in which development proceeds through the following phases:

- Requirements analysis
- Design
- Implementation
- Testing


The result of this automated checklist development enables organizations to determine whether their network edge routers configured for IPsec VPN meet the security baseline, to quickly demonstrate compliance with regulations, and to bring non-compliant devices into compliance using the recommendations provided with the assessment result. The main contribution of this paper is a SCAP 1.2 automated checklist for checking the compliance of a Cisco IPsec VPN router with the baseline. In addition, test and validation results are discussed briefly. The remainder of the paper is organized as follows. Section II reviews related work; Section III discusses the methodology used; Section IV presents the architecture of the SCAP content developed; and Section V discusses test results.

II. RELATED WORK

Existing work related to automated security audit has focused on security automation approaches [5], automation possibilities in information security management [6] [7], how SCAP enables organizations to verify the secure state of their systems [4], and, recently, an automated secure configuration benchmark for a Cisco router using SCAP. In [5], two initiatives are provided for solving the challenges faced by large organizations and security professionals, with emphasis on SCAP version 1.1 standards such as OVAL, OCIL, and XCCDF, and on how they can be used to bridge the gap between the demands of security and the complexity of the IT infrastructure. Although useful code snippets using the OVAL specification are provided, they are tailored toward operating systems such as Windows and are based on an older version of SCAP, version 1.1. On the other hand, [6] shows that although security automation eases the burden of security assessment, automating every aspect of it is not possible, since it involves technology, people, and processes. For instance, awareness and training cannot be automated because they involve humans; however, automation should be leveraged as much as possible in order to reduce the time necessary to detect security flaws. In [4], recommendations on how organizations can adopt SCAP are provided through various use cases and specifications.

III. METHODOLOGY

The methodology used in this paper proceeded through the following phases.

A. Requirements Analysis
In this stage, a baseline based on security configuration guides from the National Security Agency (NSA), the National Institute of Standards and Technology (NIST), and Cisco Systems (Next Generation Encryption) was defined in line with the typical security requirements of an organization. The baseline formed the basis of the rules defined in the automated checklist using XCCDF. The tools used were also identified in this phase.


B. Design
Based on the security requirements in Section III-A above and the SCAP 1.2 Content Requirements and Recommendations defined in the Technical Specification for SCAP Version 1.2, pseudocode was developed for use in the implementation phase. A test plan was also created to guide testing.

C. Implementation
The checklist was developed per the design, using XCCDF and OVAL. An initial test and validation was also carried out using a SCAP-Validated product, jOVAL, and the SCAP Content Validation Tool from NIST. SCAP-Validated products (SCAP content consumers) are products that have been validated by NIST as conforming to SCAP and its component standards. The SCAP Content Validation Tool, on the other hand, is designed to validate the correctness of a SCAP data stream for a particular use case according to what is defined in SP 800-126 [8].

D. Testing
The automated checklist was tested using the test plan developed in the design phase, a SCAP-Validated product (jOVAL), and a Cisco IPsec VPN router, to ensure the requirements had been met. Test results are discussed briefly in Section V.

IV. PROPOSED SCAP CONTENT ARCHITECTURE

To automate compliance checking, a baseline (usually based on the organization's security requirements) against which devices will be verified was defined. This baseline was then translated, using the 'checklist' language (XCCDF), into a checklist (rules) in XML that can be processed by SCAP-Validated products. Since the checklist cannot interact directly with the low-level settings of the device, the 'check' language, OVAL, must be invoked.

[Figure 1 depicts a layered stack: SCAP-Validated Tool, XCCDF (encapsulated security policy/guidance), OVAL (assessment instructions), System Settings (registry key, hash, etc.).]

Figure 1. XCCDF interaction with the system [9]
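To illustrate the XCCDF-to-OVAL layering in Fig. 1, a minimal sketch of an XCCDF rule that delegates its check to an OVAL definition is shown below. The rule id, title, file name, and definition id are illustrative assumptions, not taken from the paper's checklist:

```xml
<!-- Sketch only: ids and the referenced file name are assumed for illustration -->
<xccdf:Rule xmlns:xccdf="http://checklists.nist.gov/xccdf/1.2"
            id="xccdf_org.example_rule_ike-phase1-encryption"
            selected="true">
  <xccdf:title>IKE Phase 1 encryption is configured per baseline</xccdf:title>
  <!-- The check element hands evaluation down to the OVAL layer -->
  <xccdf:check system="http://oval.mitre.org/XMLSchema/oval-definitions-5">
    <xccdf:check-content-ref href="oval.xml" name="oval:org.example:def:1"/>
  </xccdf:check>
</xccdf:Rule>
```

The `system` attribute names the checking language (here OVAL), while `check-content-ref` points at the document and definition to evaluate; this is how the XCCDF layer in Fig. 1 invokes the OVAL layer beneath it.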

Fig. 1 shows, at a high level, how the checklist in XCCDF interacts with the system (router). Each layer builds on the layer below it.

A. Security Baseline
A baseline is a minimum level of security that a system, network, or device must adhere to [10]. The security parameters that make up the baseline are shown below.

IPsec Protocol "Global" Configuration Parameters:

- ESP Tunnel Mode – This is the default mode; it provides encryption and integrity protection, complicates attempts to perform traffic analysis, and is compatible with NAT [11].

TABLE I. ISAKMP SA SECURITY PARAMETERS (IKE PHASE 1) [12]

Security Parameter | Value          | Rationale
Authentication     | PSK            | Provides acceptable security [13]
Encryption         | 3DES           | Provides a marginal but acceptable security level [13] and is FIPS-approved [11]
HMAC               | SHA-1          | Provides marginal but acceptable security [13]
Diffie-Hellman     | Group 2        | Recommended by the NSA, NIST, and Cisco Systems [11] [13] [14]
Lifetimes          | 86,400 seconds | Recommended by [11] and [14]

TABLE II. IPSEC SA SECURITY PARAMETERS (IKE PHASE 2) [12]

Security Parameter | Value         | Rationale
Encryption         | AES           | Provides adequate security [13]
HMAC               | SHA-1         | Provides adequate security
Lifetimes          | 1,800 seconds | A lifetime of 30 minutes (1,800 s) improves the security of legacy algorithms and is recommended by [13] [11]

B. SCAP Data Stream Design
SCAP 1.2 introduced the concept of a data stream, which is a combination of SCAP specifications/components such as OVAL, XCCDF, CPE, OCIL, etc., used together for particular functions (i.e., use cases) such as security configuration checking, vulnerability assessment, inventory management, etc. [8]. Multiple data streams, along with their components, constitute a SCAP source data stream collection [8], shown in Fig. 2 below. Thus, the checklist we developed for the 'Configuration' use case in this research works with SCAP specifications/components such as OVAL and CPE, as reflected in the design shown in Fig. 3.

[Figure 2 depicts a SCAP source data stream collection: two data streams ('data stream 1' and 'data stream 2') referencing components xccdf1, xccdf2, oval1, oval2, cpe dict1, and cpe dict2.]

Figure 2. SCAP data stream collection [8]

In Fig. 2, 'data stream 1' and 'data stream 2' can reference any component within the same data stream collection. Fig. 3 below shows, at a high level, the data stream design used. The data stream references a checklist component (XCCDF), which has a set of rules based on the baseline, and a checking system (OVAL) invoked by the checklist to carry out the tests. It also references a CPE dictionary component (cpe dict), which contains information about the Cisco router, and a 'cpe-oval' component that provides information to the SCAP tool on how to check whether the router being assessed meets the defined specification.

[Figure 3 depicts the data stream design used: a single data stream collection with one data stream referencing the xccdf, oval, cpe oval, and cpe dict components.]

Figure 3. SCAP data stream design

At the heart of the configuration compliance checking is the OVAL check system [8], which uses Cisco IOS schema tests [15] to verify the configuration of the target device (i.e., a Cisco IPsec VPN router). The OVAL tests 'line_test' and 'version55_test' were used in the checklist developed. The 'line_test' checks the properties of specific output lines from a 'show' command, such as 'show running-config'. The required object element references a 'line_object', and the optional 'state' element specifies the data to check [15]. The 'version55_test', on the other hand, was used to check the version of the Internetwork Operating System (IOS) running on the router; its required object element references a 'version_object', and its optional 'state' element specifies the data to check. An illustration of how they were used is shown in the snippet provided in Fig. 4 below. In Fig. 4, the '<tests>' element on lines 1 and 6 provides a container for one or more OVAL tests. Line 3 is a reference to an OVAL object that specifies which system data to collect, while line 4 is a reference to the expected state of the collected system data.

Figure 4. OVAL test structure code snippet [16]
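The OVAL test structure described for Fig. 4 can be sketched as follows. This is a minimal illustration consistent with the description (a container, an object reference, and a state reference); the ids, namespace prefix, and comments are assumptions, not the original listing:

```xml
<!-- Sketch only: ids and the ios: prefix are illustrative assumptions -->
<tests>
  <ios:line_test id="oval:org.example:tst:1" version="1"
                 check="all" comment="IKE policy configuration line check">
    <ios:object object_ref="oval:org.example:obj:1"/>  <!-- which data to collect -->
    <ios:state state_ref="oval:org.example:ste:1"/>    <!-- expected state of that data -->
  </ios:line_test>
</tests>
```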

Test Evaluation

The accuracy of the result obtained from the device configuration compliance checking is largely based on the test evaluation of the checking system. The result of the OVAL test evaluation is determined by combining the results of the following three test evaluation parameters [17]:

- Existence Check Evaluation – the process of determining whether or not the number of OVAL Items that match the specified OVAL Object satisfies the requirements specified by the 'check_existence' property
- Check Evaluation – the process of determining whether or not the number of collected OVAL Items, specified by the 'check' property, match the specified OVAL States
- State Operator Evaluation – the process of combining the individual results from the comparison of an OVAL Item to the specified OVAL States, according to the 'state_operator' property


Figure 5. OVAL test evaluation code snippet

In Fig. 5, the individual results of both tests in the container (i.e., lines 6–9 and lines 10–13), obtained through 'check' and 'check_existence' evaluation, are combined by the logical 'AND' operator in line 1 to obtain the final test result, which shows whether or not the device meets the baseline. The code snippet in Fig. 6 ties the key data stream components together.
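The combination of two test results with a logical 'AND', as described for Fig. 5, corresponds to an OVAL definition whose criteria reference both tests. A minimal hedged sketch (ids, titles, and comments are assumptions):

```xml
<!-- Sketch only: ids are illustrative assumptions -->
<definition id="oval:org.example:def:1" version="1" class="compliance">
  <metadata>
    <title>Router meets the IKE baseline</title>
    <description>True only if both referenced tests evaluate to true.</description>
  </metadata>
  <!-- operator="AND": both criterion results must be true for the
       definition (and hence the XCCDF rule) to pass -->
  <criteria operator="AND">
    <criterion test_ref="oval:org.example:tst:1" comment="IOS version check"/>
    <criterion test_ref="oval:org.example:tst:2" comment="configuration line check"/>
  </criteria>
</definition>
```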

Figure 6. Data stream snippet showing key components

In Fig. 6, the checklist component, ‘checklist.xml’ uses an XML catalog element to reference ‘oval.xml’, with its SCAP content accessible through the external URI ‘ovalcompid’.
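A hedged sketch of how such a source data stream might tie 'checklist.xml' and 'oval.xml' together through an XML catalog follows. The component and component-ref ids and timestamps are assumptions; only the catalog name 'ovalcompid' and the two file roles come from the description above:

```xml
<!-- Sketch only: ids and timestamps are illustrative assumptions -->
<ds:data-stream-collection
    xmlns:ds="http://scap.nist.gov/schema/scap/source/1.2"
    xmlns:cat="urn:oasis:names:tc:entity:xmlns:xml:catalog"
    xmlns:xlink="http://www.w3.org/1999/xlink"
    id="scap_org.example_collection_ipsec" schematron-version="1.2">
  <ds:data-stream id="scap_org.example_datastream_ipsec"
                  scap-version="1.2" use-case="CONFIGURATION"
                  timestamp="2015-11-26T00:00:00">
    <ds:checklists>
      <!-- The checklist resolves the name 'ovalcompid' through this catalog -->
      <ds:component-ref id="scap_org.example_cref_checklist"
                        xlink:href="#scap_org.example_comp_checklist">
        <cat:catalog>
          <cat:uri name="ovalcompid" uri="#scap_org.example_cref_oval"/>
        </cat:catalog>
      </ds:component-ref>
    </ds:checklists>
    <ds:checks>
      <ds:component-ref id="scap_org.example_cref_oval"
                        xlink:href="#scap_org.example_comp_oval"/>
    </ds:checks>
  </ds:data-stream>
  <!-- Embedded source documents (contents elided in this sketch) -->
  <ds:component id="scap_org.example_comp_checklist"
                timestamp="2015-11-26T00:00:00"><!-- checklist.xml --></ds:component>
  <ds:component id="scap_org.example_comp_oval"
                timestamp="2015-11-26T00:00:00"><!-- oval.xml --></ds:component>
</ds:data-stream-collection>
```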

Test Plan and Tools Used

The tools used in this research are listed in Table III below.

TABLE III. TOOLS USED

Tool                                 | Vendor        | Version
SCAP Tool (jOVAL)                    | Joval         | 5.11.1-1
Cisco 2811 router                    | Cisco Systems | 12.4, Advanced Security IOS
SCAP Content Validation Tool         | NIST          | 1.2
Source Code Editor                   | Notepad++     | 6.8.1
Laptop running Windows 7 and 8.1 OS  | Dell          | 6.3.9600 Build 9600

The test plan, as shown in Table IV, allowed for testing the checklist using various test cases to identify and fix any errors, as well as to validate our claims. The last three columns of the test plan in Table IV are intentionally blank, to be populated after each test. The Expected Outcome column is used to record the result expected for a particular test case per design, while the Actual Outcome column is for describing the real result obtained. Remarks, if any, are made in the last column.

TABLE IV. TEST PLAN

Test ID | Test                                                              | Expected Outcome | Actual Outcome | Remarks
1       | Data stream (checklist) validates with the SCAP Content Validation Tool |           |                |
2       | jOVAL can communicate with the IPsec VPN router                   |                  |                |
3       | Data stream validates upon import to jOVAL                        |                  |                |
4       | IKE Phase 1 – Encryption Configuration                            |                  |                |
5       | IKE Phase 1 – Hash Configuration                                  |                  |                |
6       | IKE Phase 1 – Authentication Configuration                        |                  |                |
7       | IKE Phase 1 – DH Group Configuration                              |                  |                |
8       | IKE Phase 1 – Lifetime Configuration                              |                  |                |
9       | IKE Phase 2 – Transform Set Configuration                         |                  |                |
10      | IKE Phase 2 – IPsec Mode Configuration                            |                  |                |
11      | IKE Phase 2 – IPsec Lifetime Configuration                        |                  |                |
12      | Tests pass per baseline                                           |                  |                |
13      | Configuration assessment of unsupported devices fails             |                  |                |
14      | Remediation guidance provided to bring devices into compliance    |                  |                |

The automated checklist is available upon request.


C. SCAP Data Stream Implementation
In developing the checklist, the following steps were outlined (in no particular order) as pseudocode before actual development using the SCAP specifications:

- Define the data stream use case as 'Configuration'
- Identify OVAL schema tests for Cisco IOS
- Examine secure configuration guides for IPsec from NIST, Cisco, the NSA, etc.
- Translate the security baseline to XCCDF
- Translate the checks/compliance tests to be run on the target device to OVAL
- Invoke the OVAL component (test) from XCCDF
- Identify the CPE name for the target device and develop a CPE test
- Create the CPE dictionary and CPE OVAL components
- Create a data stream for the SCAP components
- Develop a test plan
- Validate the SCAP data stream for correctness using the SCAP Content Validation Tool from NIST and, during import, the SCAP-Validated product jOVAL
- Test the checklist against the target device(s) and note the results

V. TEST RESULTS AND DISCUSSION

The test plan developed in the design phase of the research was used for testing against a Cisco 2811 IPsec VPN router and is based on the defined baseline. In the testing, typical production-environment scenarios were simulated. In addition, tests were also carried out by an external party not involved in the research in order to ensure the consistency and accuracy of the test results. The testing followed three general phases:

- Phase 1 – SCAP content validation
- Phase 2 – jOVAL connectivity to the router
- Phase 3 – IPsec router assessment per baseline

The tests showed how leveraging security automation for secure configuration checking ensures consistency of the security parameters configured, thereby reducing errors due to manual configuration by humans. Most importantly, misconfigurations were picked up by the checklist upon assessment of the router and reported as a 'FAIL'. For instance, in the IKE Phase 2 – IPsec Lifetime Configuration test, if the lifetime configured was anything other than 1,800 seconds, the test failed since it did not meet the baseline. Also, for IKE Phase 1 – DH Group Configuration, the test failed when the default DH group (group 1) was configured, because it did not meet the baseline as defined in the XCCDF rule. Overall, the tests passed as long as the router configuration for each security parameter met the baseline.

The logical 'OR' and 'AND' operators provided much flexibility in the evaluation of test results, in that tests could be evaluated individually or as a group, for a more accurate result. The 'negate' keyword was also very useful in the test evaluations for the IKE Phase 1 DH Group Configuration and Lifetime Configuration, as some test results had to be negated in order for them to pass per the baseline.

Although no false positives were identified in the course of the testing, it is not impossible for false positives to be obtained over time, for instance due to router software bugs, deprecated OVAL tests, etc.

The automated checklist developed in this research is available upon request by contacting the first author.
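The 'negate' usage described above can be sketched as follows, for example to require that the default DH group NOT be present in the configuration. The ids and the referenced test are illustrative assumptions, not the paper's content:

```xml
<!-- Sketch only: the referenced test id is an illustrative assumption -->
<criteria operator="AND">
  <!-- negate="true" inverts the criterion result: the rule passes
       only if the default DH group (group 1) line is NOT found -->
  <criterion test_ref="oval:org.example:tst:dh-default" negate="true"
             comment="default DH group must not be configured"/>
</criteria>
```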


VI. CONCLUSION AND FUTURE WORK

An automated checklist was developed for verifying the configuration of a Cisco IPsec VPN router against the baseline. The test results, discussed in Section V, meet the defined baseline. Due to limited access to real hardware, extensive testing was not possible; thus, results could not be compared across multiple IPsec VPN routers. Security automation using a SCAP 1.2 data stream is a "must-have" for large enterprises looking to quickly verify the security state of their information systems; however, the learning curve is steep. The checklist currently verifies devices and provides recommendations on how to bring them into compliance if they are non-compliant with the baseline. In addition to assessing the router for misconfigurations, the Open Checklist Interactive Language (OCIL) component of SCAP can be leveraged for automated remediation. For instance, OCIL can be integrated into the automated checklist to capture human input, in the form of authorization from the change advisory board, before proceeding with remediating non-compliant devices using OVAL. Finally, the checklist can be used to scan routers running IOS version 12.x but not version 15.x, the latest, because we had no access to routers running that software version. Also, it does not check that the device is a Cisco IPsec VPN router before checking its configuration for compliance; this could result in a false positive if a non-IPsec VPN router is scanned.

ACKNOWLEDGMENT
We thank God for His grace in making this research possible. We also thank our family and friends for their encouragement and support, including Mr. Chamberlain Peterside, PhD, and the RSSDA.

REFERENCES
[1] G. Witte, Security Automation Essentials: Streamlined Enterprise Security Management & Monitoring with SCAP, New York: McGraw-Hill, 2012.
[2] S. Hanna and D. Waltermire, "Security Automation Webinar: Protecting Your Enterprise with Security Automation," 15 May 2013. [Online]. Available: https://www.trustedcomputinggroup.org/files/resource_files/A9AA1AE4-1A4B-B294-D0D4F40E60A181C2/Security%20Automation%20Webinar_2013%2005%2015.pdf. [Accessed 12 October 2015].


[3] National Institute of Standards and Technology, "NIST Solicits Comments on the Security Content Automation Protocol (SCAP)," August 2015. [Online]. Available: http://csrc.nist.gov/publications/drafts/800-126/sp800-126r3_call-for-comments.html. [Accessed 2 September 2015].
[4] S. Radack, "Security Content Automation Protocol (SCAP): Helping organizations maintain and verify the security of their information systems," September 2010. [Online]. Available: http://csrc.nist.gov/publications/nistbul/september2010-bulletin.pdf. [Accessed 26 October 2015].
[5] G. Koschorreck, "Automated audit of compliance and security controls," in 2011 Sixth International Conference on IT Security Incident Management and IT Forensics, Bensheim, 2011.
[6] R. Montesino and S. Fenz, "Automation possibilities in information security management," in 2011 European Intelligence and Security Informatics Conference, Athens, 2011.
[7] P. Dwivedi and S. C. Diana, "Analysis of automation studies in the field of information security management," International Journal of Engineering Research and Development, vol. 6, no. 12, pp. 60-63, 2013.
[8] NIST, "SCAP Specifications," National Institute of Standards and Technology, 8 April 2015. [Online]. Available: http://scap.nist.gov/revision/1.2/. [Accessed 30 September 2015].
[9] The MITRE Corporation, "XCCDF Introduction Handout," [Online]. Available: https://msm.mitre.org/docs/xccdf-intro-handout.pdf. [Accessed 26 November 2015].
[10] M. Gregg, CISSP Exam Cram 2 (3rd Edition), Pearson Education Inc., 2013, p. 5.
[11] S. Frankel, K. Kent, R. Lewkowski, A. D. Orebaugh, R. W. Ritchey and S. R. Sharma, "Guide to IPsec VPNs - NIST SP 800-77," December 2005. [Online]. Available: http://csrc.nist.gov/publications/nistpubs/800-77/sp800-77.pdf. [Accessed 18 September 2015].
[12] Cisco Systems, "IPsec WAN Design Overview," [Online]. Available: https://www.cisco.com/application/pdf/en/us/guest/netsol/ns171/c649/ccmigration_09186a008074f22f.pdf. [Accessed 24 September 2015].
[13] Cisco Systems, "Next Generation Encryption," October 2015. [Online]. Available: http://www.cisco.com/web/about/security/intelligence/nextgen_crypto.html. [Accessed 26 November 2015].
[14] V. Antonie, R. Bongiorni, A. Borza, P. Bosmajian, D. Duesterhaus, M. Dransfield, B. Eppinger et al., "Router Security Configuration Guide," 15 December 2005. [Online]. Available: https://www.nsa.gov/ia/_files/routers/C4-040R-02.pdf. [Accessed 22 October 2015].
[15] The MITRE Corporation, "Version 5.10.1 - Test Listing," 2015. [Online]. Available: http://oval.mitre.org/language/version5.10.1/test_listing.html#IOS. [Accessed 26 November 2015].
[16] The MITRE Corporation, "OVAL Definition Tutorial," 18 January 2011. [Online]. Available: https://oval.mitre.org/language/about/definition.html. [Accessed 01 October 2015].
[17] J. Baker, M. Hansbury and D. Haynes, "The OVAL Language Specification Version 5.10.1," 20 January 2012. [Online]. Available: http://oval.mitre.org/language/version5.10.1/OVAL_Language_Specification_01-20-2012.pdf. [Accessed 25 November 2015].
[18] W. Jackson, "Security Automation: Are humans still relevant?," 24 July 2014. [Online]. Available: http://gcn.com/blogs/cybereye/2014/07/humans-vs-automation.aspx. [Accessed 06 October 2015].