Testing Malicious Code Detection Tools

Mihai Christodorescu <[email protected]>
WiSA, http://www.cs.wisc.edu/wisa
University of Wisconsin, Madison

Problem

• Wrong focus: testing looks only at today's malware
  – What about tomorrow's malware?
• Efficacy: how does one compare the efficacy of several malware detection tools?
  – Lack of openness about implementations

22 July 2003

Mihai Christodorescu WiSA http://www.cs.wisc.edu

2

Our Solution: Test Case Generation through Obfuscation

• Generate new malware test cases
  – Using obfuscation transformations
  – Based on existing malware instances
• Test detection capabilities across a wide range of malware types


Milestones

• Binary rewriting infrastructure
  – For IA-32/Windows
  – For Visual Basic
• Suite of obfuscations
• Comparison metrics
• Complete test suite


Overview

• Goals
• State of the art
• Our approach
• Testing environment
• Evaluation
• Conclusions


Testing Malware Detectors

Malware detection tool's goal: detect malicious code!
• Focus: executable code that replicates
  – Viruses, worms, trojans
  – Not buffer-overflow attacks
  – Not spyware
• Code is mobile


Testing Goals

1. Measure detection rate for existing malware → false negatives
2. Measure detection rate for benign software → false positives
3. Measure detection rate for new malware → resilience to new malicious code


State of the Art

• Several testing labs
  – Commercial:
    • Virus Bulletin
    • International Computer Security Association (ICSA)
    • West Coast Labs' CheckMark
  – Independent:
    • University of Hamburg Virus Test Center (VTC)
    • University of Magdeburg


Sample Certification Requirements

Virus Bulletin 100% Award:
• Detect all In the Wild (ITW) viruses during on-demand and on-access tests
• Generate no false positives

ITW virus lists are maintained by the WildList Organization International.


Testing Goals

1. Measure detection rate for existing malware → false negatives  [✓ Checked]
2. Measure detection rate for benign software → false positives  [✓ Checked]
3. Measure detection rate for new malware → resilience to new malicious code  [???]


Testing against Future Malware

• First attempt: Andreas Marx, "Retrospective Testing"
  – Test 3- and 6-month-old virus scanners

  Malware Type                       Detection Rate
  Macro viruses                      74% - 94%
  Script viruses                     35% - 82%
  Win32 file viruses                 24% - 79%
  Win32 worms, trojans, backdoors     8% - 37%

Testing against Future Malware

• We can learn from the past:
  – Often, old malicious code is slightly changed and re-launched
• Sobig e-mail worm:

                 Sobig.C                 Sobig.D                 Sobig.E
  Fake "From"    [email protected]      [email protected]     [email protected]
  Distribution   Mass e-mail,            Mass e-mail,            Mass e-mail,
                 copy to shared drive    copy to shared drive    copy to shared drive
  Deactivation   June 8                  July 2                  July 14
  Update path    Geocities-hosted page   Hard-coded IPs          Hard-coded IPs


Overview

• Goals
• State of the art
• Our approach
• Testing environment
• Evaluation
• Conclusions


Obfuscation = Test Case Generation

• Obfuscate current known malware to obtain new malware test cases
• Why?
  – Many viruses / worms / trojans reuse code from older malware
  – Simulate self-mutating malware
  – Measure the ability of anti-virus tools to detect malicious behavior, not just malicious code instances


Test Case Generation

  Collection of current malware {V1, V2, ..., Vn}
  × Set of obfuscation transformations {σ1, σ2, ..., σm}
  = New malware for testing {V'i,j = σj(Vi)}

[Diagram: an n × m matrix of obfuscated variants, V'1,1 through V'n,m; example transformations include Garbage Insertion, Interpreter, and Code Reordering.]
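The cross product on this slide can be sketched in a few lines: every known sample is run through every obfuscation, giving n × m test cases. The sample strings and the two toy transformations below are illustrative stand-ins, not the project's real obfuscations.

```python
# Toy stand-ins for two of the obfuscations named on the slide.
def garbage_insertion(code: str) -> str:
    """Insert a no-op after every statement."""
    return "\n".join(line + "\nnop" for line in code.splitlines())

def code_reordering(code: str) -> str:
    """Reverse statement order (toy stand-in for reordering)."""
    return "\n".join(reversed(code.splitlines()))

def generate_test_cases(samples, obfuscations):
    """Cross product: {(sample_name, obf_name): obfuscated_code},
    i.e. V'_{i,j} = sigma_j(V_i)."""
    return {
        (s_name, o_name): obf(code)
        for s_name, code in samples.items()
        for o_name, obf in obfuscations.items()
    }

samples = {"V1": "mov\ncall", "V2": "push\nret"}
obfuscations = {"garbage": garbage_insertion, "reorder": code_reordering}
cases = generate_test_cases(samples, obfuscations)
# n samples x m obfuscations -> n*m test cases
assert len(cases) == len(samples) * len(obfuscations)
```

Each entry of `cases` is one cell of the V' matrix in the diagram.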


Test Case Generation

The same cross product (malware {V1, ..., Vn} × obfuscations {σ1, ..., σm} = test cases V'i,j), realized on the binary-rewriting infrastructure:

  Binary → IDA Pro (parse binary, build CFGs)
         → Connector (memory analysis)
         → BREW (rewrite, generate code, driven by the library of obfuscations)
         → Generated Binary

Sample Obfuscations

• Change data:
  – New strings, new dates, new constants
  – Encode / encrypt constants
• Change control:
  – Insert garbage
  – Encode / encrypt code fragments
  – Reorder code
  – Add new features
  – ...


Sample Obfuscations - Encode / Encrypt Constants

Before:
  Message = "Read this.."
  Message.Send

After:
  Message = Decode("13FQ...")
  Message.Send
  ...
  Sub Decode(...)
  ...
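The transformation on this slide can be sketched as follows (in Python rather than Visual Basic): a string literal is replaced by a call to a generated Decode routine, so the plaintext never appears in the file. Base64 stands in here for whatever encoding a real obfuscator would emit.

```python
import base64

def obfuscate_constant(plaintext: str) -> str:
    """Rewrite a string literal into a Decode(...) call site."""
    encoded = base64.b64encode(plaintext.encode()).decode()
    return f'Message = Decode("{encoded}")'

def decode(encoded: str) -> str:
    """The Decode routine the obfuscator emits alongside the call."""
    return base64.b64decode(encoded.encode()).decode()

call_site = obfuscate_constant("Read this..")
# A signature scanner sees only the encoded blob; at run time the
# original string is recovered:
encoded_blob = call_site.split('"')[1]
assert decode(encoded_blob) == "Read this.."
```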


Earlier Obfuscation Results

Commercial anti-virus tools vs. morphed versions of known viruses:

  Virus          McAfee AV       Command AV      Norton AV
  Chernobyl-1.4  ✗ Not detected  ✗ Not detected  ✗ Not detected
  f0sf0r0        ✗ Not detected  ✗ Not detected  ✗ Not detected
  Hare           ✗ Not detected  ✗ Not detected  ✗ Not detected
  z0mbie-6.b     ✗ Not detected  ✗ Not detected  ✗ Not detected


Ideal Testing Results

[Chart: virus A and its obfuscated variants σ1(A) through σ8(A), tested against McAfee AV, Command AV, and Norton AV.]

Parametrized Obfuscation

Encoding data:

Before:
  Message = "Read this.."
  Message.Send

After:
  Message = Decode("13FQ...")
  Message.Send
  ...
  Sub Decode(...)
  ...

Parameters:
• Data to obfuscate
• Type of encoding / encryption
• Encryption key
• Location of obfuscation
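A parametrized obfuscation like the one above can be sketched as a transformation driven by an explicit parameter record; sweeping any one parameter (key, encoding type) yields a family of distinct test cases from a single seed sample. The names and the XOR/identity encodings below are illustrative, not the tool's actual interface.

```python
from dataclasses import dataclass

@dataclass
class EncodeParams:
    target: str      # data to obfuscate
    encoding: str    # type of encoding / encryption
    key: int         # encryption key (used by 'xor')

def apply_encoding(params: EncodeParams) -> str:
    """Encode the target string according to the parameters."""
    if params.encoding == "xor":
        return "".join(chr(ord(c) ^ params.key) for c in params.target)
    if params.encoding == "identity":
        return params.target
    raise ValueError(f"unknown encoding: {params.encoding}")

def invert_encoding(encoded: str, params: EncodeParams) -> str:
    """XOR with the same key is its own inverse."""
    if params.encoding == "xor":
        return "".join(chr(ord(c) ^ params.key) for c in encoded)
    return encoded

p = EncodeParams(target="Read this..", encoding="xor", key=0x2A)
blob = apply_encoding(p)
assert blob != p.target                      # constant no longer literal
assert invert_encoding(blob, p) == p.target  # recoverable at run time
```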


Testing Environment

• Simple and complex obfuscations applied to known (detected) malware
• Multiple malware detection tools:
  – McAfee
  – Norton
  – Sophos
  – Kaspersky


Overview

• Goals
• State of the art
• Our approach
• Testing environment
• Evaluation
• Conclusions


Evaluation

• Test against a set of malware that all tools detect in its original form
• Test using the same obfuscations and the same parameters
• Obfuscation hierarchy
  – From simplest to most complex


Metrics

• Minimum obfuscation level: for a given obfuscation σ with parameters (x1, ..., xk), what are the smallest values of the xi that generate a false negative?

• Minimal combination of obfuscations: what is the smallest set of obfuscations {σ1, ..., σk} that generates a false negative?
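Measuring the minimum obfuscation level can be sketched as a simple sweep: increase one obfuscation parameter until the detector first misses the sample. The `detects` function below is a hypothetical stand-in with a fixed miss threshold; a real harness would invoke an actual scanner on the obfuscated binary.

```python
def detects(sample: str, garbage_ratio: int) -> bool:
    """Hypothetical signature scanner: for this sketch, it misses the
    sample once the garbage-insertion parameter reaches 4."""
    return garbage_ratio < 4

def minimum_obfuscation_level(sample: str, max_level: int = 100):
    """Smallest parameter value that produces a false negative,
    or None if the detector is resilient up to max_level."""
    for level in range(max_level + 1):
        if not detects(sample, level):
            return level
    return None

assert minimum_obfuscation_level("chernobyl-1.4") == 4
```

The same loop structure, applied over subsets of obfuscations instead of parameter values, measures the minimal combination of obfuscations.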


Preliminary Results

[Bar chart: detection results for the AnnaKournikova worm under increasing levels of the Encode obfuscation (parameter values 1, 5, 9, 13, 17, 21, 25, 29), for McAfee AV, Command AV, and Norton AV.]

Lessons Learned

• Malware spreads instantaneously
  – Arms race between malware distribution and malware detection tool updates
• In testing malware detection tools, one must use a virus writer's mindset
  – It is a game → need to think several moves ahead


Future Work

• Explore more obfuscations
  – Depends on results of ongoing tests
• Future idea #1: a self-guided test tool that finds the minimal test cases for false negatives
• Future idea #2: develop tests that check for detection of malicious behavior, not just code sequences


Seeing Through the Obfuscations

Smart virus scanner:

  Program to analyze → Annotator (using a Pattern Library) → Annotated Program → Detector (using a Malicious Code Blueprint) → Yes/No


Detection Example

Virus automaton (irrelevant instructions may appear between the pattern's instructions):

  [irrelevant instruction]* → mov X, Y → [irrelevant instruction]* → mov X1, Z → [irrelevant instruction]* → lea A, B → Virus found!

Virus code (from the Chernobyl CIH 1.4 virus):

  push eax
  sidt [esp-02h]
  pop  ebx
  add  ebx, HookNo * 08h + 04h
  cli
  mov  ebp, [ebx]
  mov  bp, [ebx-04h]
  lea  esi, MyHook - @1[ecx]
  push esi
  mov  [ebx-04h], si
  shr  esi, 16
  mov  [ebx+02h], si
  pop  esi
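Matching the virus automaton against an instruction stream can be sketched as follows: the required mnemonics must occur in order, with any number of irrelevant instructions between them. This is a deliberate simplification of the real annotator/detector, since it ignores the operand constraints (X, X1, Z, ...) that the actual automaton carries.

```python
PATTERN = ["mov", "mov", "lea"]  # mnemonics the automaton requires, in order

def mnemonic(line: str) -> str:
    """First token of an assembly line (empty string for blank lines)."""
    return line.strip().split()[0] if line.strip() else ""

def matches_automaton(code: str, pattern=PATTERN) -> bool:
    """True if the pattern mnemonics occur in order, with any number
    of irrelevant instructions between them."""
    want = iter(pattern)
    need = next(want, None)
    for line in code.splitlines():
        if need is None:
            return True
        if mnemonic(line) == need:
            need = next(want, None)
    return need is None

virus_fragment = """\
push eax
mov  ebp, [ebx]
cli
mov  bp, [ebx-04h]
lea  esi, MyHook - @1[ecx]
push esi"""
assert matches_automaton(virus_fragment)       # Virus found!
assert not matches_automaton("push eax\nret")  # benign fragment
```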


References

Mihai Christodorescu and Somesh Jha. "Static Analysis of Executables to Detect Malicious Patterns." USENIX Security '03, August 2003, Washington, DC.

Andreas Marx. "Retrospective Testing - How Good Heuristics Really Work." Virus Bulletin Conference, November 2002, New Orleans, LA.

Sarah Gordon. "Antivirus Software Testing for the Year 2000 and Beyond." 23rd NISSC Proceedings, October 16-19, 2000, Baltimore, MD.


Binary Code Rewriting

  Binary → IDA Pro (parse binary, build CFGs) → Connector (memory analysis), which feeds:

  • CodeSurfer (build SDG) → clients: detect malicious code, detect buffer overruns, build program specification, browse
  • BREW (rewrite, generate code) → Generated Binary