CONSUMER AV / EPP COMPARATIVE ANALYSIS
Phishing Protection 2013
Randy Abrams, Jayendra Pathak, Orlando Barrera
Tested Vendors: Avast, AVG, Avira, F-Secure, Kaspersky, McAfee, Microsoft, Norman, Norton, Panda, Total Defense, Trend Micro
Overview

Between September 29 and October 8, 2012, NSS Labs performed a comprehensive test of the phishing protection offered by leading endpoint protection (EPP) security suites, in accordance with the NSS phishing protection testing methodology v2.0. This report is based upon empirically validated evidence gathered by NSS over 37 discrete test runs. Testing was performed every 6 hours over a 10-day period, and new phishing URLs were added to each test run throughout the test period. Each EPP product was updated to the most current version available at the time testing began and was allowed access to the live Internet.

Phishing attacks are one of the most common and impactful security threats facing users today. In the past, NSS has evaluated endpoint protection product capabilities against pervasive threats such as exploits and socially engineered malware. In this test, NSS evaluates the ability of endpoint products to block phishing attacks. The average phishing URL block rate for these products over the entire 10-day test period ranged from 92% for Trend Micro to 3% for Norman. Instantaneous product block rates were generally erratic and varied by as much as 75% at specific points in time.
Product          Mean Block Rate
Trend Micro      92%
Kaspersky        85%
AVG              64%
Norton           63%
McAfee           61%
Avast            55%
Total Defense    30%
F-Secure         25%
Panda            25%
Avira            19%
Norman            3%

Figure 1 – Mean Block Rate for Phishing
Generally available software releases were used in all cases. The tested products used vendor defaults for the configuration.¹ As shown above, few EPP products currently have anti-phishing technologies that could be considered highly effective. A deeper look later in the analysis will show that inconsistent blocking over time means that block rates at any given moment are frequently far lower than the average rate indicates.

¹ ESET Smart Security 5.0 was not included in this consumer anti-phishing comparative. Interactions between the firewall in ESET Smart Security 5.0 and the NSS phishing test harness precluded testing that adhered strictly to the NSS methodology, and disabling the firewall in ESET Smart Security 5.0 also disables the web access and anti-phishing protection in the product. Because ESET product marketing does not include anti-phishing as a capability of the product, NSS decided to publish this test without ESET's participation, and will work to resolve interoperability issues to allow inclusion of ESET's product in future tests. ESET Smart Security version 6.0 is in beta and is marketed as providing phishing protection. NSS looks forward to evaluating the new product in future tests.
Key Findings:

• Nearly 90% of consumers are inadequately protected against phishing by endpoint protection products.
• The effectiveness of AV products claiming to offer phishing protection ranges from 3% to 92%.
• Web browser phishing protection is almost always more effective than the protection offered by endpoint security products.
• With current web browsers offering 90% to 94% protection against phishing, the added value of phishing protection in EPP suites is not a significant product selection criterion. Exploit and socially engineered malware protection, as well as general malware detection capabilities, are the key protection differentiators for antimalware suites.
• Users who educate themselves about phishing attacks generally require little or no technology to protect against standard phishing attacks.
Recommendations: End users should:

• Use current web browsers as a first line of protection against phishing attacks.
• Invest time in understanding phishing attacks and modifying behavior to avoid becoming a victim.
• Assign a higher priority to exploit prevention, socially engineered malware blocking, and general detection capabilities than to phishing detection when selecting EPP products.
This analysis brief was produced as part of NSS’ independent testing information services. Leading vendors were invited to participate fully at no cost and NSS received no vendor funding to produce this report.
Table of Contents

Analysis
    The Phishing Threat
    Test Composition – Phishing URLs
    Total Number of Malicious URLs in the Test
    Average Number of Malicious URLs Added Per Day
    Mix of URLs
    Blocking Phishing URLs
    Average Time to Block Phishing URLs
    Average Response Time to Block Phishing
    Real-time Blocking of Phishing URLs Over Time
    Conclusion
Appendix A: Test Environment
    Client Host Description
    The Tested Products
    Network Description
Appendix B: General Test Procedures
    Test Duration
    Test Frequency
    Sample Sets for Phishing URLs
    Sources
    Catalog URLs
    Confirm Sample Presence of URLs
    Archive Active URL Content
    Visit Each URL
    Scoring & Recording the Results
    Pruning
    Post-Test Validation
Contact Information

Table of Figures

Figure 1 – Mean Block Rate for Phishing
Figure 2 – Phishing URL Response Histogram – EPP
Figure 3 – Phishing URL Response Histogram – Browser
Figure 4 – Average Time to Block – EPP
Figure 5 – Average Time to Block – Browser
Figure 6 – Phishing Protection Over Time – EPP
Figure 7 – Phishing Protection Over Time – Browser
Analysis

Modern endpoint protection products (EPP), commonly referred to as antivirus suites, provide protection against a vast array of threats. Once offering protection against only viruses, most suites today also claim to provide protection against Trojans, bootkits, phishing, spyware, rootkits, identity theft, exploits, and other threats. In this test, NSS specifically examines the anti-phishing capabilities of 12 popular consumer suites.

Historically, consumers have relied upon endpoint security products to protect them from anything perceived to be a threat on the Internet. Antivirus companies began to add phishing protection to their offerings with questionable success. The results of this test indicate that phishing protection is not a significant technological focus for most antivirus companies.

The top two performers in this test, Trend Micro and Kaspersky, were the only tested products to provide block rates in excess of 70% (Trend Micro 92% and Kaspersky 85%). Kaspersky and Trend Micro account for roughly 9%² of the global endpoint security software market, according to the September 2012 OPSWAT market share report. This means that more than 90% of the market lacks adequate protection against phishing threats. The results of the NSS Browser Phishing Comparative show that the leading browsers provide 90% to 94% protection, which means that endpoint products are generally not offering protection on a par with the leading web browsers.
The Phishing Threat

"Phishing" attacks can be constructed in two basic ways. The first is an attempt to persuade a victim to provide personal information to the attacker. The information may consist of credit card details, login information for email or social media accounts, or other personal information that can be used for identity theft and other information-based attacks. The second type of phishing attack attempts to lure users into installing a malicious application, or into navigating to a website where malicious software will be installed through the exploitation of vulnerable software. Common to both types is that they can arrive via email, instant messages, SMS messages, and links on social networking sites.

Phishing attacks pose a significant risk to individuals and organizations alike, by threatening to compromise or acquire sensitive personal and corporate information. While the number of reported phishing attacks peaked in 2009, these attacks remain high, and the number of unique phishing sites has increased from 56,362 in August 2009 to a high of 63,253 in April 2012. In 2012, the average number of unique phishing sites detected has consistently been well over 50,000 per month, up from 2011, when the average number of unique phishing sites per month was well under 40,000.³ The average uptime for a phishing attack has been steadily falling, from a high of 73 hours in the second half of 2010 to a record low of 23 hours and 10 minutes in the first half of 2012.⁴ The increased speed at which these threats are "rotated" to new locations is staggering and poses a significant challenge to those attempting to defend against such attacks. In response to this trend, security vendors have developed reputation systems that classify malicious and phishing URLs via in-the-cloud services. The 2009 NSS Web Browser Group Test stated:

"Reputation systems are literally the next 'big thing' in computer security and offer an additional layer of protection to client endpoint machines, which have effectively become the mobile corporate perimeter. For home users this was always the case, and now they too can benefit from in the cloud services, usually without even knowing it."

In previous testing, Trend Micro demonstrated a highly effective, advanced cloud reputation service for blocking socially engineered malware. That approach has proven effective in phishing protection as well, as evidenced by Trend Micro's block rates, which rival the capabilities of the leading web browsers as reported in the NSS Browser Security Comparative Analysis: Phishing Protection report.

² http://www.opswat.com/about/media/reports/antivirus-september-2012
³ http://www.antiphishing.org/phishReportsArchive.html
⁴ http://www.antiphishing.org/reports/APWG_GlobalPhishingSurvey_1H2012.pdf
Test Composition – Phishing URLs

Data in this report spans a testing period of 10 days, from September 29 through October 8, 2012. All testing was performed in the NSS testing facility in Austin, TX. During the course of the test, NSS engineers routinely monitored connectivity to ensure the tested products could access the live Internet sites being tested, as well as their reputation services in the cloud. The emphasis was on fresh, live sites, resulting in a larger number of sites being evaluated than were ultimately kept as part of the result set. See the methodology for more details.
Total Number of Malicious URLs in the Test

Throughout the course of this study, 376,247 results were collected from 37 discrete tests conducted without interruption over 240 hours (every 6 hours for 10 days). NSS engineers removed samples that did not pass the validation criteria, including those tainted by exploits (not part of this test). Ultimately, 5,732 unique URLs that were live at the time of entry into the test, and that were successfully accessed by the endpoint products in at least one run, were included in the final set of phishing sites – providing a margin of error of 1.3% with a 95% confidence interval.
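The quoted margin of error follows from the standard formula for a sampled proportion at a 95% confidence level. The snippet below is an illustrative check of that figure using the worst-case proportion of 0.5; it is not NSS's own worksheet.

```python
import math

# Margin of error for a sampled proportion at 95% confidence.
# Illustrative check of the 1.3% figure quoted above for n = 5,732 URLs;
# the worst-case proportion p = 0.5 maximizes the margin of error.
n = 5732   # unique validated phishing URLs in the final set
p = 0.5    # worst-case proportion
z = 1.96   # z-score for a 95% confidence interval

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: {margin_of_error:.1%}")  # prints "Margin of error: 1.3%"
```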
Average Number of Malicious URLs Added Per Day

On average, 573 new validated URLs were added to the test set per day, although on certain days more or fewer were added as criminal activity levels fluctuated.
Mix of URLs

The mixture of URLs used in the test was representative of the threats on the Internet. Care was taken not to allow any one domain to represent more than 10% of the test set, and a number of sites were pruned after reaching that limit.
Blocking Phishing URLs

NSS engineers assessed each EPP product's ability to block malicious URLs as quickly as they were discovered on the Internet. Testing continued every six hours to determine how long it took a vendor to add protection, if protection was added at all.
Average Time to Block Phishing URLs

The following histogram shows how long it took EPP products to block a threat once it was introduced into the test cycle. Cumulative protection rates are listed from the time of introduction, the "zero hour", through the end of the test. Final protection scores for the URL test duration are summarized under the "Total" column. The initial protection from phishing sites varied widely, from 0.8% (Norman) to 79.7% (Kaspersky). For comparison, the histogram of the web browsers' average time to block phishing URLs follows the histogram for EPP products.
Coverage (cumulative block rate) by time since URL introduction:

Product          0-hr    1d      2d      3d      4d      5d      6d      7d      Total
Kaspersky        79.7%   79.7%   79.7%   79.7%   79.7%   79.7%   79.7%   79.7%   79.7%
McAfee           69.3%   69.3%   69.3%   69.3%   69.3%   69.3%   69.3%   69.3%   69.3%
Trend Micro      60.2%   84.6%   87.3%   87.4%   87.4%   87.4%   87.4%   87.4%   87.4%
Norton           40.0%   71.9%   77.8%   78.6%   78.9%   78.9%   78.9%   78.9%   78.9%
Avast            31.2%   58.8%   62.9%   64.4%   64.4%   64.4%   64.4%   64.4%   64.4%
Total Defense    30.2%   30.2%   30.2%   30.2%   30.2%   30.2%   30.2%   30.2%   30.2%
F-Secure         27.2%   27.2%   27.2%   27.2%   27.2%   27.2%   27.2%   27.2%   27.2%
Panda            26.3%   26.3%   26.3%   26.3%   26.3%   26.3%   26.3%   26.3%   26.3%
AVG              23.3%   61.8%   69.5%   71.1%   71.5%   71.5%   71.5%   71.5%   71.5%
Avira             5.6%   12.8%   15.5%   15.9%   15.9%   15.9%   15.9%   15.9%   15.9%
Norman            0.8%    1.7%    2.0%    2.2%    2.2%    2.2%    2.2%    2.2%    2.2%

Figure 2 – Phishing URL Response Histogram – EPP
Coverage (cumulative block rate) by time since URL introduction:

Browser               0-hr    1d      2d      3d      4d      5d      6d      7d      Total
Firefox 15            79.2%   86.1%   88.2%   88.6%   88.6%   88.7%   88.7%   88.7%   88.7%
Chrome 21             53.2%   83.5%   86.2%   87.4%   87.4%   87.5%   87.5%   87.5%   87.5%
Internet Explorer 10  55.9%   83.2%   86.3%   86.7%   86.8%   86.8%   86.8%   86.8%   86.8%
Safari 5              76.9%   85.6%   88.4%   89.2%   89.3%   90.9%   91.0%   91.3%   91.4%

Figure 3 – Phishing URL Response Histogram – Browser
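These histograms report cumulative coverage: a URL counts as covered at a given day if the product had blocked it at or before that offset from its introduction. A minimal sketch of that aggregation, assuming a per-URL record of the hours until the first block (None if the URL was never blocked); the data shown is illustrative, not the NSS result set.

```python
# Cumulative block coverage by day, in the spirit of Figures 2 and 3.
# first_block_hours maps each tested URL to the delay (in hours) before the
# product first blocked it, or None if it was never blocked during the test.
first_block_hours = {"url-1": 0, "url-2": 18, "url-3": None, "url-4": 54}

def cumulative_coverage(first_block_hours: dict, day: int) -> float:
    """Share of URLs blocked within `day` days of entering the test."""
    cutoff = day * 24
    blocked = sum(1 for hours in first_block_hours.values()
                  if hours is not None and hours <= cutoff)
    return blocked / len(first_block_hours)

for day in range(8):  # day 0 corresponds to the "0-hr" column
    print(f"{day}d: {cumulative_coverage(first_block_hours, day):.1%}")
```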
Kaspersky demonstrated the strongest protection against zero-hour phishing attacks at 79.7%, and remained at a constant 79.7% for the duration of the test. Trend Micro started at 60.2%, rising to 84.6% by the end of day one and plateauing at 87.4% by day three. Longer-term blocking of phishing sites was generally not up to the level of protection offered by modern web browsers, and was often significantly poorer than the block rates of Chrome, Internet Explorer, Firefox, and Safari. Comparing the phishing response histograms of EPP products and web browsers, it becomes clear that nearly half of the EPP products are not improving detection response times over their zero-hour block rates. Only Trend Micro shows a response histogram comparable to those of the leading browsers.
Average Response Time to Block Phishing

The absolute block rate of phishing attacks is only one metric of effectiveness. Because phishing sites generally have a short lifespan (23 hours on average), how long it takes for an endpoint product to add detection is also an important metric. Figure 4 below shows the average time to block a phishing site once it was introduced into the test set, but only if it was blocked during the course of the test. Unblocked sites are not included, as there is no mathematically empirical way to score "never."

Trend Micro and Kaspersky lead the field in both average block rate and average time to block. However, the correlation between blocking effectiveness and time to block did not hold true across the field: AVG came in third with respect to average block rate, but ninth in terms of average time to block new URLs. Comparing the response times of the antivirus products to the response times of the browsers, as recently reported in the NSS Browser Phishing Protection test, it is apparent that only Trend Micro and Kaspersky were as fast as, or faster than, any of the browsers at adding phishing detection.
Product          Average Time to Block (Hours)
Norman           17.71
Avira            14.68
AVG              11.86
Total Defense     8.84
F-Secure          8.26
Avast             8.18
McAfee            8.11
Norton            7.60
Panda             6.79
Kaspersky         4.36
Trend Micro       4.05

Figure 4 – Average Time to Block – EPP
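As noted above, URLs that were never blocked are excluded from this average rather than being scored. A minimal sketch of that calculation, reusing the illustrative per-URL first-block delays from the earlier example:

```python
# Average time to add protection, in hours, counting only URLs that were
# eventually blocked; never-blocked URLs are excluded, as in Figures 4 and 5.
# Illustrative data, not the NSS result set.
first_block_hours = {"url-1": 0, "url-2": 18, "url-3": None, "url-4": 54}

blocked_delays = [h for h in first_block_hours.values() if h is not None]
average_time_to_block = sum(blocked_delays) / len(blocked_delays)
print(f"Average time to block: {average_time_to_block:.2f} hours")  # 24.00 hours
```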
Browser                 Average Time to Block (Hours)
Internet Explorer 10    6.11
Chrome 21               5.64
Safari 5                5.38
Firefox 15              2.35

Figure 5 – Average Time to Block – Browser
When comparing the average time to block phishing attacks between browsers and EPP products, it becomes clear that over 80% of the time the tested web browsers are significantly faster than EPP products at adding protection against phishing attacks. The mean time to block a site (if it is blocked at all) is 10 hours for EPP products, compared to 4.87 hours for web browsers. Trend Micro and Kaspersky were the only products that were, on average, faster at adding protection than most of the browsers. However, neither matched Firefox's response time nor Chrome's block rate.

Note that the protection percentage will deviate from the unique-URL results for various reasons. For example, this data includes multiple tests of a URL: if it is blocked early on, it will improve the score; if it continues to be missed, it will detract from the score. Thus, results of individual URL tests were compounded over time.
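The distinction between the two aggregations can be made concrete with a small sketch: the unique-URL block rate counts each URL once (blocked at least once or not), while the visit-compounded rate counts every six-hourly visit. The data below is made up for illustration.

```python
# Two ways of aggregating the same result log (illustrative data only):
#  - unique-URL block rate: one verdict per URL (was it ever blocked?)
#  - visit-compounded rate: every six-hourly visit counts individually
visits = {
    "url-1": ["blocked"] * 8,                    # blocked from the start
    "url-2": ["allowed"] * 6 + ["blocked"] * 2,  # only blocked late in its life
    "url-3": ["allowed"] * 8,                    # never blocked
}

unique_rate = sum(1 for v in visits.values() if "blocked" in v) / len(visits)
all_visits = [verdict for v in visits.values() for verdict in v]
visit_rate = all_visits.count("blocked") / len(all_visits)

print(f"Unique-URL block rate: {unique_rate:.0%}")   # 67%
print(f"Visit-compounded rate: {visit_rate:.0%}")    # 42%
```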
Real-time Blocking of Phishing URLs Over Time
The metrics for blocking individual URLs represent just one perspective. When it comes to daily usage scenarios, users are visiting a wide range of sites that may change quickly, and at any given time, the available set of phishing URLs is revolving. Therefore, continuing to block these sites is a key metric for effectiveness. NSS tested a set of live URLs every six hours. The graph below shows protection at each of the 37 incremental tests over a period of 10 days. Each score represents protection at a given point in time.
[Figure 6: a line chart of the catch rate at each of the 37 six-hourly test runs; the legend lists Trend Micro, Kaspersky, AVG, Norton, McAfee, Avast, Total Defense, and F-Secure.]

Figure 6 – Phishing Protection Over Time – EPP
[Figure 7: a line chart of the catch rate at each of the 37 six-hourly test runs for Safari 5, Chrome 21, Internet Explorer 10, and Firefox 15, with the catch-rate axis ranging from 50% to 100%.]

Figure 7 – Phishing Protection Over Time – Browser
The charts of phishing protection over time reveal a concerning lack of stability on the part of EPP products in the day-to-day blocking of phishing attacks. Compounding generally low block rates with highly erratic detection over time means that protection at any given moment can deviate significantly from the average block rate, leaving consumers virtually unprotected at specific points in time. One such example is Norton, which tested at 63% for the mean block rate, but which at specific times was below 30%. None of the leading browsers exhibited fluctuations larger than approximately 15% over time.
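The per-run catch rates plotted in Figures 6 and 7 are simply the share of live phishing URLs blocked at each six-hourly run; the spread between the best and worst runs is what separates a stable product from an erratic one. A minimal sketch, assuming a result log of (run, verdict) entries (illustrative data only):

```python
from collections import defaultdict

# Per-run catch rate, as plotted in Figures 6 and 7: at each six-hourly run,
# the share of live phishing URLs blocked at that moment. Illustrative log.
results = [
    (1, "blocked"), (1, "allowed"), (1, "blocked"),
    (2, "allowed"), (2, "allowed"), (2, "blocked"),
    (3, "blocked"), (3, "blocked"), (3, "blocked"),
]

per_run = defaultdict(list)
for run, verdict in results:
    per_run[run].append(verdict == "blocked")

rates = {run: sum(flags) / len(flags) for run, flags in per_run.items()}
print({run: f"{rate:.0%}" for run, rate in rates.items()})               # per-run rates
print(f"Spread: {min(rates.values()):.0%} to {max(rates.values()):.0%}")
```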
Conclusion

As endpoint protection products have been forced to add more and more functionality, it is not surprising that certain functions will be handled more adeptly by different types of products, or by using different tactics. NSS engineers have determined that the placement of the web browser generally makes it more suitable for phishing protection than endpoint protection products. Trend Micro and Kaspersky have effectively implemented phishing protection by leveraging reputation systems. However, because this protection does not appear to add significant value over the protections offered by the web browsers, its absolute impact on phishing protection may be significantly diminished.

Consumers who insist upon using outdated browsers may find added value in the phishing protection offered by some EPP products. In general, though, phishing protection as a product selection criterion should be relegated to the status of a "tie-breaker." Where users are evaluating multiple products, other features, such as protection against exploits and socially engineered malware, should be given a higher priority as selection criteria.

Vendors of products that did not score well in phishing protection should consider investing in technologies that bolster other protective capabilities. Because of the added value of reputation systems in blocking socially engineered malware, products utilizing this technology may naturally improve their phishing protection rates as a by-product of improved socially engineered malware blocking capabilities; phishing is, after all, a purely social engineering attack.
Appendix A: Test Environment

NSS has created a complex test environment and methodology to assess the protective capabilities of endpoint products under the most real-world conditions possible, while also maintaining control and verification of the procedures. For this endpoint product security test, NSS created a unique "Live Testing™" harness in order to duplicate user experiences under real-world conditions. 376,247 individual tests (URL lookups) were performed over a period of 10 days (37 discrete test runs).
Client Host Description

All tested software was installed on identical virtual machines with the following specifications:

• Microsoft Windows 7
• 4GB RAM
• 60GB HD
EPP product systems were tested prior to and during the test to ensure proper functioning. EPP products were given full access to the Internet so they could visit the actual live sites.
The Tested Products

The following products were tested, listed alphabetically:

1. AVAST Pro Antivirus 7 7.0.1466
2. AVG Internet Security 2012 2012.0.2197
3. Avira Internet Security 2012 12.0.0.1127
4. F-Secure Agent 1.57 Build 191, CUIF 10.01 build 32329, DAAS2 1.10 build 299
5. Kaspersky Internet Security 2012 12.0.0.374
6. McAfee Internet Security 11
7. Microsoft Security Essentials 4.0.1526.0
8. Norman Security Suite 9.00
9. Norton Internet Security 19.8.0.14
10. Panda Internet Security 2012 17.01.00
11. Total Defense Internet Security Suite 8.0.0.87
12. Trend Micro Titanium Maximum Security 6.0.1215

All products were downloaded from vendor websites and installed using the default options. Once testing began, the product version was frozen in order to preserve the integrity of the test. Given the nature of endpoint protection platforms, virus signature and definition updates as well as HIPS updates were enabled with whatever frequency was set by the manufacturer. This test relied upon Internet access for the reputation systems and for access to live content.
Network Description

The EPP products were tested for their ability to protect the client in "connected" use cases. NSS tests consider and analyze the effectiveness of EPP protection in NSS' real-world Live Testing harness. Each host system has one network interface card (NIC) and is connected to the network via a 1Gb switch port. For the purposes of this test, NSS utilized 96 desktop systems, each running an EPP product: eight (8) systems per product across twelve (12) different products. Results were recorded in a MySQL database.
Appendix B: General Test Procedures

The purpose of the test is to determine how well the tested EPP products protect users from phishing threats on the Internet today. A key aspect is timing. Given the rapid rate and aggressiveness with which criminals propagate and manipulate phishing URLs, a key objective is to ensure that the "freshest" sites possible are included in the test. NSS has developed a unique, proprietary "Live Testing" harness and methodology. On an ongoing basis, NSS collects web-based threats from a variety of sources, including partners and its own servers. Potential threats are vetted algorithmically before being inserted into the test queue; threats are inserted and vetted continually, 24x7. Note: unique to this procedure is that NSS validates the samples both before and after the test. Actual testing of the threats proceeds every six hours and starts with validation of each site's existence and conformance to the test definition. All tests are executed in a highly controlled manner, and results are meticulously recorded and archived at each interval of the test.
Test Duration

This EPP product test is performed continuously (24x7) for 10 days. Throughout the duration of the test, new URLs are added as they are discovered.
Test Frequency

Over the course of the test, each URL is run through the test harness every six hours. Regardless of success or failure, the "victim machine" in the test harness will attempt to browse to the phishing site for the duration of the test.

The test workflow proceeds as follows (a simplified sketch of this cycle appears after the list):

1. Collect new suspicious malicious sites from sources.
2. Pre-filter, validate, prune, and archive sites.
3. Distribute sites to test clients.
4. Test clients visit sites and record block/allow results.
5. Results are collected and archived.
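The outline below summarizes that cycle in Python. It is a simplified sketch under stated assumptions: the collection, liveness, client, and recording callables are hypothetical stand-ins, not NSS's actual harness code.

```python
import time
from typing import Callable

# Simplified outline of the six-hourly test cycle described above.
# The callables passed in are hypothetical stand-ins for NSS's harness.
RUN_INTERVAL_SECONDS = 6 * 3600
TOTAL_RUNS = 37  # every six hours for ten days

def run_test_cycle(collect_new_samples: Callable[[], list],
                   is_live: Callable[[str], bool],
                   clients: list,
                   record: Callable[[int, str, str, str], None]) -> None:
    catalog: list = []
    for run in range(1, TOTAL_RUNS + 1):
        catalog.extend(collect_new_samples())           # new URLs join mid-test
        live_urls = [u for u in catalog if is_live(u)]  # prune dead sites this run
        for client in clients:                          # one client set per product
            for url in live_urls:
                verdict = client.visit(url)             # "blocked" or "allowed"
                record(run, client.product, url, verdict)
        time.sleep(RUN_INTERVAL_SECONDS)
```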
Sample Sets for Phishing URLs

Freshness of phishing sites is a key attribute of this type of test. In order to utilize the freshest, most representative URLs, NSS receives a broad range of samples from a number of different sources.
Sources

NSS operates its own network of spam traps and honeypots. These high-volume email accounts yield thousands of unique emails and URLs per day. NSS also maintains a growing archive of phishing and malicious URLs and other malware; this archive contains multiple gigabytes of confirmed samples. Although only phishing URLs are used in this test, some phishing URLs may also contain malware and exploits. In addition, NSS maintains relationships with other independent security researchers, networks, and security companies that provide access to URLs and malicious content.

Sample sets contain phishing URLs distributed via spam, social networks, and other websites. Every effort is made to consider submissions that reflect a real-world distribution of phishing, categorically, geographically, and by platform. In addition, NSS maintains a collection of "clean URLs" that includes such sites as Yahoo, Amazon, Microsoft, Google, NSS, major banks, etc. Periodically, clean URLs are run through the system to verify that the tested products are not over-blocking.
Catalog URLs

New sites are added to the URL consideration set as soon as possible following initial discovery, and the date and time each sample is introduced is noted. Most sources are inserted automatically and immediately, while some methods require manual handling and are processed in under 30 minutes. All items in the consideration set are cataloged with a unique NSS ID, regardless of their validity. This enables NSS engineers to track the effectiveness of sample sources.
Confirm Sample Presence of URLs

Time is of the essence, since the objective is to test effectiveness against the "freshest" possible phishing sites. Given the nature of the feeds and the rate of change, it is not possible to validate each site in depth before the test, since the site could quickly disappear. However, each test item is given an initial review to verify that it meets the basic criteria and is accessible on the live Internet. In order to be included in the execution set, URLs must be live during the test iteration. At the beginning of each test iteration, the availability of the URL is confirmed by ensuring that the site can be reached and is active (e.g., a non-404 web page is returned). This validation occurs within minutes of receiving the samples. Note: these classifications are further validated after the test, and URLs are reclassified and/or removed accordingly.
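A minimal sketch of such a liveness check, assuming the widely used Python `requests` library; the actual NSS validation is more involved and also screens content and classification.

```python
import requests

def url_is_live(url: str, timeout: float = 10.0) -> bool:
    """Rough liveness check: the site must respond and not return a 404.

    Simplified sketch of the pre-test validation described above, not
    NSS's actual validation code.
    """
    try:
        response = requests.get(url, timeout=timeout, allow_redirects=True)
    except requests.RequestException:
        return False                       # unreachable: DNS failure, timeout, etc.
    return response.status_code != 404     # any non-404 page counts as "live"
```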
Archive Active URL Content

The active URL content is downloaded and saved to an archive server with a unique NSS ID number. This enables NSS to preserve the URL content for control and validation purposes.
Visit Each URL

A customized client automation utility requests each of the URLs deemed "live" through each of the EPP products in the test. The test harness records whether or not the phishing site is successfully accessed, and whether the attempt to visit the site triggers a warning from the EPP product's phishing protection.
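Each visit therefore yields one record per product, URL, and run. A sketch of what such a record might look like is shown below; the field names are illustrative assumptions, not NSS's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

# One record per product/URL/run, capturing the verdicts described above.
# Field names are illustrative assumptions, not NSS's actual schema.
@dataclass
class VisitResult:
    run: int             # 1..37, the six-hourly test iteration
    product: str         # tested EPP product name
    url_id: str          # the unique NSS ID assigned when the URL was cataloged
    verdict: str         # "blocked_and_warned" or "allowed"
    visited_at: datetime

example = VisitResult(run=12, product="ExampleAV", url_id="NSS-0001",
                      verdict="blocked_and_warned",
                      visited_at=datetime(2012, 10, 2, 6, 0))
```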
Scoring & Recording the Results

The resulting response is recorded as either "allowed" or "blocked and warned."

• Success: NSS defines "success" as an EPP product successfully preventing access to a phishing URL.
• Failure: NSS defines "failure" as an EPP product failing to block access and issue a warning.
Pruning

Throughout the test, lab engineers review and prune out non-conforming URLs and content from the test execution set. For example, a URL that was classified initially as phishing and that has been replaced by the web host with a generic splash page will be removed from the test. If a URL sample becomes unavailable during the course of the test, the sample is removed from the test collection for that iteration. NSS continually verifies each sample's presence (availability to access) and adds/removes each sample from the test set accordingly. Should a phishing sample be unavailable for a test iteration and then become available again for a subsequent iteration, it will be added back into the test collection. Unavailable samples are not included in calculations of success or failure by an EPP product.
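In scoring terms, that pruning rule means visits to samples that were unavailable during an iteration are excluded from the denominator. A minimal sketch, assuming a per-visit availability flag (illustrative data only):

```python
# Per-iteration scoring that excludes unavailable samples, as described above.
# Each entry is (available, blocked) for one URL in one test iteration;
# the data is illustrative.
iteration_results = [
    (True, True),    # live and blocked
    (True, False),   # live and missed
    (False, False),  # site was down this iteration: not counted either way
    (True, True),    # live and blocked
]

scored = [blocked for available, blocked in iteration_results if available]
block_rate = sum(scored) / len(scored)
print(f"Block rate for this iteration: {block_rate:.0%}")  # 67%
```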
Post-Test Validation

Post-test validation enables NSS to reclassify and even remove samples that were either not malicious or not available before the test started. NSS performs both automated and manual validation of suspected phishing sites. At the end of this process, 5,732 URLs that were live and valid at the time of testing remained, and these form the basis of this report. The targeted brands are listed below:

ABN-AMRO, ABSA, Alibaba, Alipay, Allegro, AlliedDirect, Amazon, AmBankGroup, ANZ, AOL, ARABBank, ASB, AT&T, BancoDEGuayaquil, BancoFalabella, BancoG&Tcontinental, BancoItau, BancoItu, BancoPopolare, BancoPopulare, BankofBrazil, BankvanDePost, BanReservas, Barclays, Battle.net, BBVA, BMO, BOA, Bradesco, Cadastro, Caixa, CapitalOne, CapitecBank, CartaSi, CenturyLink, Chase, CIBC, Cielo, Citi, CommerzBank, Co-operativeBank, Craigslist, DanskeBank, DBA.DK, DHL, DirectGov, Ebay, Email, Facebook, FirstNiagara, FNB, GlobalEmail, Gmail, Halifax, HongLeong, HotMail, HSBC, ICBC, ICSVisa, ING, IntelligentFinance, InterBank, IRS, KiwiBank, LibertyBank, LibertyReserve, Lloyds, mail, MasterCard, MBNA, MicrosoftAccount, NAB, Nationwide, NatWest, Netflix, Nordea, Outlook, Paipai, Paypal, PNC, Postale, Posteitaliane, Postepay, RBC, RBS, RCBC, Regions, REMAX, RuneScape, Santander, ScotiaBank, SecuredAdiminContactFeedBack, SFR, Sicredi, Skrill, Skype, St.George, STEAM, SuncorpBank, SunTrust, TAM, Taobao, TDcanadaTrust, Telstra, Tesco, UBA, USAA, Visa, Vodafone, Webmail, Wellsfargo, WesternUnion, Westpac, WHITNEYBank, WindowsSecurity, Yahoo, YorkshireBank
Contact Information

NSS Labs, Inc.
6207 Bee Caves Road, Suite 350
Austin, TX 78746 USA
+1 (512) 961-5300
[email protected]
www.nsslabs.com

This and other related documents are available at www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS at +1 (512) 961-5300 or [email protected].

© 2013 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.
Please note that access to or use of this report is conditioned on the following:
1. The information in this report is subject to change by NSS without notice.
The information in this report is believed by NSS to be accurate and reliable at the time of publication, but is not guaranteed.
2. All use of and reliance on this report are at the reader's sole risk. NSS is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED ARE GIVEN BY NSS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS. IN NO EVENT SHALL NSS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.