Small Business Endpoint Protection - SE Labs


Small Business Endpoint Protection
January - March 2016

SE Labs tested a range of endpoint security products from a range of well-known vendors in an effort to judge which were the most effective. Each product was exposed to the same threats, which were a mixture of targeted attacks using well-established techniques and public web-based threats that were found to be live on the internet at the time of the test. The results indicate how effective the products were at detecting and/or protecting against those threats in real time.

CONTENTS
Introduction 04
Executive Summary 05
1. Total Accuracy Ratings 06
2. Protection Ratings 08
3. Protection Scores 10
4. Protection Details 11
5. Legitimate Software Ratings 12
6. Conclusions 16
Appendix A: Terms used 17
Appendix B: FAQs 18
Appendix C: Product versions 19
Appendix D: Attack types 19

Document version 1.0. Written 4th April 2016.

SE Labs
SIMON EDWARDS, Director
Website www.SELabs.uk
Twitter @SELabsUK
Email info@SELabs.uk
Facebook www.facebook.com/selabsuk
Phone 0203 875 5000
Post ONE Croydon, London, CR0 0XT

INTRODUCTION

Endpoint products are considered by almost every security software vendor to be an essential level of protection in a business network. Headlines that proclaim anti-virus to be dead are usually making a too-subtle point about signature-reliant technologies rather than writing off a whole segment of the IT security market. All the products here combine signature-based protection with other, more advanced, technologies.

Ideally an endpoint product will require no management, protect against every threat that it encounters and allow access to all non-malicious applications and websites that match the organisation's policy. That's a pretty tall order and one that is unlikely to exist, despite various claims from newly arrived companies that offer alternatives to 'anti-virus'.

This test shows the results of three months of research, during which time the SE Labs team located live web-based threats that internet users in the real world were encountering at the time of testing. Crucially, we tested straight away, as soon as each threat was verified, so we could determine how well the popular anti-malware endpoints in the lab would perform against current, prevalent malware threats.

There is much talk of targeted attacks in the press, and strong claims by some security vendors that anti-malware technology is useless against these types of threats. We decided to test this claim and included a range of attacks in this test that are close, if not identical, to how an attacker could attempt to compromise an endpoint.

SE Labs uses current threat intelligence to make our tests as realistic as possible. To learn more about how we test, how we define 'threat intelligence' and how we use it to improve our tests, please visit our website and follow us on Twitter.

EXECUTIVE SUMMARY

Product names
It is good practice to stay up to date with the latest version of your chosen endpoint security product. We made best efforts to ensure that each product tested was the very latest version, running with the most recent updates, to give the best possible outcome. For specific build numbers see Appendix C: Product versions on page 19.

Products tested
Product | Protection Accuracy | Legitimate Accuracy | Total Accuracy
Symantec Endpoint Security Small Business Edition | 99% | 99% | 99%
Kaspersky Small Office Security | 89% | 100% | 96%
Sophos Endpoint Protection | 75% | 100% | 92%
Microsoft System Center Endpoint Protection | 71% | 100% | 90%
Trend Micro Worry Free Security Services | 61% | 95% | 83%

Products highlighted in green were the most accurate, scoring 85 per cent or more for Total Accuracy. Those in yellow scored less than 85 but 75 or more. Products shown in red scored less than 75 per cent. For exact percentages see 1. Total Accuracy Ratings on page 6.

The endpoints were effective at handling general threats from cyber criminals...
All the products were capable of handling public web-based threats, such as those used by criminals to attack Windows PCs and install ransomware automatically, without having to trick a user into clicking an install button.

...but targeted attacks posed more of a challenge
While two of the products were also very competent at blocking more targeted, exploit-based attacks, the other three were less effective. One product, from Trend Micro, failed to stop targeted attacks more often than it succeeded.
False positives were not an issue for most products
All endpoint solutions were good at correctly classifying legitimate applications and websites. Three of the five products made no mistakes at all.

Which products were the most effective?
The Symantec and Kaspersky Lab products achieved the best results due to a combination of their ability to block malicious URLs, handle exploits and correctly classify legitimate applications and websites.

Simon Edwards, SE Labs, 4th April 2016

1. TOTAL ACCURACY RATINGS

Judging the effectiveness of an endpoint security product is a subtle art, and many factors are at play when assessing how well it performs. To make things easier we've combined all the different results from this report into one easy-to-understand graph.

The graph below takes into account not only each product's ability to detect and protect against threats, but also its handling of non-malicious objects such as web addresses (URLs) and applications.

Not all protections, or detections for that matter, are equal. A product might completely block a URL, which prevents the threat completely before it can even start its intended series of malicious events. Alternatively, the product might allow a web-based exploit to execute but prevent it from downloading any further code to the target. In another case malware might run on the target for a short while before its behaviour is detected and its code is deleted or moved to a safe 'quarantine' area for future analysis. We take these outcomes into account when attributing points that form final ratings.

For example, a product that completely blocks a threat is rated more highly than one which allows a threat to run for a while before eventually evicting it. Products that allow all malware infections, or that block popular legitimate applications, are penalised heavily.

Categorising how a product handles legitimate objects is complex, and you can find out how we do it in 5. Legitimate Software Ratings on page 12.

[Chart: Total Accuracy Ratings. Total Accuracy Ratings combine protection and false positives.]

Awards
The following products win SE Labs awards:
- Symantec Endpoint Security Small Business Edition
- Kaspersky Small Office Security
- Sophos Endpoint Protection
- Microsoft System Center Endpoint Protection
- Trend Micro Worry Free Security Services

TOTAL ACCURACY RATINGS
Product | Total Accuracy Rating | Total Accuracy (%) | Award
Symantec Endpoint Security Small Business Edition | 1202 | 99% | AAA
Kaspersky Small Office Security | 1169 | 96% | AAA
Sophos Endpoint Protection | 1114 | 92% | AA
Microsoft System Center Endpoint Protection | 1096 | 90% | AA
Trend Micro Worry Free Security Services | 1013 | 83% | B
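The report does not print a formula for the combined figure, but the published numbers are consistent with the Total Accuracy Rating simply being the sum of each product's Protection Rating (section 2) and Legitimate Software Rating (section 5), measured against an assumed best possible 400 + 814 = 1,214. A minimal Python sketch under that assumption, using the figures from this report's tables:

# Assumption: Total Accuracy Rating = Protection Rating + Legitimate Software Rating,
# with 400 and 814 taken to be the best possible protection and legitimate ratings
# in this test. The per-product figures below come from sections 2 and 5.
protection = {
    "Symantec Endpoint Security Small Business Edition": 394,
    "Kaspersky Small Office Security": 355,
    "Sophos Endpoint Protection": 300,
    "Microsoft System Center Endpoint Protection": 282,
    "Trend Micro Worry Free Security Services": 243,
}
legitimate = {
    "Symantec Endpoint Security Small Business Edition": 808,
    "Kaspersky Small Office Security": 814,
    "Sophos Endpoint Protection": 814,
    "Microsoft System Center Endpoint Protection": 814,
    "Trend Micro Worry Free Security Services": 770,
}
MAX_TOTAL = 400 + 814  # assumed best possible combined rating (1,214)
for product, p_rating in protection.items():
    total = p_rating + legitimate[product]
    print(f"{product}: {total} ({total / MAX_TOTAL:.0%})")

Run as written, this reproduces the ratings and rounded percentages in the table above, which is why we describe the sum as a plausible reading rather than a documented one.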

2. PROTECTION RATINGS

The results below indicate how effectively the products dealt with threats. Points are earned for detecting the threat and for either blocking or neutralising it.

Detected (+1)
If the product detected the threat with any degree of useful information, we award it one point.

Blocked (+2)
Threats that are disallowed from even starting their malicious activities are blocked. Blocking products score two points.

Neutralised (+1)
Products that kill all running malicious processes 'neutralise' the threat and win one point.

Complete remediation (+1)
If, in addition to neutralising a threat, the product removes all significant traces of the attack, it gains an additional one point.

Compromised (-5)
If the threat compromised the system, the product loses five points. This loss may be reduced to four points if it manages to detect the threat (see Detected, above), as this at least alerts the user, who may now take steps to secure the system.

Rating calculations
We calculate the protection ratings using the following formula:

Protection rating = (2 x number of Blocked) + (1 x number of Neutralised) + (1 x number of Complete remediation) + (-5 x number of Compromised)

The 'Complete remediation' number relates to cases of neutralisation in which all significant traces of the attack were removed from the target. Such traces should not exist if the threat was 'Blocked', and so Blocked results imply Complete remediation.

These ratings are simple and based on our opinion of how important these different outcomes are. You may have a different view on how seriously you treat a 'Compromise' or a 'Neutralisation without complete remediation'. If you want to create your own rating system, you can use the raw data from 4. Protection Details on page 11 to roll your own set of personalised ratings.

[Chart: Protection Ratings. Protection Ratings are weighted to show that how products handle threats can be subtler than just "win" or "lose".]

PROTECTION RATINGS
Product | Protection Rating | Protection Rating (%)
Symantec Endpoint Security Small Business Edition | 394 | 99%
Kaspersky Small Office Security | 355 | 89%
Sophos Endpoint Protection | 300 | 75%
Microsoft System Center Endpoint Protection | 282 | 71%
Trend Micro Worry Free Security Services | 243 | 61%
Average: 79%
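To make the arithmetic concrete, here is a minimal Python sketch of the formula as printed above. The outcome counts are hypothetical, chosen only to illustrate the calculation, and 'complete remediation' here counts neutralised threats whose traces were fully removed.

def protection_rating(blocked, neutralised, complete_remediation, compromised):
    # Section 2 formula: (2 x Blocked) + (1 x Neutralised)
    # + (1 x Complete remediation) + (-5 x Compromised).
    return 2 * blocked + neutralised + complete_remediation - 5 * compromised

# Hypothetical run of 100 threats: 90 blocked, 5 neutralised
# (3 of them with all traces removed) and 5 compromised.
print(protection_rating(blocked=90, neutralised=5, complete_remediation=3, compromised=5))  # 163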

3. PROTECTION SCORES

This graph shows the overall level of protection, making no distinction between neutralised and blocked incidents. For each product we add Blocked and Neutralised cases together to make one simple tally.

[Chart: Protection Scores. Protection Scores are a simple count of how many times a product protected the system.]

PROTECTION SCORES
Product | Protection Score
Symantec Endpoint Security Small Business Edition | 100
Kaspersky Small Office Security | 95
Sophos Endpoint Protection | 89
Microsoft System Center Endpoint Protection | 89
Trend Micro Worry Free Security Services | 82

4. PROTECTION DETAILS

These results break down how each product handled threats in some detail. You can see how many detected a threat and the levels of protection provided.

Products sometimes detect more threats than they protect against. This can happen when they recognise an element of the threat but are not equipped to stop it. Products can also provide protection even if they don't detect certain threats. Some threats abort on detecting specific endpoint protection software.

[Chart: Protection Details (Defended, Neutralised, Compromised). This data shows in some detail how each product handled the threats used.]

PROTECTION DETAILS
Product | Detected | Blocked | Neutralised | Compromised | Protected
Symantec Endpoint Security Small Business Edition | 100 | 96 | 4 | 0 | 100
Kaspersky Small Office Security | 95 | 95 | 0 | 5 | 95
Sophos Endpoint Protection | 90 | 88 | 1 | 11 | 89
Microsoft System Center Endpoint Protection | 89 | 85 | 2 | 13 | 87
Trend Micro Worry Free Security Services | 81 | 82 | 0 | 17 | 82
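The Protection Score is simply the Blocked and Neutralised counts added together; a minimal sketch, using the Symantec row of the details table above as an example:

def protection_score(blocked, neutralised):
    # A threat counts as protected whether it was blocked outright or neutralised later.
    return blocked + neutralised

print(protection_score(blocked=96, neutralised=4))  # 100, matching the Symantec row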

5. LEGITIMATE SOFTWARE RATINGS

These ratings indicate how accurately the products classify legitimate applications and URLs, while also taking into account the interactions that each product has with the user. Ideally a product will either not classify a legitimate object or will classify it as safe. In neither case should it bother the user.

We also take into account the prevalence (popularity) of the applications and websites used in this part of the test, applying stricter penalties for when products misclassify very popular software and sites.

To understand how we calculate these ratings, see 5.3 Accuracy ratings on page 15.

[Chart: Legitimate Software Ratings. Legitimate software ratings can indicate how well a vendor has tuned its detection engine.]

LEGITIMATE SOFTWARE RATINGS
Product | Legitimate Accuracy Rating | Legitimate Accuracy (%)
Kaspersky Small Office Security | 814 | 100%
Microsoft System Center Endpoint Protection | 814 | 100%
Sophos Endpoint Protection | 814 | 100%
Symantec Endpoint Security Small Business Edition | 808 | 99%
Trend Micro Worry Free Security Services | 770 | 95%

5.1 Interaction ratings

It's crucial that anti-malware endpoint products not only stop, or at least detect, threats but that they allow legitimate applications to install and run without misclassifying them as malware. Such an error is known as a 'false positive' (FP).

In reality, genuine false positives with applications are quite rare in testing. In our experience it is unusual for a completely legitimate application to be classified as being "malware". More often it will be classified as "unknown", "suspicious" or "unwanted" (or terms that mean much the same thing).

We use a subtle system of rating an endpoint's approach to legitimate objects, which takes into account how it classifies the application and how it presents that information to the user. Sometimes the endpoint software will pass the buck and demand that the user decides whether or not the application is safe. In such cases the product may make a recommendation to allow or block, but leave the ultimate decision to the user. In other cases, the product will make no recommendation, which is possibly even less helpful.

If a product allows an application to install and run with no user interaction, or with simply a brief notification that the application is likely to be safe, it has achieved an optimum result. Anything else is a Non-Optimal Classification/Action (NOCA). We think that measuring NOCAs is more useful than counting the rarer FPs.

INTERACTION RATINGS
Classification | None (allowed) | Click to allow (default allow) | Click to allow/block (no recommendation) | Click to block (default block) | None (blocked)
A. Object is safe | 2 | 1.5 | 1 | - | -
B. Object is unknown | 2 | 1 | 0.5 | 0 | -0.5
C. Object is not classified | 2 | 0.5 | 0 | -0.5 | -1
D. Object is suspicious | 0.5 | 0 | -0.5 | -1 | -1.5
E. Object is unwanted | 0 | -0.5 | -1 | -1.5 | -2
F. Object is malicious | - | - | - | -2 | -2

Products that do not bother users and classify most applications correctly earn more points than those that ask questions and condemn legitimate applications.
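As a rough illustration of how the matrix above could be applied, here is a minimal Python sketch. The values simply transcribe the table; the lookup function and its labels are illustrative, not part of the SE Labs methodology.

# Each row lists points for the five interaction columns, in the table's order:
# None (allowed), Click to allow, Click to allow/block, Click to block, None (blocked).
# 'None' marks cells the table leaves blank.
INTERACTION_TABLE = {
    "safe":           [2,    1.5,  1,    None, None],
    "unknown":        [2,    1,    0.5,  0,    -0.5],
    "not classified": [2,    0.5,  0,    -0.5, -1],
    "suspicious":     [0.5,  0,    -0.5, -1,   -1.5],
    "unwanted":       [0,    -0.5, -1,   -1.5, -2],
    "malicious":      [None, None, None, -2,   -2],
}
COLUMNS = ["none (allowed)", "click to allow", "click to allow/block",
           "click to block", "none (blocked)"]

def interaction_rating(classification, interaction):
    # Look up the points earned (or lost) for one legitimate-object event.
    value = INTERACTION_TABLE[classification][COLUMNS.index(interaction)]
    if value is None:
        raise ValueError("combination not defined in the table")
    return value

# Example: flagging a legitimate app as suspicious and recommending a block costs one point.
print(interaction_rating("suspicious", "click to block"))  # -1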

5.2 Prevalence ratings

There is a significant difference between an endpoint product blocking a popular application like the latest version of Microsoft Word and condemning a rare Iranian dating toolbar for Internet Explorer 6. One is very popular all over the world and its detection as malware (or something less serious but still suspicious) is a big deal. Conversely, the outdated toolbar won't have had a comparably large user base even when it was new. Detecting this application as malware may be wrong, but it is less impactful in the overall scheme of things.

With this in mind, we collected applications of varying popularity and sorted them into five separate categories, as follows:

1. Very high impact
2. High impact
3. Medium impact
4. Low impact
5. Very low impact

Applications were downloaded and installed during the test, but third-party download sites were avoided and original developers' URLs were used where possible. Download sites will sometimes bundle additional components into applications' install files, which may correctly cause anti-malware products to flag adware. We remove adware from the test set because it is often unclear how desirable this type of code is.

The prevalence for each application and URL is estimated using metrics such as third-party download sites and data from Alexa.com's global traffic ranking system.

Incorrectly handling any legitimate application will invoke penalties, but classifying Microsoft Word as being malware and blocking it without any way for the user to override this will bring far greater penalties than doing the same for an ancient niche toolbar. In order to calculate these relative penalties, we assigned each impact category a rating modifier, as shown in the table below.

LEGITIMATE SOFTWARE PREVALENCE RATING MODIFIERS
Impact category | Rating modifier
Very high impact | 5
High impact | 4
Medium impact | 3
Low impact | 2
Very low impact | 1

5.3 Accuracy ratings

We calculate legitimate software accuracy ratings by multiplying together the interaction and prevalence ratings for each download and installation:

Accuracy rating = Interaction rating x Prevalence rating

Endpoint products that were most accurate in handling legitimate objects achieved the highest ratings. If all objects were of the highest prevalence, the maximum possible rating would be 1,000 (100 incidents x (2 interaction rating x 5 prevalence rating)).

If a product allowed one legitimate, Medium impact application to install with zero interaction with the user, then its Accuracy rating for that incident would be calculated like this:

Accuracy rating = 2 x 3 = 6

This same calculation is made for each legitimate application/site in the test and the results are summed and used to populate the graph and table shown under 5. Legitimate Software Ratings on page 12.

5.4 Distribution of impact categories

In this test there was a range of applications with different levels of prevalence. The table below shows the frequency:

LEGITIMATE SOFTWARE CATEGORY FREQUENCY
Prevalence rating | Frequency
Very high impact | 51
High impact | 27
Medium impact | 10
Low impact | 7
Very low impact | 5
Grand total | 100
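Putting sections 5.1 to 5.3 together, here is a minimal Python sketch of how one product's legitimate software rating might be accumulated. The prevalence modifiers are the table's own; the list of incidents is hypothetical, and the interaction values refer back to the matrix in 5.1.

# Prevalence rating modifiers from section 5.2.
PREVALENCE_MODIFIER = {"very high": 5, "high": 4, "medium": 3, "low": 2, "very low": 1}

def accuracy_rating(interaction_points, impact):
    # Section 5.3: Accuracy rating = Interaction rating x Prevalence rating.
    return interaction_points * PREVALENCE_MODIFIER[impact]

# Hypothetical incidents: (interaction points earned per the 5.1 matrix, impact category).
incidents = [
    (2, "very high"),  # popular application allowed silently: 2 x 5 = 10
    (2, "medium"),     # the worked example above: 2 x 3 = 6
    (-1, "high"),      # high-impact app flagged as suspicious, block recommended: -1 x 4 = -4
]
print(sum(accuracy_rating(points, impact) for points, impact in incidents))  # 12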

6. CONCLUSIONS

Attacks in this test included infected websites available to the general public, including sites that automatically attack visitors and attempt to infect them without any social engineering or other interaction. Some sites relied on users being fooled into installing the malware. We also included targeted attacks, which were exploit-based attempts to gain remote control of the target systems.

Symantec Endpoint Security Small Business Edition was able to fend off the exploit-based targeted attacks fully, while also blocking most of the public web attacks, some of which were powered by criminals using exploit kits. It neutralised four attacks and handled legitimate applications and websites nearly without error.

Kaspersky Small Office Security pushed away all but one of the public web-based threats entirely but was compromised by four of our targeted attacks. It was particularly effective at stopping threats by blocking within the web browser, thus preventing the threat from starting its attack. This software was also entirely effective when handling legitimate objects.

Sophos Endpoint Protection was similar in effectiveness to the products above when exposed to web threats, but the targeted attacks were a significant challenge and it was compromised by 10 of the 25 deployed in the test. It was, however, perfect when handling legitimate applications and websites.

Microsoft System Center Endpoint Protection had similar problems with the targeted attacks, failing to prevent 10 compromises. However, it was strong when handling public web threats, and its accurate assessment of the legitimate applications and websites earned it a good total accuracy rating.

Trend Micro Worry Free Security Services was the worst when tackling the targeted attacks. We were able to compromise the target with 16 exploit-based attacks. However, it did well when faced with public web-based threats, missing only a couple. It wasn't perfect when legitimate applications were installed, though, blocking three without giving the user a chance to permit the installation.

There was no small business product from McAfee in this test because, at the time that testing started, the company had recently announced the end of life for its small business Security as a Service (SaaS) endpoint product. We plan to include its replacement in the next test.

The products from Symantec and Kaspersky Lab both win AAA awards for their strong overall performance. Those from Sophos and Microsoft achieved solid AA awards, while Trend Micro's product is awarded a B.

APPENDICES

APPENDIX A: TERMS USED

TERM | MEANING
Compromised | The attack succeeded, resulting in malware running unhindered on the target. In the case of a targeted attack, the attacker was able to take remote control of the system and carry out a variety of tasks without hindrance.
Blocked | The attack was prevented from making any changes to the target.
False positive | When a security product misclassifies a legitimate application or website as being malicious, it generates a 'false positive'.
Neutralised | The exploit or malware payload ran on the target but was subsequently removed.
Complete remediation | If a security product removes all significant traces of an attack it has achieved complete remediation.
Target | The test system that is protected by a security product.
Threat | A program or sequence of interactions with the target that is designed to take some level of unauthorised control of that target.
Update | Security vendors provide information to their products in an effort to keep abreast of the latest threats. These updates may be downloaded in bulk as one or more files, or requested individually and live over the internet.

APPENDIX B: FAQs

- A full methodology for this test is available from our website.
- The products chosen for this test were selected by SE Labs.
- The test was not sponsored. This means that no security vendor has control over the report's content or its publication.
- The test was conducted between 21st January 2016 and 18th March 2016.
- All products had full internet access and were confirmed to have access to any required or recommended back-end systems. This was confirmed, where possible, using the Anti-Malware Testing Standards Organization (AMTSO) Cloud Lookup Features Setting Check.
- Malicious URLs and legitimate applications and URLs were independently located and verified by SE Labs.
- Targeted attacks were selected and verified by SE Labs. They were created and managed by Metasploit Framework Edition using default settings. The choice of exploits was advised by public information about ongoing attacks. One notable source was the 2015 Data Breach Investigations Report from Verizon.
- Malicious and legitimate data was provided to partner organisations once the full test was complete.
- SE Labs conducted this endpoint security testing on physical PCs, not virtual machines.

Q: I am a security vendor. How can I include my product in your test?
A: Please contact us at info@SELabs.uk. We will be happy to arrange a phone call to discuss our methodology and the suitability of your product for inclusion.

Q: I am a security vendor. Does it cost money to have my product tested?
A: We do not charge directly for testing products in public tests. We do charge for private tests.

Q: What is a partner organisation? Can I become one to gain access to the threat data used in your tests?
A: Partner organisations support our tests by paying for access to test data after each test has completed but before publication. Partners can dispute results and use our award logos for marketing purposes. We do not share data on one partner with other partners. We do not currently partner with organisations that do not engage in our testing.

Q: So you don't share threat data with test participants before the test starts?
A: No, this would bias the test and make the results unfair and unrealistic.

Q: I am a security vendor and you tested my product without permission. May I access the threat data to verify that your results are accurate?
A: We are willing to share small subsets of data with non-partner participants at our discretion. A small administration fee is applicable.

APPENDIX C: PRODUCT VERSIONS

A product's update mechanism may upgrade the software to a new version automatically, so the version used at the start of the test may be different to that used at the end.

PRODUCT VERSIONS
Vendor | Product | Build
Kaspersky | Small Office Security | 15.0.2.361 (d)
Microsoft | System Center Endpoint Protection | 4.7.2114.0
Sophos | Endpoint Protection | 10.3
Symantec | Endpoint Security Small Business Edition | 12.1.6
Trend Micro | Worry Free Security Services | 5.8.1104/19.1.2558

APPENDIX D: ATTACK TYPES

The table below shows how each product protected against the different types of attacks used in the test.

ATTACK TYPES
Product | Targeted attack | Public web attack | Protected (total)
Symantec Endpoint Security Small Business Edition | 25 | 75 | 100
Kaspersky Small Office Security | 21 | 74 | 95
Sophos Endpoint Protection | 15 | 74 | 89
Microsoft System Center Endpoint Protection | 15 | 72 | 87
Trend Micro Worry Free Security Services | 9 | 73 | 82
