Monday, October 26, 2015

CompTIA Survey - 17% of people would plug a found USB stick into their laptop. Ouch or fantastic?

In a CompTIA survey, written about by Softpedia in “One of the Biggest Security Risks: Naive People Connecting Lost USBs to Their PCs”, an interesting statistic came up. As part of the study, 200 USB sticks were left in high-traffic locations in US cities. 20% (forty) were picked up, and 17% were connected to people’s laptops. According to the article, the USB sticks used in the experiment contained a text file with instructions asking the finder to send an email to a specific address or to click through a trackable URL.

The reporter found the 17% figure worrisome.  I’ll take a contrarian view.

At RSA San Francisco 2013, we conducted a security survey, gathering 300 responses. 78% of respondents said that they had at some point found a USB stick and plugged it into their laptop! 68% of those surveyed had been involved in a security breach, either at home or at their office.

While 17% is a frighteningly high number, that is a drop of 61 percentage points from what I found just two and a half years earlier!
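For the numerically inclined, a quick sketch (my own arithmetic, not from either survey) of how the 61 figure falls out of the two results, and how it differs from the relative decline:

```python
# Comparing the two survey results: the 61 figure above is a
# percentage-POINT drop, which is smaller-sounding than the
# relative (proportional) decline.
earlier = 78.0  # % who plugged in a found USB stick (RSA 2013 survey)
later = 17.0    # % who did so in the CompTIA experiment

point_drop = earlier - later                        # 61.0 percentage points
relative_drop = (earlier - later) / earlier * 100   # ~78.2% relative decline

print(point_drop)                # 61.0
print(round(relative_drop, 1))   # 78.2
```

Either way you slice it, far fewer people are plugging in strange USB sticks than in 2013.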

A found USB stick is the internet equivalent of coming across a “Wet Paint” sign: you just have to check it out yourself. We are our own worst enemies. More training is needed.

For an interesting read on the use of infected USB sticks for good, Google and read about Stuxnet, a 500-kilobyte computer worm that infected the software of at least 14 industrial sites in Iran, including a uranium-enrichment plant. 

Sunday, October 18, 2015

The Pareto Principle and the Pursuit of Perfect Internet Security – a Parable

Not so long ago, a bright security professional, and a firm believer in the Pareto Principle, was tasked with designing and implementing an impregnable internet security solution for his company. He did his research and arrived at what he thought was an accurate total cost of $4M. Just prior to striding into his manager’s office for approval, he had a quick discussion about the project with a recent new hire reporting to him.

“I’d be careful,” she advised. “At my last company, we found that each major phase cost 50% more than the previous phase. We had several discussions about ‘risk profiles’ and ‘perfect protection’ before getting buy-in on deliverables and budget on a less ambitious result.” 

The bright security professional thanked her and said, “I’m quite confident in my projections and will stake my job on this project. In fact, I will bring it in under budget.”

So, the bright security professional met with his somewhat parsimonious manager, and guaranteed the results. “In fact,” he said, “the first phase of the project will get us 80% there for only $800k."  The manager said, “Fine, but go over budget on this and your next position will have you saying, ‘Would you prefer a grande or a venti latte?’” and with that, the project was approved.

At the completion of the project, how much under budget was the confident security professional?

First, the Pareto Principle is named after the economist Vilfredo Pareto (1848-1923). From Investopedia: “The principle states that, for many phenomena, 20% of invested input is responsible for 80% of the results obtained. Put another way, 80% of consequences stem from 20% of the causes. Also referred to as the ‘80/20 rule’.”

The answer is – the individual was left to “pursue other opportunities” when, having exhausted the budget, he told his manager that he now felt 100% was unobtainable and that it would cost an additional $2.7M to get to 97.5% protection.

How did this happen?

Earlier, a factor (chosen by me) added by the wise new hire was that each phase of the project was going to cost 50% more than the previous phase.

Phase 1 - $800k spent (total $800K) to reach 80% of perfection

Phase 2 - $1.2M spent (total $2M) to reach 90% of perfection

Phase 3 - $1.8M spent (total $3.8M) to reach 95% of perfection

Phase 4 – Plug pulled on project. The estimate was $2.7M more (total $6.5M) to reach 97.5% of perfection, and you never reach 100%
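The phase table above follows two simple rules, and a short sketch (my own restatement of the parable’s numbers, using the 50%-per-phase factor and the halving coverage gap) shows why the budget explodes while the protection barely creeps forward:

```python
# The parable's arithmetic: each phase costs 50% more than the last,
# while each phase only halves the remaining gap to "perfection".
cost = 800_000   # Phase 1 cost from the parable
gap = 40.0       # % short of perfection before Phase 1 (so Phase 1 lands at 80%)
total = 0

for phase in range(1, 5):
    total += cost
    gap /= 2  # each phase closes only half of what remains
    print(f"Phase {phase}: ${cost:,} this phase, ${total:,} cumulative, "
          f"{100 - gap}% of perfection")
    cost = int(cost * 1.5)  # the next phase costs 50% more

# Phase 1: $800,000 this phase, $800,000 cumulative, 80.0% of perfection
# Phase 2: $1,200,000 this phase, $2,000,000 cumulative, 90.0% of perfection
# Phase 3: $1,800,000 this phase, $3,800,000 cumulative, 95.0% of perfection
# Phase 4: $2,700,000 this phase, $6,500,000 cumulative, 97.5% of perfection
```

Costs grow geometrically while the gap shrinks geometrically, so each additional percentage point of protection gets dramatically more expensive, and 100% is never reached.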

Some morals of this parable

·         100% is tough, if not impossible, to achieve

·         Know your risk profile and your company’s risk profile when working on security projects

·         Know how to make coffee drinks

Thursday, October 15, 2015

AV-Comparatives File Detection Test – September 2015

AV-Comparatives’ prolific team of writers and testers has released its File Detection Test – September 2015. Nine products received three stars. Avira and Bitdefender topped the 21 products in the test; their false positive rate was only 0.2%. Other companies receiving three stars, in alphabetical order, were BullGuard, Emsisoft, eScan, ESET, Kaspersky, Lavasoft, and Panda. You can download the report to see the actual order.

ESET, Microsoft, and Panda had zero false positives. The hall of shame award for this test goes to AVG Technologies, with a false positive rate 32 times larger than Avira’s and Bitdefender’s: 6.5% (139 false positives).

About the AV-Comparatives  File Detection Test

The awards for the File Detection Test were based on a combination of detection rates and false positives. The File Detection Test assesses the ability of antivirus programs to detect malicious files on a system. It can identify malware attacks from sources other than the Internet, and it can identify malicious files already present on the system.

“With more than 130,000 samples in the test, AV-Comparatives uses one of the largest sample collections worldwide to provide statistically valid results”, according to AV-Comparatives’ Andreas Clementi.

ABC Award for the File Detection Test

The ABC award (Avoids Being Compared) goes to Symantec. The File Detection Test is one of the core tests the organization performs, and companies cannot choose which of these core tests to be in. It's all or none. The ABC award is not part of AV-Comparatives’ test program!

The document can be downloaded at:    

The file detection rate of a product is only one aspect of a complete anti-virus product. AV-Comparatives also provides a whole-product dynamic “real-world” protection test, as well as other test reports that cover different aspects/features of the products. For those interested, you can easily do a deep dive into an individual company’s historical performance on tests or sign up for the newsletter. Check them out. Other documents are available for download from the AV-Comparatives website ( ).

Thursday, October 08, 2015

AV-Comparatives – Review of IT Security Suites for Small Business – September 2015

AV-Comparatives has released its Review of IT Security Suites for Small Business – September 2015. The review examines security suites suitable for a company running either the Foundation or the Essentials edition of Microsoft Windows Server 2012 R2. The Foundation edition is suitable for small companies with up to 15 users (from the Microsoft website), while the Essentials edition allows an additional ten users. The report considers products for a network of up to 25 client PCs, with one file server/domain controller.

AV-Comparatives’ review covered only the essential everyday tasks needed in all networks. However, some products have additional features and could be used for significantly bigger networks than those reviewed. Products in the Review of IT Security Suites are:
Bitdefender Endpoint GravityZone, ESET Remote Administrator, F-Secure Protection Service For Business, G Data Antivirus Business, Kaspersky Small Office Security, McAfee SaaS Endpoint Protection, Sophos Endpoint Security and Control Cloud, Symantec Endpoint Protection, and Trend Micro Worry-Free Business Security Services. Symantec! They’re here. They are not present in many of AV-Comparatives’ reviews (companies cannot selectively opt out of a subset of core reviews; it’s all or none).
The document itself runs around 90 pages. Each product is given a comprehensive overview. Major categories that AV-Comparatives looked at include:
Supported OS, Documentation, Management Console (cloud based, server based, and virtual appliance), respective endpoint protection programs for Windows and Mac OS clients, Windows Server protection software, and Summary

All of the products received the AV-Comparatives’ Approved Business Award.
The advantages of a document like this include the depth of comparison, the fact that the same features/functionality are examined for each product, and that the review was done by a known test organization. A company would not have the time (and, for a small business, the expertise) to go into this depth for nine products. Companies looking to replace their current product should find this report a valuable (no charge!) resource.

For those who like to compare products on a feature grid, suffice it to say that AV-Comparatives provides a sizeable one (multiple fingers and toes! Approximately 100 rows) as part of the document. This document is more than adequate for you to select one product for your environment or to draw up a short list for evaluation.
The document can be downloaded at:    
“The Death of Antivirus Software is Greatly Exaggerated”, as written in an article in CSO Online (and others). You still need protection from these threats, whether the protection is provided by software on the device or from the cloud.

AV-Comparatives has a fantastic library of test documents, and the site’s organization scores high on surveys. Check them out. Other documents are available for download from the AV-Comparatives website ( ).



Thursday, October 01, 2015

AV-Comparatives Malware Removal Test – September 2015

AV-Comparatives has released the results of its Malware Removal Test – September 2015. Products tested were a combination of free and paid solutions. Sixteen products were tested. Five received three stars, or the Advanced+ award. Kaspersky topped the list. Bitdefender was third, and the three “A”s (Avast, AVG Technologies, and Avira) rounded out the three-star recipients.

AV-Comparatives Malware Removal Test

The Malware Removal Test focused only on the malware removal/cleaning capabilities of the products. The report was written with home users in mind rather than administrators or advanced users, who may already have the knowledge and tools to remove malware from a system. To compare products on their protection and detection capabilities, you may want to download AV-Comparatives’ “Real-World Protection Test” and “File Detection Test”.

The ABC or “Avoids Being Compared” Award

More data and testing by an unbiased test group help consumers make an informed decision when selecting products to secure their devices. The number of likes on a product’s website doesn’t cut it when licensing a security product. Comparative testing also motivates companies to improve their products. It’s disappointing when companies decline to be tested.

For the AV-Comparatives Malware Removal Test, the ABC Award or “Avoids Being Compared” Award goes to Symantec, McAfee, and Trend Micro. All three of these companies have solutions with a sizeable share of the antivirus/internet security consumer marketplace. Perhaps they will step up for the next test. McAfee and Trend Micro usually participate. Symantec? Not so much.

The Malware Removal Test document is located at

All of AV-Comparatives’ tests can be found at