Wednesday, November 14, 2012

October 2012 Virus Bulletin VB100 Awards and RAP Averages Quadrant (Reactive and Proactive)

Virus Bulletin has released its October 2012 VB100 test results and published its RAP Averages Quadrant for the April through October timeframe.  Neither set of results was especially exciting.  In all likelihood, no companies will storm off in anger and withdraw from further testing.

The VB100 testing was done on Windows Server 2003.  Only 30 products were in the test.  One third of the products failed, though no “major” vendors were among them.  Emsisoft has failed three of the last four VB100 tests it has entered.  This can’t be fun for their marketing department.

There were three clusters of vendors who scored over 95% in reactive detection and 80% in proactive detection.  It’s not even worth eyeballing the chart to rank vendors within the clusters.  These vendors are (from first to third):

  1. Zeobit, Coranti (clear winners)
  2. Lavasoft, TrustPort, G Data 
  3. Fortinet, Avira Free, Avira Pro, Roboscan, BitDefender, BullGuard, Emsisoft, eScan

There was definitely some movement from the previous test.  Congratulations, in particular, to those in “1” and “2” above!

The Top 10 in the February through August test:

  1. Coranti (a clear first among the top 10)
  2. Huari (a clear second among the top 10)
  3. Tencent
  4. Lavasoft
  5. BitDefender
  6. G Data
  7. Avira Pro
  8. TrustPort
  9. Emsisoft
  10. Avira Free

Hall of Shame awards in the latest test go to those scoring below 70% in Reactive Detection and below 65% in Proactive Detection (from best to worst): Commtouch, Frisk, Iolo, Total Defense Business, and UnThreat.  No fun for these companies.

Tests like these provide useful information for evaluating the relative strengths of the products.  It obviously wins out over the wisdom of Facebook fans clicking “like”!  The RAP Averages Quadrant chart is available on the Virus Bulletin website.

Subscribers to Virus Bulletin's publications have access to more details on the results.

RAP Averages Quadrant

This test measures products' detection rates across four distinct sets of malware samples.  The first three test sets comprise malware first seen in each of the three weeks prior to product submission.  These measure how quickly product developers and labs react to the steady flood of new malware emerging every day across the world.  A fourth test set consists of malware samples first seen in the week after product submission.
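The quadrant described above can be reduced to simple arithmetic: the three pre-submission sets are averaged into a "reactive" score, while the post-submission set stands alone as the "proactive" score, and the two scores place a product on the chart.  Here is a minimal sketch of that calculation; the detection rates below are hypothetical, invented purely for illustration, and the axis interpretation is an assumption based on the description above rather than Virus Bulletin's exact methodology.

```python
# Hypothetical detection rates (percent) for one product across the
# four RAP test sets. The actual numbers here are made up.
reactive_weeks = [92.1, 90.5, 88.7]   # malware first seen in the 3 weeks before submission
proactive_week = 81.3                 # malware first seen in the week after submission

# Reactive average: mean of the three pre-submission sets
# (one axis of the quadrant chart).
reactive_avg = sum(reactive_weeks) / len(reactive_weeks)

# Proactive score: the fourth set alone (the other axis).
print(f"Reactive average: {reactive_avg:.1f}%")
print(f"Proactive score:  {proactive_week:.1f}%")
```

A product like the hypothetical one above, with a reactive average over 90% but a proactive score near 80%, would sit toward the upper-middle of the quadrant: quick to react to known malware, weaker against samples its lab has never seen.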

VB100 Test Methodology

The purpose of the VB100 comparative is to provide insight into the relative performance of the solutions taking part in the tests, covering as wide a range of areas as possible within the limitations of time and available resources.  More details are available on the Virus Bulletin website.

UK-based Virus Bulletin started in 1989.  They provide PC users with a regular source of intelligence about computer viruses: their prevention, detection, and removal, and how to recover programs and data following an attack.
