The alternative headline was going to be “Who Pimped My Internet Security Product Testing?”, but common sense won out.
In 2002, then New York State Attorney General Eliot Spitzer took Network Associates’ McAfee subsidiary to court over a “censorship clause” in some of the company’s EULAs (End User License Agreements). The clause stated that customers could not publish product reviews or benchmark test results without permission from the company.
New York State Supreme Court Justice Marilyn Shafer issued a ruling prohibiting Network Associates/McAfee from using end-user license agreements to ban product reviews or benchmark tests. http://news.cnet.com/2100-1023-981228.html
Eliot Spitzer became governor of New York State in 2007. He resigned in 2008 after his name surfaced in an investigation of a high-end escort service. It’d be a cheap shot to mention the v word here! Spitzer will be joining Pulitzer Prize-winning newspaper columnist Kathleen Parker on CNN this fall. Quite a trick on his part. From high-end call girls to cable.
If an antivirus and internet security vendor is willing to collect revenue from customers for its product, it should be willing to have the product tested, benchmarked, and/or reviewed without making the testing organization jump through a lot of hoops or “hinting” that something may happen if the testers vary from what they said on the form. If a vendor allows customers to download its product for free, the same holds true.
One security vendor’s website and its EULA for testing and benchmarking state in part: “You agree that the testing/benchmarking results will only be used as specified by you in this form and for no other purpose whatsoever. …reserves the right to use its sole discretion in denying your request as a whole or in part.” The EULA requires information about the method and purpose of testing, among other details.
It’s all related to that First Amendment thing on the west side of the pond: freedom of speech and freedom of the press. Vendors cannot expect to examine complete test plans in advance. Results might first be posted on a website, then appear in a print article, be written about in blogs, and so on. This doesn’t waive the requirement that testers follow best practices. Vendors should not consider it their right to review results in their entirety before publication and then back out if they don’t like the test results or the text of the article. Test organizations and reviewers should be willing to let vendors vet feature checklists and pricing if these are part of the article.
Vendors should be able to exert more influence over a test when a product is in beta. If a new version of the product will be released before the article is published, discussions need to take place. These date issues sometimes cannot be helped, due to release schedules and publication deadlines. It may be unfair to the vendor (and the customer) if, for example, a group review compares an older version of the product with competitors’ current releases. Vendors can’t use the phrase “we’re in beta” ad nauseam, though. When results are published online, publications/reviewers should make the effort to footnote the article should a new version be released.
Friday, June 25, 2010
Thursday, June 24, 2010
Virus Bulletin's Latest Reactive and Proactive (RAP) Test Results
Virus Bulletin has published their latest Reactive and Proactive (RAP) test results.
The RAP test measures products' detection rates across four distinct sets of malware samples. The first three test sets comprise malware first seen in each of the three weeks prior to product submission. These measure how quickly product developers and labs react to new malware emerging every day across the world.
A fourth test set consists of malware samples first seen in the week after product submission. This test set gauges products' ability to detect new and unknown samples proactively, using heuristic and generic techniques.
The relative performance of vendors can best be viewed by looking at the RAP Averages Quadrant (December 2009 through June 2010) chart at http://www.virusbtn.com/vb100/rap-index.xml.
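To see how a single product’s dot on that quadrant chart comes about, here is a minimal Python sketch with made-up numbers; it assumes the reactive score is simply the average of the three pre-submission weekly detection rates and the proactive score is the week-after-submission rate.

def rap_scores(week_minus_3, week_minus_2, week_minus_1, week_plus_1):
    """Return (reactive, proactive) percentages for plotting on the RAP quadrant.

    Assumption: the reactive (y-axis) score is the mean of the three
    pre-submission weekly detection rates, and the proactive (x-axis)
    score is the detection rate on the week-after-submission set.
    """
    reactive = (week_minus_3 + week_minus_2 + week_minus_1) / 3.0
    proactive = week_plus_1
    return reactive, proactive

# Hypothetical product: 95%, 93%, 91% on the three reactive sets, 72% proactive.
reactive, proactive = rap_scores(95, 93, 91, 72)
print(f"plots at x={proactive}%, y={reactive:.1f}%")  # x=72%, y=93.0%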
Products/companies with reactive detection greater than 90%, going from lower to higher on the y-axis: AVG Technologies, Avira Free, Avira Pro, Kaspersky, ESET, Check Point, Coranti, and GDATA. The third company of los free amigos, AVAST, just missed 90%, it appears.
Products/companies scoring over 70% on the proactive portion, going from lower to higher on the x-axis: Kaspersky, Ikarus, ESET, GDATA, Trustport, Coranti, and Check Point.
A full description of the RAP testing methodology and an explanation of how to interpret the results can be found at http://www.virusbtn.com/vb100/vb200902-RAP-tests.
Virus Bulletin is perhaps best known for its VB100 Awards. The basic requirements for this award are that a product detect, both on demand and on access, in its default settings, all malware known to be 'In the Wild' at the time of the review, and that it generate no false positives when scanning a set of clean files. A list of vendors passing/failing the test is available on the Virus Bulletin site at http://www.virusbtn.com/vb100/archive/results?display=summary. Viewing some (not all) of Virus Bulletin’s materials requires a free registration (well worth it). Full details require a paid subscription to the magazine (also well worth it).
From a marketing/PR perspective, some vendors take pride in the number of consecutive times they’ve received a VB100 award. From a customer’s evaluation perspective, the most recent successes (perhaps the past 2 or 3 years) in the tests are the most useful. To view any particular company’s history with VB100 testing, go to http://www.virusbtn.com/vb100/archive/results?display=summary.
UK-based Virus Bulletin started in 1989. It provides PC users with a regular source of intelligence about computer viruses, their prevention, detection, and removal, and how to recover programs and data following an attack. VB’s website is at www.virusbtn.com. The site is a great source of information on malware and spam. Virus Bulletin is a member of the Anti-Malware Testing Standards Organization, www.AMTSO.org.
Labels: AMTSO, avg technologies, ESET, malware, virus
Monday, June 21, 2010
AVG LinkScanner for Mac Announced by AVG Technologies - 6/24 addendum
On June 14th, AVG announced AVG LinkScanner for Mac. Like AVG LinkScanner for Windows, it is a free download, designed to protect people in real time from malicious threats as they surf the web. Most of AVG’s security solutions also include AVG LinkScanner for Windows.
*****
June 24th addendum: CNET's top Mac security downloads for the week ending June 20th are at http://download.cnet.com/mac/security-software/.
*****
AVG promotes LinkScanner for Mac as providing real-time protection while surfing the web. This should make it attractive to Mac users, even though the world has yet to see a massive virus outbreak among Mac users! AVG CTO Karel Obluk has an informative blog post about how LinkScanner works at http://obluk.blog.avg.com/2009/10/index.html. The AVG LinkScanner demo on the LinkScanner download site inaccurately states that LinkScanner “looks at every single website.” That would take a while!
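Conceptually, a link scanner only has to evaluate the pages a user is actually about to open, not the whole web. The Python sketch below is purely illustrative and is not AVG’s implementation; the reputation_lookup stub stands in for whatever real-time page analysis or cloud query a real product would perform.

BLOCK_THRESHOLD = 80  # hypothetical risk score above which a page is blocked

def reputation_lookup(url: str) -> int:
    """Hypothetical stand-in for real-time page analysis or a cloud reputation query."""
    return 95 if "malicious-example" in url else 5

def allow_navigation(url: str) -> bool:
    """Called per click/navigation: only pages the user actually visits get checked."""
    return reputation_lookup(url) < BLOCK_THRESHOLD

print(allow_navigation("http://example.org/"))                  # True: page loads
print(allow_navigation("http://malicious-example.test/login"))  # False: page blocked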
There is no migration path to a traditional antivirus or internet security suite for the Mac; AVG currently doesn’t offer these. Trend Micro, Symantec, and McAfee, among others, have Mac antivirus and/or internet security solutions. A Mac offering may be in AVG’s future. However, neither Avira nor Avast (the other two of los Free Amigos) offers a Mac solution, either. Those desiring a free Mac AV solution can check out PC Tools iAntiVirus at http://www.iantivirus.com/.
AVG LinkScanner for Mac had an impressive number of downloads during its first week of availability. Below are the figures for the product and for other standalone products that promote safe surfing, for the week of June 13: weekly downloads, total downloads, and the date the current version was listed. These figures are from download.cnet.com and reflect downloads of a particular version. It’ll be interesting to see how weekly downloads change over the next month.
AVG LinkScanner for Mac 2094/2094, 6/13/2010
AVG LinkScanner for Windows 504/73k, 11/04/2009
McAfee SiteAdvisor 660/25k, 12/23/2009 (free version)
Web of Trust (IE) 286/45k, 3/07/2010
Web of Trust (Firefox) 301/65k, 5/05/2010
Finjan, now part of M86 Security, offers Finjan SecureBrowsing™ as a free download (IE and Firefox) at http://securebrowsing.finjan.com/. They promote this as scanning web pages in real time, much like AVG LinkScanner.
Solutions that promote “real-time” protection and then mention accessing a database may be playing fast and loose with the phrase “real-time”. People may want to watch for this.
Labels: antivirus, avast, avg technologies, avira, linkscanner
Wednesday, June 16, 2010
AMTSO Releases Additional Documents on Malware Security Test Design and Testing
The Anti-Malware Testing Standards Organization (www.amtso.org) recently released a pair of documents to assist the anti-malware testing and review community with test design and testing. Members of AMTSO include malware/internet security test groups as well as internet security providers such as Symantec, Panda, McAfee, and los free amigos, Avast, Avira, and AVG Technologies. AMTSO currently has approximately 40 members.
These documents provide great guidelines (in a relatively brief format) for test organizations, publications doing testing, and individuals doing malware/internet security testing and reviews. Individuals and companies who use product reviews and round-ups to make purchase decisions can also benefit from them.
The “Performance Testing Guidelines” document is designed to provide an overview of the issues involved in accurately testing security technologies in terms of speed and resource usage.
Some of the close to twenty factors for measurement discussed in the document include:
• File access time
• Memory usage
• CPU usage
• Network overhead
AMTSO stresses the need to run tests multiple times. Benchmarking a factor just once is inadequate. An average of multiple runs minimizes the impact of anomalies and provides more accurate results.
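Here is a minimal Python sketch of that advice in practice; the file copy is only a placeholder workload standing in for whatever operation is actually being measured (an application launch, an on-access scan, etc.), and "sample.bin" is a hypothetical test file.

import shutil
import statistics
import time

def benchmark(operation, runs=5):
    """Time an operation several times; return the mean and standard deviation in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        timings.append(time.perf_counter() - start)
    return statistics.mean(timings), statistics.stdev(timings)

# Placeholder workload: copy a hypothetical test file.
mean_s, stdev_s = benchmark(lambda: shutil.copy("sample.bin", "sample_copy.bin"))
print(f"mean {mean_s:.3f}s, stdev {stdev_s:.3f}s over 5 runs")

Reporting the spread alongside the mean also makes it obvious when one anomalous run is skewing the result.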
The “Whole Product Testing” document discusses factors involved in designing and performing a complete security product test, as opposed to isolating components of the product and performing “sum of the parts” testing. AMTSO favors whole product tests, pointing out that product capabilities often work together to stop a given threat; this interaction cannot be shown through sum-of-the-parts testing.
Factors they write about that need to be considered in designing and performing a test include:
• Stating the test purpose
• Selecting Samples
• Setting up Tests and Products
• Introducing Samples
• Handling User Interaction
• Capturing Test Results
• Interpreting Test Results
While AMTSO designed these documents to assist the test community, the overall beneficiaries are home and business security users.
Labels: AMTSO, anti-virus, antivirus, avast, avg technologies, avira, free amigos, mcafee, panda, symantec
Friday, June 11, 2010
AVG Technologies Acquires Walling Data Systems
On June 10, security company AVG Technologies announced the acquisition of Walling Data Systems, one of their leading channel partners. This is probably a great move on their part.
• It immediately strengthens their internal North America sales capabilities.
• It immediately strengthens their internal North America support capabilities.
Businesses, education, and non-profits pay for their software. Increased revenue is a good thing, and as companies grow, they purchase more licenses! Many of AVG Technologies’ 110 million customers (the number on their website) use AVG Anti-Virus Free Edition. Some portion of these home users migrate to a paid solution. As AVG is a private company, the breakdown between business and non-business isn't readily available.
The www.antivirus.com website (Walling Data Systems) states that they are now “AVG for Education, Government, and non-profit”. You can’t tell from this whether they’ll be going after business verticals outside these. Probably, yes. Government, education, and non-profit customers typically require or obtain a pricing differential, and AVG already has a tiered licensing structure in place.
On the support side, home (paying) users will still have to rely on email support, FAQs, and AVG’s Knowledge Base. They have the option to pay for premium services, delivered through support.com.
When developing technologies, companies perform a make/buy decision. AVG’s CEO J.R. Smith has discussed this in a blog on AVG’s website. Inorganic growth is sometimes the best decision. The same holds true for expanding sales capabilities: it’s sometimes tactically and strategically more expeditious to purchase the channel expertise you need.
See the press release at http://www.avg.com/us-en/press-releases-news.ndi-230728.
Labels: anti-virus, antivirus, avg technologies, walling data systems
Thursday, June 10, 2010
AV Comparatives’ Latest Test Results For Anti-Virus Software (Proactive/Retrospective Test)
In May, testing organization AV Comparatives published their latest proactive/retrospective test report, covering the on-demand detection of viruses/malware. This is one of several different tests they perform throughout the year.
Companies receiving “Advanced+ Certification” for their products: Trustport, G DATA, Kaspersky, Microsoft, AVIRA, ESET NOD32, F-Secure, BitDefender, and eScan.
For the proactive detection of new malware, the top ratings were achieved by:
1. Trustport, Panda – 63%
2. G Data – 61%
3. Kaspersky, Microsoft – 59%
4. AVIRA – 53%
5. ESET, F-Secure – 52%
The order of the rest of the vendors in the above test: BitDefender, K7, ESET, Symantec (Norton), McAfee, AVG Technologies, Sophos, Avast, Norman, Trend Micro, PC Tools, and Kingsoft.
All twenty of the anti-virus products tested were “paid” products, with the exception of Avast! Free Anti-Virus 5.0. There was a mix of AV products for the home and business markets (some would say that having a mix like this doesn’t constitute “best practices”, and that home should be compared against home, business to business, and free to free). Internet security suites were not tested. The approximately 28k samples in the test were a mix of worms, backdoors, Trojans, and other malware and viruses.
The May (and November) retrospective tests evaluate products against new and unknown malware to measure their proactive detection capabilities (e.g., through heuristics, generic signatures, etc.). False positive rates are also factored into the evaluation. Products also use additional protection features, such as behavior blockers, to protect against completely new/unknown malware.
AV Comparatives’ test evaluates only the heuristic/generic detection of the products against unknown/new malware, without the need to execute it.
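For a rough sense of scale against the roughly 28,000-sample set mentioned above, the arithmetic behind those percentages is simple; the Python below uses illustrative approximations, not figures taken from the report.

TOTAL_SAMPLES = 28_000  # approximate size of the retrospective test set

def approx_detected(rate_percent, total=TOTAL_SAMPLES):
    """Convert a proactive detection percentage into an approximate sample count."""
    return round(total * rate_percent / 100)

# A 63% score corresponds to roughly 17,640 new samples caught heuristically;
# a 52% score to roughly 14,560, about 3,000 fewer.
print(approx_detected(63), approx_detected(52))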
People should go to the AV Comparatives website and download the complete report: http://www.av-comparatives.org/images/stories/test/ondret/avc_report26.pdf.
AV Comparatives recommends that “you visit the vendor’s site and evaluate their software by downloading a trial version, as there are also many other features (e.g. firewall, HIPS, behavior blockers, etc.) and important things (e.g. price, graphical user interface, compatibility, etc.) for an Anti-Virus that you should evaluate by yourself. Even if quite important, the data provided in the test reports on this site are just some aspects that you should consider when buying Anti-Virus software.”
Note that there are other testing organizations as well. Individuals and companies evaluating AV and internet security software should also look at reviews, round-ups, and group tests performed by other reputable testing organizations and technology publications. Note carefully exactly what AV Comparatives tested.
Undoubtedly, companies will put their own spin on the test results.
Labels: anti-virus, antivirus, panda, security, trend micro
Wednesday, June 09, 2010
Power of the Panda - Panda Cloud Antivirus 1.1 Receives PC Magazine Editors’ Choice Award
Panda Security’s free AV solution, Panda Cloud Antivirus 1.1, received the PC Magazine Editors’ Choice Award in a review posted June 9. Neil Rubenking gave the product 4 stars (out of 5). The product is free for non-commercial use.
Rubenking’s review was generally positive; he found the product “great” at keeping malware from installing on clean computers, and he also liked the small download and simple interface.
One con he mentioned was that the product didn’t thoroughly remove what it did detect.
The Free (or is it Three?) “A”migos, Avira, Avast, and AVG Technologies, have some nice competition looking at them from the clouds.
Go to http://www.pcmag.com/article2/0,2817,2364848,00.asp for the complete review.
Monday, June 07, 2010
Security Reviews, Round-ups, and Relevance: Standards & Improved Reviews
Security vendors, from the 3 A’s (Alwil, Avira, AVG Technologies) to the big three in the US (Trend Micro, McAfee, and Symantec), prefer to have their security products perform well in reviews and round-ups, which provide third-party validation for their solutions. Security reviews and round-ups, along with tests from valid testing organizations, allow individuals and companies to make informed decisions. The nature of security risks is changing, and AMTSO recognizes this.
It’s worthwhile for reviewers and purchasers, both businesses and consumers, to spend some time on websites such as that of the Anti-Malware Testing Standards Organization (www.amtso.org). About 40 security vendors and test organizations are members of this organization.
The group’s charter focuses on (from their home page):
1. Providing a forum for discussions related to the testing of anti-malware and related products.
2. Developing and publicizing objective standards and best practices for testing of anti-malware and related products.
3. Promoting education and awareness of issues related to the testing of anti-malware and related products.
4. Providing tools and resources to aid standards-based testing methodologies.
5. Providing analysis and review of current and future testing of anti-malware and related products.
Participants in AMTSO are not trying to shut down or discourage testing; they are trying to raise the standards of testing. They don’t certify any organization’s tests. They encourage AMTSO members and others to publicly reference conformity to the guidelines they’ve been developing. Everyone benefits from this.
AMTSO has a library of documents related to testing, standards, sampling, statistical validity, etc. It’s worthwhile for even the casual blogger or reviewer to look at some of these for guidelines. In particular, they should look at the five-page “The Fundamental Principles of Testing” at http://www.amtso.org/documents.html. Reviewing the documents that discuss sampling and sample sizes would also be valuable. All the documents are available to those who agree to the license terms.
Thursday, June 03, 2010
Phishing or Phermentation – Santa Cruz Mountains Vintners Festival
On the weekends of June 5-6 and June 12-13, security employees working out of Symantec, McAfee, or Trend Micro corporate headquarters can forget about malware, spyware, and rootkits. They have the opportunity to enjoy the Santa Cruz Mountains Winegrowers Association Vintners Festival. Approximately 50 wineries participate, with wineries on the east side of the mountains open for tasting the first weekend and those on the west side the second. Some wineries group together in local restaurants for this event. Do cybercriminals go wine tasting? The world wonders. Something for amtso.org to discuss at their next meeting.
Wine tasting, barrel tasting, art, appetizers, music, sun, and more! This area is its own distinct appellation. A number of the wineries are open only during these two weekends and during Passport Weekends. http://www.scmwa.com/passport/wineries.htm
See the link below for more information. Tickets are $30 in advance or can be purchased at participating locations.
http://www.scmwa.com/VintnersFestival_000.htm
Avast or AVG Technologies employees can always visit the Moravia or Bohemia regions of the Czech Republic this weekend to sample wine. Or AVG employees can wait until September when Brno has Slavnosti Vina. Should someone meet a potential true love at the festival – Lednice is for lovers!