jbob:
Thanks for posting the link to these tests, which I had not seen before.
Having spent some time looking over the 8 rounds, I must say these tests are, generally speaking, very well done, though there are problems. At the very least, though, these tests are head-and-shoulders above the tests that one sees so often on the Net, including the published tests from the major computer/tech mags.
Here's what sets these tests apart:
* the methodology of the testing is clearly disclosed
* the test bed and test environment are transparently and thoroughly revealed and documented
* test results are reported exhaustively
* testing has been performed against a wide variety of adware, spyware, and malware, not just a narrow range of samples
* testing is performed against "badware" in its actual environment, as opposed to simply running a scanner over a static collection of samples
* detection and remediation are both tested, not just detection
* files, Registry keys, and processes are tracked, not just files
* it appears that some selection of items or traces (files, Reg keys) was performed to isolate the most critical components to test and track (as opposed to tracking every last garbage data file and Reg change)
* testing is performed with only reputable anti-malware applications that users could actually be urged to consider purchasing
Users familiar with the anti-spyware testing that I did back in late 2004 will notice a lot of apparent similarities between these tests and my own:
spywarewarrior.com/asw-t ··· uide.htm

I'd like to think these folks took a cue from my own testing, but they might have developed this methodology and reporting framework all on their own for all I know.
I do have some quibbles with these tests, mainly on the selection of "badware" to include in the various test beds. The first two test beds are excellent, using a good range of adware and spyware.
Tests three and four, though, are next to bizarre, as they focus exclusively on "rogue" anti-spyware apps, all of which are documented on Spyware Warrior (which is listed as a "source" for these tests):
www.spywarewarrior.com/r ··· ware.htm

Now, even though I might classify all of the included apps as "rogue," I wouldn't necessarily urge reputable anti-malware vendors to target each and every one of them, as many of the apps on the "rogue" list are simply piss-poor anti-spyware applications, not malicious or imposing in the way that SpyAxe, SpyFalcon, and the like are. Thus, I'm not sure it's reasonable to expect anti-spyware apps to target many of the "rogue" apps included in rounds 3 and 4, and vendors could be forgiven for simply not including them in their definitions. As a result, the meaningfulness of these two rounds is dubious.
I have similar problems with rounds 5-8, which rely a bit too heavily, in my judgement, on low-risk adware apps, adware bundlers, and keyloggers/system snoopers. There are legitimate differences of opinion as to whether particular low-risk adware apps and adware bundlers ought to be targeted by anti-spyware apps. And commercial keyloggers, while nasty and dangerous, simply don't have the kind of prevalence that mainstream adware and spyware apps do. Again, in light of the questionable choice of apps for the test bed, the meaningfulness of rounds 5-8 is not as great as it could have been.
Nonetheless, these are fine tests, and I hope to see more from this organization. I would urge readers to take a careful look at these tests, flawed though they are, as they are a far sight better than what usually passes for anti-spyware testing these days.
Best,
Eric L. Howes