jbob
Reach Out and Touch Someone
Premium Member
join:2004-04-26
Little Rock, AR

Anti-Spyware Cleanup Comparison Report....8th round

Comments!

»www.malware-test.com/tes ··· rts.html

Eighth Round (July 18, 2006):

Cleanup Success Rate for Entry-based Viewpoint:

Trend Micro Anti-Spyware: 79.62%
Webroot Spy Sweeper: 61.15%
PC Tools Spyware Doctor: 57.96%
Sunbelt CounterSpy: 55.41%
Norton Internet Security: 52.23%
McAfee AntiSpyware: 45.86%
Computer Associates Anti-Spyware: 41.40%
Panda Platinum Internet Security: 28.66%
ewido anti-malware: 28.66%
Microsoft Windows Defender: 24.84%
Lavasoft Ad-Aware: 14.65%
Spybot S&D: 12.74%
Aluria Anti-Spyware: 2.50%
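
For anyone wondering what the "entry-based viewpoint" means: as I read the reports, each product's score is simply the fraction of the individual entries tracked in the test bed (files, Registry keys, processes) that the product actually removed. A back-of-envelope illustration in Python -- the 157-entry total is my own inference from the percentages, not a figure quoted from the report:

    # Entry-based cleanup success rate: entries removed / total entries tracked.
    # 157 is a hypothetical total that happens to be consistent with the
    # percentages above; the real per-product counts are in the round-8 PDF.
    total_tracked = 157
    removed = 125
    print(f"{removed / total_tracked:.2%}")   # 79.62% -- the Trend Micro score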
eburger68
Premium Member
join:2001-04-28

jbob:

Thanks for posting the link to these tests, which I had not seen before.

Having spent some time looking over the 8 rounds, I must say these tests are, generally speaking, very well done, though there are problems. At the very least, though, these tests are head-and-shoulders above the tests that one sees so often on the Net, including the published tests from the major computer/tech mags.

Here's what sets these tests apart:

* the methodology of the testing is clearly disclosed

* the test bed and test environment are transparently and thoroughly revealed and documented

* test results are reported exhaustively

* testing has been performed against a wide variety of adware, spyware, and malware, not just a narrow range of samples

* testing is performed against "badware" in its actual environment, as opposed to simply running a scanner over a static collection of samples

* detection and remediation are both tested, not just detection

* files, Registry keys, and processes are tracked, not just files (a rough sketch of this approach follows the list below)

* it appears that some selection of items or traces (files, Reg keys) was performed to isolate the most critical components to test and track (as opposed to tracking every last garbage data file and Reg change)

* testing is performed with only reputable anti-malware applications that users could actually be urged to consider purchasing
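
To make the tracking point concrete: my understanding of this style of testing -- and it is only my reading, since the harness itself isn't published -- is that you snapshot the traces a sample drops, run the cleaner, snapshot again, and score each tracked entry as removed or remaining. A rough sketch in Python, with every path and name hypothetical:

    # Rough sketch of entry-based remediation scoring. All paths and names
    # below are hypothetical; the actual malware-test.com harness is not public.

    def cleanup_success_rate(tracked, still_present):
        """Fraction of tracked entries no longer present after the cleaner runs."""
        removed = [entry for entry in tracked if entry not in still_present]
        return len(removed) / len(tracked)

    # Entries (file, Registry key, process) recorded after infecting the test box:
    tracked = {
        r"C:\Program Files\BadApp\badapp.exe",                         # file
        r"HKLM\Software\Microsoft\Windows\CurrentVersion\Run\BadApp",  # Reg key
        "badapp.exe",                                                  # process
    }

    # Entries still present after running the anti-spyware product under test:
    still_present = {"badapp.exe"}

    print(f"{cleanup_success_rate(tracked, still_present):.2%}")       # 66.67%

Scored per entry like this, detection and remediation get measured together, which is exactly what separates these tests from simple scans over static sample collections.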

Users familiar with the anti-spyware testing that I did back in late 2004 will notice a lot of apparent similarities between these tests and my own:

»spywarewarrior.com/asw-t ··· uide.htm

I'd like to think these folks took a cue from my own testing, but for all I know they developed this methodology and reporting framework entirely on their own.

I do have some quibbles with these tests, mainly on the selection of "badware" to include in the various test beds. The first two test beds are excellent, using a good range of adware and spyware.

Tests three and four, though, are frankly bizarre, as they focus exclusively on "rogue" anti-spyware apps, all of which are documented on Spyware Warrior (which is listed as a "source" for these tests):

»www.spywarewarrior.com/r ··· ware.htm

Now, even though I might classify all of the included apps as "rogue," I wouldn't necessarily urge reputable anti-malware vendors to target each and every one of them, as many of the apps on the "rogue" list are simply piss-poor anti-spyware applications, not malicious or imposing in any way (unlike SpyAxe, SpyFalcon, and the like). Thus, I'm not sure it's reasonable to expect anti-spyware apps to target many of the "rogue" apps included in rounds 3 and 4, and vendors could be forgiven for simply not including them in their definitions. As a result, the meaningfulness of these two rounds is dubious.

I have similar problems with rounds 5-8, which rely a bit too heavily, in my judgement, on low-risk adware apps, adware bundlers, and keyloggers/system snoopers. There are legitimate differences of opinion as to whether particular low-risk adware apps and adware bundlers ought to be targeted by anti-spyware apps. And commercial keyloggers, while nasty and dangerous, simply don't have the kind of prevalence that mainstream adware and spyware apps do. Again, in light of the questionable choice of apps for the test bed, the meaningfulness of rounds 5-8 is not as great as it could have been.

Nonetheless, these are fine tests, and I hope to see more from this organization. I would urge readers to take a careful look at these tests, flawed though they are, as they are a far sight better than what usually passes for anti-spyware testing these days.

Best,

Eric L. Howes

Snowy
Lock him up!!!
Premium Member
join:2003-04-05
Kailua, HI

said by eburger68:

Having spent some time looking over the 8 rounds, I must say these tests are, generally speaking, very well done, though there are problems. At the very least, though, these tests are head-and-shoulders above the tests that one sees so often on the Net, including the published tests from the major computer/tech mags.
Aloha,
I generally view these types of tests with extreme skepticism, if I look at them at all, but your reply got me looking at this one.
I'll take your word that the testers were sensitive to the issue of transparency in their testing methodology, but there are areas outside the test itself that need to be considered. Maybe it's my cynicism from seeing one too many bogus test results touted as definitive fact, but their online scan that "can scan viruses, trojan, worm, backdoor, spyware, adware, keylogger, rootkit etc." is using what scanning engine? Personally, I'm not comfortable with a company offering scanning services also being cast as an objective tester of other scanning services.
»59.124.80.246/
eburger68
Premium Member
join:2001-04-28

to jbob

Hi All:

One other thing that could be improved is the placement of the test reports, which are buried in the site forum. Here's a set of links to the topics for the various test rounds -- each topic has an attached PDF report for that round of testing:

Eighth Round (July 18, 2006)
»malware-test.com/smf/ind ··· c=1497.0

Seventh Round (June 24, 2006)
»malware-test.com/smf/ind ··· c=1261.0

Sixth Round (June 1, 2006)
»malware-test.com/smf/ind ··· ic=955.0

Fifth Round (May 3, 2006)
»malware-test.com/smf/ind ··· ic=573.0

Fourth Round (April 3, 2006)
»malware-test.com/smf/ind ··· ic=572.0

Third Round (March 26, 2006)
»malware-test.com/smf/ind ··· ic=571.0

Second Round (Feb 22, 2006)
»malware-test.com/smf/ind ··· ic=570.0

First Round (Feb 22, 2006)
»malware-test.com/smf/ind ··· ic=569.0

Best,

Eric L. Howes

psicop
More human than human
Premium Member
join:2005-12-21
Australia

to jbob

Sure, free tools have less cleanup capability than paid ones.

Why is that?

TK421
Premium Member
join:2004-12-19
Canada

to jbob

It appears MS Defender has fallen substantially since the earlier product testing I recall seeing when Anti-Spyware Beta 1 was originally released. As mentioned, this sort of comparison testing often should not be taken too seriously, although the Giant/MS Beta 1 product was generally rated better than Defender is in the recent reports I have looked at. Is it that Defender has gotten worse, or that the competition is improving? Sunbelt CounterSpy began on more or less even ground with MS Beta 1 yet is far ahead of Microsoft's product in recent tests.