AV-Comparatives released Dynamic Real World test Aug-Nov results

YeOldeStonecat

Detection rates, in order:
Kaspersky
Panda
BitDefender
F-Secure
Trend Micro
Avira
Eset
Emsisoft
Avast
McAfee
Sophos
Fortinet
eScan
GData
Tencent
Vipre
BullGuard
AVG
Qihoo
AhnLab
Kingsoft
Microsoft
 
Because they didn't buy the 'Premium' version of the database :p

/jk

Probably their heuristics engine?
 

Panda, at number 2, just behind Kaspersky, is a bit of a jaw-dropper.

Would not have predicted that.

Microsoft, in last place, seems to be consistent in its loserdom.
 
Okay... so how come stuff that uses another product's engine doesn't get the same score? E.g. BullGuard / Bitdefender?

Because they flag files for reasons other than definitions. Which is perfectly fine for the average user. The dynamic real-world test is a great test... for the average user. Kaspersky got 0 false positives in that test.

Now let's be serious here: any one of us who has installed Kaspersky on a machine with your tools and scripts knows good and well that Kaspersky is not going to have 0 false positives. It might have 0 false positives on popular downloads from download websites, but Kaspersky is a mean, picky S.O.B. that I can't stand to have on a machine I'm running scripts on.
 
They're inconsistent though; if you look at previous months, they jump in and out of the bottom 50%.

Exactly. You don't want to switch products every couple of testing periods. I look for consistency. Kaspersky, Bitdefender and F-Secure have been the most consistently high-scoring products for quite some time.
 
Exactly. You don't want to switch products every couple of testing periods. I look for consistency. Kaspersky, Bitdefender and F-Secure have been the most consistently high-scoring products for quite some time.

Don't forget ESET as well.

I used to sell F-Secure, but they screwed up a couple of builds, and I was really put off by that. By 'screwed up,' I mean that I installed it, and the machine would not boot. Even had an F-Secure guy next to me, as he wanted to see what was going on, and he confirmed the PC build was according to Microsoft standards (Dell system, using a restore disc, then install F-Secure). They quickly pulled both builds of F-Secure.

Kaspersky is good for the end user. I personally don't like it so much because as previously stated, it really doesn't like some of the tools we use. And in my opinion, it also takes an age to update.

No problems with BitDefender.

I like ESET for the reseller portal, as well as the consistency of the product. Get reminders of licenses coming up for renewal, and they don't poach customers.

Andy
 
I like the F-Secure company because of Mikko and his passion for viruses. When I get a zero-day, I contact them first. However, I agree with you: F-Secure's applications... not so good. I don't use them either. Bitdefender's fine as long as you wait a few weeks before jumping on major updates.

I used to like ESET, until its good heuristics weren't enough to keep it near the top.
 
Don't forget ESET as well.

Oh, I didn't forget ESET. I just didn't mention it. Over the past two years, ESET has consistently been middle-of-the-pack in the Real World Detection tests (see for yourself). While Kaspersky, F-Secure and Bitdefender jockey for position in the upper levels nearly every month, and occasionally get bumped by the likes of Trend Micro and Panda, ESET is almost never in the running for top tier.

On the upside, its test results do seem to be on an upward swing for the past couple of months, so maybe the trend of mediocrity is reversing.
 
I do not agree. These tests throw one antivirus company into the limelight based on the selection of the viruses under test. In other words, let's say, for argument's sake, that you have an antivirus that detects 99% of the 2 million computer viruses out there. Now suppose the testers put together a few thousand viruses to test against... say 8,000 viruses, and run the test. Well, 1% of 2 million viruses is 20,000 viruses, so it is possible that your antivirus that detects 99% of all viruses (which would be amazing) could score a complete zero in the test, if the 8,000 viruses are among the 20,000 it does not detect. Oh sure, the virus testing centers all claim a 'sampling' of viruses is used, to ensure well-rounded testing, but it's amazing to me that one virus test shows XYZ antivirus to be the very best, and another test rates it near the bottom. I think it's all a crock of celery. Remember, you don't need 20,000 viruses to wreck a computer... it only takes one.
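For what it's worth, the arithmetic in that hypothetical can be checked quickly; a minimal sketch, using the poster's illustrative figures (not real data):

```python
# The poster's hypothetical numbers, not real measurements.
total_viruses = 2_000_000        # assumed virus population
detection_rate = 0.99            # hypothetical 99% detection rate
test_set_size = 8_000            # hypothetical test sample size

# Viruses this hypothetical AV would fail to detect: 1% of 2 million.
missed = int(total_viruses * (1 - detection_rate))
print(missed)                    # 20000

# A zero score is at least arithmetically possible, since the entire
# test set could fit inside the miss pool:
print(test_set_size <= missed)   # True
```

Whether a reputable lab's sampling would ever actually land all 8,000 samples inside that 1% miss pool is a separate question, but the raw numbers in the scenario are consistent.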
 
I think it's all a crock of celery.

The "test" you described in no way resembles AV-Comparatives' Real World Protection testing procedure, which involves testing with currently available, live-on-the-net viruses. I'd be interested in any criticism you have for their actual procedure (as described in their latest report), but criticizing an imaginary procedure is not useful.

I'm curious, though: if you don't accept independent third-party testing of such products as valid, how do you evaluate the effectiveness of antivirus software?
 
He's got a point in that the actual live viruses you're likely to be exposed to are a tiny proportion of the thousands that AVs can detect and that some tests include; it's usually the zero-day and new ones that are the problem. However, AVs like Kaspersky also do the best in the zero-day tests, so this doesn't change much.
 

After about 20 years in the industry, I've never found a better, more neutral, unbiased antivirus comparison site that actually replicates real end-user use of the computer and how users get infected, and uses current relevant malware. Read up on what AV-Comparatives actually does for their dynamic/real-world tests. There simply is no other official testing site which comes even remotely close to being as "real" as they are.
 