All the solutions tackled the same tests once in September and once in October: they had to demonstrate how well they detect malware, how heavy a system load they place on client PCs in the company network, and whether they generate many false positives. In each test area, up to 6 points were awarded, for a maximum of 18 points. No product was able to achieve a perfect score, but at 17 points each, the solutions from Bitdefender and Trend Micro came awfully close to the maximum. Symantec reached 16.5 points, followed by the rest of the products with 15 to 16 points. The basic protection from Microsoft, included in the test, only managed to score 10.5 points.

3 packages offering 100 percent defense

Each product was required to complete the malware detection test in two parts. First, they were supposed to detect over 160 brand-new malware specimens in the real-world test. These latest malware samples were collected from websites and e-mail attachments in the four-week period prior to the test launch. In the second phase, the security packages had to detect and defend against the AV-TEST reference set of some 18,000 assorted viruses and threats. The products from Bitdefender, F-Secure and Symantec each completed both test phases with 100 percent detection, a stellar result. The solutions from G Data, Kaspersky Lab and Trend Micro detected all the malware in the AV-TEST reference set, but only 99 percent in the real-world test. Also experiencing minimal problems in the real-world test were the packages from McAfee and Sophos. The Microsoft solution, as a comparison, was able to detect only 71 percent of the malware samples in the real-world test. That means that 3 out of 10 threats slipped through. The detection rate for the reference set was somewhat better at 80 percent, but still not good.

The testers also evaluated how heavily the installed security solution slowed down a client in its daily routines. For the comparison, a reference PC was clocked on various tasks such as loading websites, downloading software, installing applications and copying data. The same tasks were then repeated with the system watchdogs installed, and the times were compared. Good detection rates meant a heavier system load for some test candidates. Out of the top-performing group, the product from Bitdefender exhibited the lowest system load despite maximum detection. F-Secure and Symantec demand somewhat more resources for their good scanning results. The point scores awarded were in the higher range, at 4.5 to 5.5 out of a possible 6 points. The packages from Kaspersky and McAfee placed the highest burden on the system. Some points were subtracted for this, so those two solutions achieved only 3.5 and 4 points respectively out of 6.

If the security suites on the clients trigger lots of false positives, this can run the company administrator ragged. What's more, it is unsettling for employees when the system keeps crying wolf. In the test section on usability, AV-TEST evaluated precisely this proneness towards false positives. Each solution was required to visit 500 clean websites, scan nearly 350,000 programs free of malicious code and monitor nearly 40 installations of normal applications. A perfect result would be if the security package simply did not report anything during all these test activities. Only Symantec achieved the optimum in this area.
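The detection figures above can be sanity-checked with a bit of arithmetic. The sketch below, using the sample counts and rates quoted in the article (the helper function name is my own, not part of any AV-TEST tooling), shows why a 71 percent rate on roughly 160 real-world specimens amounts to about 3 in 10 threats slipping through:

```python
def missed_samples(total: int, detection_rate: float) -> int:
    """Number of samples a product fails to detect at a given rate."""
    return total - round(total * detection_rate)

# Real-world test: ~160 brand-new specimens, Microsoft at 71 percent.
missed_rw = missed_samples(160, 0.71)
print(missed_rw, missed_rw / 160)  # 46 misses, i.e. roughly 3 out of 10

# Reference set: ~18,000 samples, Microsoft at 80 percent.
print(missed_samples(18_000, 0.80))  # 3600 misses

# Scoring scheme: up to 6 points in each of the 3 test areas.
print(3 * 6)  # maximum of 18 points
```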