Wikipedia:Featured article review/Microsoft Security Essentials/archive1


 * The following is an archived discussion of a featured article review. Please do not modify it. Further comments should be made on the article's talk page or at Wikipedia talk:Featured article review. No further edits should be made to this page.

The article was kept by Dana boomer 00:20, 23 October 2012.

Microsoft Security Essentials

 * Notified: Codename Lisa, WikiProject Microsoft, WikiProject Microsoft Windows, WikiProject Computer Security, WikiProject Computing, WikiProject Software

Consider the following quotes from reviews of this product. None of these is very obscure; in fact, they are all from the very first page of Google's results for "Microsoft Security Essentials review". I could have gone on a lot further but stopped there in the interests of brevity:


 * http://www.pcadvisor.co.uk/reviews/security/3376831/microsoft-security-essentials-4-review/
 * Perhaps because the 'file count' is so high, Microsoft Security Essentials 4 took 2 hours 42 minutes to scan them, well over twice as long as any other product we've reviewed.
 * Perhaps more worrying is that Microsoft Security Essentials 4 drops off to 2.5/6.0 in the Protection category.
 * ...it was a long way behind in zero-day detection, scoring 76 and 80 percent in consecutive months, against an average of 88 percent.
 * All this means that top protection from brand new malware will be better provided by one of the other free AV applications.
 * ..its speed and, more especially, its protection against zero-day attacks, mean it's not the best.


 * http://www.pcadvisor.co.uk/reviews/security/3354903/microsoft-security-essentials-21-review/
 * We liked Microsoft Security Essentials 2.1's interface, but other free antivirus software does a better job at blocking malware.
 * ...it falters at stopping new malware, and it plods through its chores.
 * It fully blocked 71.4 percent of new malware in our real-world tests, slightly worse than average. In our zoo test, it detected 97.0 percent of known malware samples. With that result, it lags the competition - some packages detected over 99.9 percent of samples.
 * In scan speeds it also fell behind the pack: Its on-demand scanner completed our virtual obstacle course in a worse-than-average 3 minutes, 56 seconds. The on-access scanner was poky too, clocking in at 6 minutes, 43 seconds.
 * Although Microsoft Security Essentials 2.1 has some good qualities, you would be better served by looking at some other options.


 * http://www.pcmag.com/article2/0,2817,2403986,00.asp
 * Unusually slow scan. Failed to run on one test system. Low detection rate in malware cleanup test. Failed to thoroughly clean up threats it did detect.
 * ...in testing its cleanup of already-infested systems wasn't thorough. You can get better protection for the same price.
 * On the malware-infested systems where Microsoft Security Essentials installed and ran correctly, a full scan took hours. Scanning my standard clean test system took 72 minutes, about twice the average. And despite these lengthy scans, the cleanup wasn't very thorough.
 * Microsoft Security Essentials detected 63 percent of the threats, lower than any product tested with the current or previous set of malware samples.
 * 40 percent detection of rootkit samples is also a new low.


 * http://www.pcmag.com/article2/0,2817,2376220,00.asp
 * Protection weaker under Windows XP. Mediocre results in hands-on malware blocking and malware removal tests. Left some threats running after alleged removal.
 * Other free products offer better protection.
 * In the Windows XP test, though, Microsoft took just 11.5 points, not enough for certification. It scored high for usability but low for protection and repair.
 * A couple of threats were still running after it reported successful removal, which isn't good.
 * I also test each product's ability to detect and remove commercial keyloggers. Microsoft detected just 50 percent of these and left two of the detected samples running after alleged removal, for a score of 3.2 points.
 * Microsoft Security Essentials detected 89 percent of the rootkit samples. That's good, but more than half of the recent antivirus products detected 100 percent.
 * Microsoft Security Essentials' scores in my malware removal tests were barely average. Many other products, including free products, have shown much greater success.


 * http://www.expertreviews.co.uk/software/1288459/microsoft-security-essentials-2-1
 * It's simple to use, but the protection isn't as good as from other free anti-virus software
 * Sadly, performance isn't so good. In our test run of 50 viruses, Microsoft Security Essentials allowed our PC to be compromised four times (eight per cent), protecting against 92 per cent of test viruses.
 * Unfortunately, despite its ease of use, Security Essentials is rather limited.

Of course, I've been selective here to make my point, but in no way have I pulled out every negative quote. Indeed, I had to restrain myself a couple of times, since I was in danger of quoting a substantial chunk of the respective articles. However, I think I've cited enough to show that the reviews of MSE are very much a mixed bag.

Reading the article, however, you gain little insight into this, especially in context. Some detection figures are quoted, but with little indication of how they compare to its competitors'. At FAC I noted that reviews appeared to have been cherry-picked, and sources had been misquoted or misrepresented (in one case a review of a released version was dismissed as "only a beta"). As such I do not consider that this article can be considered balanced, and therefore its FA status should be revoked on neutrality grounds. Quantumsilverfish (talk) 01:29, 22 October 2012 (UTC)

 * Keep - neutrality concerns are unfounded. Criticism is given exactly its DUE weight.--Jasper Deng (talk) 01:38, 22 October 2012 (UTC)


 * Delist I agree with Quantumsilverfish above. There is a serious imbalance here, the sources are misrepresented, and even the lead has NPOV problems. • Jesse V.(talk) 01:48, 22 October 2012 (UTC)


 * I do see problems here. The article has certainly improved in this respect since it was listed, but I still wouldn't call it NPOV. Quantumsilverfish does seem to be playing devil's advocate here, but to be fair I can respect that: somebody needs to, since there is certainly a case to be made. He/she presented legitimate concerns, and these were at best not adequately explored and at worst dismissed because of the very bias alleged.


 * The complaints about misquoting or misrepresenting sources are minor in isolation and do not affect the current revision as far as I can see in any case, but when they all point in the same direction (towards more favourable coverage) that is enough to raise an eyebrow over concerns of systemic bias.


 * The crux of the issue as presented is this: does the article reflect the whole range of expert opinion in a balanced manner? After a bit of independent research, and considering the above in aggregate, I suggest that, whether intentionally or not, the balance is firmly shifted towards the positive end of the spectrum. I would say the main pros cited are that it's free, it's from Microsoft, and it has little impact on system performance. The main cons are poor detection rates, even worse removal rates, and very slow scanning speed. The article does not present those kinds of tradeoffs.


 * Concerns have been raised on the article talk page but essentially brushed aside. There has been a certain amount of hostility shown: this edit in particular gives me cause for concern, as do, to a certain extent, Jasper Deng's dismissive comments above.


 * I also note that Codename Lisa (the nominator at FAC) and Jasper Deng (another proponent at FAC) are also cited in another dispute, again for lack of balance in coverage of another Microsoft product. This is beginning to look very suspicious indeed.


 * Can these concerns be addressed? Sure, but POV, especially on such a well-developed article, can be deep-seated and difficult to spot.  As Quantumsilverfish pointed out at FAC the article really needs thorough review from fresh blood, especially in the light of the other dispute mentioned above.  In that respect at least this review process may help since it has been listed on so many wikiprojects.  Why wasn't the same done for the original nomination?  It would certainly have helped get some more independent input into the process.


 * With all this in mind it is difficult not to conclude that there are legitimate concerns over NPOV, and consequently that the article is not of FA standard (or even GA standard), regardless of whether it got through some debate once. I see the convention here is to try to address issues as opposed to simply delisting articles.


 * Here there is a lot of work to do: it isn't a case of referencing a few negative reviews, but given there are concerns of editorial independence the entire article must be regarded with suspicion: that means checking and validating everything in the article for balance. This is a relatively new FA and it does not appear to me that it was NPOV even when advanced to FA: therefore I'm tempted to say delist it now regardless of convention and take it back to FAC when it is truly ready.  The editing environment on the article simply appears too hostile to expect any immediate resolution. Crispmuncher (talk) 04:22, 22 October 2012 (UTC).
 * NPOV entails giving each viewpoint weight in direct proportion to its coverage, which I feel the article does. It just happened that everyone else (but Quantumsilverfish) who commented on the FA nomination seemed to share this perspective. I do agree that we can elaborate a little more on criticism, but that one paragraph should be sufficient. I will disclaim that I endorse Microsoft's products as a personal view, so my views here may be disregarded if necessary.--Jasper Deng (talk) 04:30, 22 October 2012 (UTC)
 * Also, please comment on content, not contributors. Personalizing content disputes is often considered disruptive editing. Mark Arsten (talk) 14:40, 22 October 2012 (UTC)


 * Keep. Hi. The nomination says delist the article because its subject has poor detection rates. The nominator expressed these very same concerns in the FAC of the article; I showed him that not only is he right, but also that the article agrees with him. I said:

 * Quantumsilverfish says: "...it does not overly impact on system performance..."
   Article says: Four sources cite its low resource usage; AV-Test.org gives it a usability score of 5.0 out of 6.0.

 * Quantumsilverfish says: "...but has relatively poor detection rates..."
   Article says: "This product received a protection score of 2.5 out of 6"; "Version 2.1 received a protection score of 3.0 out of 6".

 * Quantumsilverfish says: "...and painfully slow scanning speed."
   Article says: "Some full scans took over an hour on infected systems; a scan on a clean system took 35 minutes."

 * I do not see why we cannot have a featured article on a product that does not have excellent detection rates.


 * Best regards,
 * Codename Lisa (talk) 07:57, 22 October 2012 (UTC)


 * Comment the article was promoted just over a month ago. --Rschen7754 08:05, 22 October 2012 (UTC)
 * Keep We don't limit FA status to products with perfect commercial reviews. FA status is supposed to be relative to the quality of the article, not the quality of virus detection. FA doesn't stop anyone from adding more criticisms if you think that is needed. "Neutral" isn't an exact thing, it is a range. If you think it needs more criticisms, adding one or two wouldn't make it overly lopsided, but it isn't overtly biased now, so no reason to delist. Dennis Brown - 2¢ © Join WER 11:16, 22 October 2012 (UTC)

Additional closing note - This FAR is being closed as kept because, according to the FAR directions, "Three to six months is regarded as the minimum time between promotion and nomination". It has been barely one month since this article was promoted. Also, as a side note, this seems to be more of a content dispute than anything else, so it may be the case that one of the content dispute resolution boards may be of more help than FAR. I would suggest that both groups of editors continue to work together to come to a mutually-agreeable solution, if possible. Dana boomer (talk) 00:22, 23 October 2012 (UTC)
 * The above discussion is preserved as an archive. Please do not modify it. No further edits should be made to this page.