Wikipedia talk:Stable versions proposal/research



Average End Quality Hypothesis
One of the assumptions of Wikipedia is that continual editing by multiple users will result in a continual increase in the quality of an article. This has proven true as a general trend. However, I do not believe adequate attention has been given to important exceptions, and, most significantly, to the rate of improvement in article quality.

I have done a bit of research and reflection, and I have reached a conclusion that I call the Average End Quality (AEQ) Hypothesis. It is based on the following observation:

As the quality of an article improves, the rate of improvement declines.

In other words, if Q(t) is the quality of an article at time t, then Q'(t) is positive but decreasing; that is, Q''(t) < 0. The graph of quality over time looks something like this:

[Graph: article quality rising over time at a steadily declining rate]

Notice that the quality function appears to level off. This is not accidental. It is a well-known fact that not all edits improve the quality of an article; some, in fact, detract from it. Not all of these are simple vandalism that gets reverted as a matter of course. Some are subtle: insertions of uncited speculation, POV, reorganizations of material that disrupt the flow of an article, bad grammar, and so on. They are not enough to have a visible effect on the upward trend of quality for new and short articles, but once an article gets very lengthy and detailed it becomes increasingly difficult to copyedit and remove errors buried deep inside the long text. As a result, bad edits are able to balance out good work, and article quality levels off at a value I have termed the Average End Quality (AEQ).

[Graph: article quality leveling off at the Average End Quality]

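The diminishing-returns claim can be illustrated with a toy model. This is only a sketch of the hypothesis: the quality scale, the AEQ value, and the rate constant are invented for illustration, not measured from Wikipedia.

```python
import math

# Toy model of the AEQ hypothesis: quality approaches a ceiling, so each
# successive period of editing adds less than the one before.
# AEQ and k are invented illustrative constants, not measured data.
AEQ = 100.0   # hypothetical Average End Quality
k = 0.05      # hypothetical rate of improvement

def quality(t):
    # Q(t) rises toward AEQ: Q'(t) > 0 but Q''(t) < 0
    return AEQ * (1.0 - math.exp(-k * t))

gains = [quality(t + 1) - quality(t) for t in range(100)]
print(all(g > 0 for g in gains))                      # always improving...
print(all(a > b for a, b in zip(gains, gains[1:])))   # ...but ever more slowly
```

Any concave, bounded curve would do here; the exponential is just the simplest function whose improvement rate shrinks as quality approaches the ceiling.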
Of course, editing never actually stops. The actual quality of a given article may spike above or dip below the AEQ for various periods of time. But whenever actual quality rises above the AEQ, bad editing gains the upper hand over good editing and drives it back down. Likewise, whenever actual quality falls below the AEQ, good editing gains the upper hand over bad editing and pushes it back up. In other words, if an article gets too good then most editors will declare "mission accomplished" and leave, allowing random bad edits to erode its quality; but if the article gets too bad, a large number of editors will be attracted in an attempt to fix it.

Thus, the quality of most detailed, lengthy articles oscillates around the Average End Quality:

[Graph: article quality oscillating around the AEQ]

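The oscillation described above behaves like a mean-reverting process: edits pull quality back toward the AEQ from either side. As a rough sketch (again with invented constants, not Wikipedia data):

```python
import random

# Toy simulation: quality is pulled toward the AEQ (good edits dominate
# below it, bad edits dominate above it) with random edit-to-edit noise.
# All constants are invented for illustration.
random.seed(42)
AEQ = 100.0    # hypothetical equilibrium quality
pull = 0.2     # strength of the corrective tendency
noise = 3.0    # size of random good/bad edits

q = 20.0       # the article starts well below the AEQ
history = []
for _ in range(500):
    q += pull * (AEQ - q) + random.uniform(-noise, noise)
    history.append(q)

# After the initial climb, quality hovers near the AEQ instead of
# improving without bound.
late = history[250:]
print(min(late), max(late))
```

Whether real articles actually behave this way is exactly what the proposed survey of former featured articles would test.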
Some might say that the AEQ is good enough, so there is really no problem. However, this property of Wikipedia results in a lot of wasted effort from editors who work hard to get the quality of an article above the AEQ, only to have it eroded over the next few months. And, in some cases, articles that were once of very high quality have been reduced to near incoherence. Given all this, I propose that the very best articles on Wikipedia be placed under permanent page protection. After all, the whole reason people are free to edit articles on Wikipedia is that this policy results in an overall increase in article quality. But if we have good reason to believe that it will result in a decrease in quality for articles X, Y and Z, then it is only reasonable to place articles X, Y and Z under some sort of protection:


[Graph: a protected article holding its peak quality instead of eroding back toward the AEQ]
Such a protection policy should only apply to articles of exceptional quality, and it should not be a blanket ban on all edits; rather, there should be a requirement that any new edits must undergo some review process before being allowed. This could be the first step towards Wikipedia 1.0: At first, only a handful of articles would have the required quality to be protected, but more would accumulate over time.

I am sure this idea can spark endless controversy. Fortunately, however, it is not just a matter of opinion. There is, in fact, a very sure way to tell whether the Average End Quality Hypothesis is true or false. What articles are recognized as the very best on Wikipedia? Featured articles, of course. Let us do a survey of former featured articles and determine whether their quality has increased or decreased on average since the day they were featured. If it has decreased, then it is true that continual editing usually lowers the quality of our best articles, and therefore it is a good idea to implement some sort of protection policy on them.

Note that I am only arguing for stable versions in the case of the absolute best articles that Wikipedia has produced. The ratio of featured articles to total articles is currently less than 0.001 - and since the number of total articles grows much faster than the number of featured articles, this ratio will get ever smaller. Locking such a tiny fraction of articles will make no difference to the overall operation of Wikipedia. -- Mihnea Tudoreanu 12:33, 29 December 2005 (UTC)


 * A couple of comments here: these are very interesting thoughts, but it's difficult for me to see "article quality" as a monotonic function; the injection of one bad fact (maliciously or inadvertently) can have a dramatic impact on the usefulness and trustworthiness of an article -- a drop of poison can spoil the well, so to speak. Also, my experience with certain articles does not actually match your graphs; for example Rock and Roll / Rock music has been butchered since some well-meaning folks took a mostly-OK article and went overboard with sub-articles and a main article split.  Finally, it's important to recognize that all the up-and-down fluctuations you picture (indeed, any change to the Wiki) represent effort expended by editors.  We often find ourselves pushing the boulder halfway up the hill, and turning around only to have it roll most of the way back (I'm not talking about simple vandalism that can be easily reverted but big ugly dump-edits that require major effort to clean up, fact-check, and integrate).   One side benefit of a Stable Versions process is that it puts chocks behind the boulder every so often.  Jgm 15:11, 29 December 2005 (UTC)
 * Your analysis is similar to the sentiment of most of the experienced people involved in FAC and FARC. In fact, it's one of Raul's laws. Featured articles tend not to improve once they are featured; in fact, they generally degrade unless they have a particularly vigilant watcher or someone comes in to revamp them. It's clear then that a different system is needed to consistently bring articles above the AEQ. The current system works well to produce a mass of information, but on the whole is probably not able to take it higher. Many of the things mentioned here such as formal expert review may be able to lift the quality consistently above AEQ. - Taxman Talk 15:19, 29 December 2005 (UTC)
 * I don't really believe the AEQ is true in general; I believe any article can be substantially improved by the right editor, but just that it sometimes takes a while for that editor to show up. Additionally, any scheme which attempts to protect the working version will effectively prevent cooperative work on an article. If a Honduran scientist adds technical details to global warming with terrible grammar, and this is subsequently cleaned up by an American user, we have a net improvement that could not have been accomplished in one edit by either user. This is a fundamental advantage of Wikipedia. Stable versions allow these "refinement" stages to complete before publication while still giving access to the information to people who would rather deal with the temporary lower quality for more cutting-edge content. Deco 19:31, 29 December 2005 (UTC)


 * I think that it definitely takes work to keep articles at high quality, and I like the graphs as a way of expressing that. I agree with most of what's been said in this section, in fact.  But I'm concerned that whoever the "guardians" of the stable version are will use their status and sense of self-importance to keep out legitimate edits because they overvalue their own ideas relative to other people's.  Some editors have very firm ideas about their own writing styles, the needs of readers, NPOV, etc. that we might not all agree with.  While the current system isn't perfect, since improvements can be reverted capriciously, at least those edits have a chance.  I'm not sure what the solution is.  After reading this section, I've become less opposed to stable versions.  But I am concerned about this. I just thought I'd lay out my ideas and see if anything useful comes from them.  Keep up the good discussion, Dave (talk) 04:25, 30 December 2005 (UTC)


 * Stable versions only entails protection of the stable version. It should only be necessary to edit the stable version (if at all) for small things that are clearly wrong like inaccuracies, misspellings, and broken links; anything contentious shouldn't be changed there anyway. Any other changes will be incorporated into the working version via the current process and subsequently published in the next stable version. And I stress again that the two versions should be viewed as just two different forms of the same article which a reader can freely choose between without emphasis on either. Deco 08:00, 30 December 2005 (UTC)


 * Right. I'm worried about the social implications of having an officially sanctioned version.  People have accused me of ruling Libertarianism too tightly, because I felt like I had some kind of ownership of it after pushing it to FA status (that was six months ago... since then, I've let the article do its own thing until a few weeks ago).  I'm concerned that such situations would become more common, not because of any technical limitations of stable versions, but instead because of what it will mean to the (human) editors involved. Dave (talk) 08:25, 30 December 2005 (UTC)


 * There will always be edit wars. If there is no consensus, then that article is not ready to be stable. If anything, it can help force the article to reach NPOV more quickly. -- Zondor 08:45, 30 December 2005 (UTC)


 * That's actually my concern. I've seen situations where users have used the drive for a stable version as a reason to reject good changes, and I'm worried about codifying that.  Dave (talk) 08:53, 30 December 2005 (UTC)


 * Your concern is well founded. But however "good" it is, it could be POV. -- Zondor 09:58, 30 December 2005 (UTC)


 * This is an interesting point, and I've been guilty of possessiveness on occasion too, but I think the effect will be diminished compared to the drive for FA status, because there are two versions - during publication of a stable version there could be an opportunity to cut out incomplete or controversial parts, or to mix and match parts of working versions, so there's less temptation to view destabilizing edits aggressively. The way I imagine it is there could be a temporary freely editable "preprint version" that is assembled for publication and voted on. Of course one has to avoid the temptation of doing substantive editing in that version instead of the working version, but I think it could help.
 * As for the social implications of singling out some editors over others as owners, that is a concern. There could be a process for adding and removing owners, but considering that their "powers" would be limited to typically insubstantial stable version editing, maybe their role should be downplayed; we could call them "maintainers" or something boring like that. Deco 10:24, 30 December 2005 (UTC)


 * Without actual stats to demonstrate, AEQ seems like a convenient assumption: it will apply to some articles, but to how many, and for what underlying reasons? The statement above, "any article can be substantially improved by the right editor, but just that it sometimes takes a while for that editor to show up", whether directly intended to or not, points to an aspect of "quality" that hasn't really been discussed here: writing quality. The fact that we are writing an encyclopedia, not simply collecting verifiable "facts" in a database (like IMDb), puts a lot of weight on how an article is written. The articles I've watched that have come up from stubs tend to be first an assembly of an increasing number of pieces of information, often with phases of discussion or debate between editors, and then things quiet down. What's left is a pretty much "everything's there" article that is readable, but is quite dry and cumbersome. At this point, I think we're "waiting for a writer". There are other, different skills and talents involved in copywriting, prose writing, journalistic writing, technical writing, encyclopedia writing than expertise in the topic. And writers tend to come along later, to reorganize, re-edit, rewrite, to produce a final, clear, readable product. (I can cite a number of recently celebrated WP articles that are simply not any fun to read, where your eyes do glaze over...) To put it broadly, IMO, the biggest difference between WP and Britannica at this stage is, in the end, their final product is written by skilled, professional writers. Stable versions must recognize this aspect of "encyclopedic" in whatever mechanism is considered to select and update stable articles, and it must allow revisions to be freely made for writing quality, not just "verifiable information". --Tsavage 17:04, 21 January 2006 (UTC)


 * I think that is one of the most valid concerns I have heard. On the other hand, that's human nature, and any work made by humans is vulnerable to POV. I don't think even the other major encyclopedias can work around this. I don't think the current situation is any better now; articles just fluctuate between POVs.

Like I stated before, however, there should be a mechanism to evaluate stable versions and change them in emergencies (a factual error that is too great, a large amount of POV pushing, etc.).

Given that the number of edits on a stable version should be kept low, these changes should be pretty transparent. --213.224.194.71 13:25, 30 December 2005 (UTC)


 * I find these comments to be excellent, though I do disagree with protecting the article against further revisions and "gating" new edits on the article. That's what I think a stable revision would be good for: the stable revision would not be changeable, and that revision could then be used to gauge just how good previous versions of the article were. A good example of this was Christmas, which was a very different article when it was on FAC than it is currently. A revision marked as stable would have been a very good idea on this article, and we could have just done a diff to see what had changed substantially. - Ta bu shi da yu 09:18, 2 January 2006 (UTC)


 * That's a good idea. It was partially implemented a couple of weeks ago in the FA tag, but the "version" link only calls up the entire history page; it's not a direct link to the promoted version. Perhaps going through the current 800+ FAs and putting a link to the promoted version (easily found in History) on the Talk page would provide some relatively quick and useful "data" relating to AEQ and the "articles hit a peak and then slide downhill" hypothesis... --Tsavage 17:35, 21 January 2006 (UTC)

A tiny bit of somewhat quantitative input: How FAs fare over time...
Maybe already done elsewhere, but from this discussion I was moved to start a simple tracking survey of Featured Articles, to see what indeed happens to them over time. I've only covered under a dozen, but I think it's already interesting, and I suspect the trend won't change much as the sample expands. It's here for now: Featured Article quality tracking. --Tsavage 06:24, 24 January 2006 (UTC)

Basic Wikipedia editorial stats
Having some general figures available to inform the discussion seems like a good thing. I don't know if these are readily available somewhere already, but here's a starter list:
 * Number of articles: 946,207
 * Number of stubs:
 * Number of Featured Articles: 878
 * Number tagged for editorial dispute: (POV, merge, etc.)
 * Number tagged for administrative action: (in process for AfD, redirect, ...)
 * Number of unstable articles: (perhaps a function of number of edits over a period)
 * Graph of article creation and deletion over time (would probably include merges as deletions)

A current snapshot would be good, but also, having daily or weekly updates somewhere on an ongoing basis would be a useful tool as well. --Tsavage 22:59, 30 January 2006 (UTC)
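As a quick sanity check on the featured-article ratio mentioned earlier, the two figures already listed above are enough (a throwaway calculation using only the snapshot numbers on this page):

```python
# Featured-article ratio from the snapshot figures listed above (early 2006).
articles = 946207   # total articles
featured = 878      # featured articles

ratio = featured / articles
print(round(ratio, 6))  # roughly 0.000928, i.e. below the 0.001 cited earlier
```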