Wikipedia talk:Image content guidelines

Discussion subpages
Please discuss the following issues at their subpages, if they exist. Thanks. Carcharoth (talk) 23:11, 9 May 2008 (UTC)
 * Explicit sexual content (discussion)
 * Explicit medical content
 * Images of identifiable people
 * Images depicting death
 * Images depicting violence
 * Images depicting religious figures

General discussion
General discussion about the proposed guideline should go here. Carcharoth (talk) 23:11, 9 May 2008 (UTC)


 * As this is a perennial discussion topic, perhaps we can link some previous discussions, and summarize their conclusions (or lack thereof), the concerns brought up, etc... What are the issues that need to be addressed? I've been fairly happy with "Wikipedia is not censored". I wouldn't mind the idea of having some images linked to a page instead of displayed on a page, if for no other reason than to allow someone to read the page in a library or other public place without others (especially very young children) seeing them. -- ☑ SamuelWantman 23:36, 9 May 2008 (UTC)
 * Yes, and the perennial problem with that is where to draw the line in a neutral, NPOV manner. I actually don't think it's such a bad idea, necessarily, if it would be limited to IP browsing only; individual users could set all images to default on. The intractable issue then becomes what do we tag as "non-display" - what is "NSFW?" Maybe we can come to a new consensus on that. I'm not optimistic, though. FCYTravis (talk) 23:56, 9 May 2008 (UTC)
 * There is something I still do not understand: if you are at work or in a public place and type "penis" into a computer, what exactly do you expect to see on the screen? The same applies to "illustrated list of sex positions", "mammary intercourse", or browsing the group sex images category. It is your fault if you request this kind of information from Wikipedia while at work (and shouldn't you be working and not browsing Wikipedia anyway?) Iunaw 04:20, 16 May 2008 (UTC)


 * Strongly reject. How many times are we going to have this discussion? How many layers of bureaucracy are we going to keep adding to this site before it becomes a place where it is difficult to do anything? This is unnecessary; we have come very far on this site by allowing more freedom, not less. WP:BOLD has served us well. There are far bigger issues on this site, namely the problems with BLPs, than creating a requirement for those of us who wish to add photographs that does not apply to those who wish to add text. This is unnecessary, and appears to be more a response to Concerned Women for America than anything else. In many ways, I feel Wikipedia is losing its way - too many of us spend far too much time on discussion pages, too much time creating ever more complicated reference templates and trying to figure out ways to hamstring contributions, so that in the end the site is no longer fun. Exactly because of proposals like this. This contradicts existing policies and guidelines. It contradicts what we are about. -- David Shankbone 23:39, 9 May 2008 (UTC)
 * I'm actually trying to reduce the endless discussions by starting this guideline. It's more a centralised place to archive discussions and stuff, rather than come to any lasting conclusions. The reason the discussions endlessly repeat themselves is because no-one actually records the arguments in a central place. The Signpost tipline page wasn't the right place to have that discussion, so I went looking for a place to have the discussion and failing to find one, I created this. As an example, the Mohammed image discussions got lost somewhere, and for the life of me I cannot find them to link them from somewhere. Anyone? Carcharoth (talk) 00:18, 10 May 2008 (UTC)
 * Is this what you were looking for: Talk:Muhammad/images? Polly (Parrot) 01:18, 10 May 2008 (UTC)
 * Or maybe this one. Polly (Parrot) 01:36, 10 May 2008 (UTC)
 * I mostly agree with David. It seems to me that this has the potential simply to increase the amount of arguing and edit warring over image use. Defining objectionable images is very subjective. Any attempt at specifics would probably be met with a dozen counter-examples, and vagueness will just lead to arguments and wikilawyering. Mr.Z-man 23:51, 9 May 2008 (UTC)
 * "arguments and wikilawyering" - isn't that what we already have? Carcharoth (talk) 00:18, 10 May 2008 (UTC)
 * Yes, however, I think creating a guideline based on vague subjective opinions or strict definitions may make it worse instead of better. Mr.Z-man 00:23, 10 May 2008 (UTC)
 * Would you support an archive of past discussion to help guide people in future discussion? And what would be the tag for that? Would it still be a guideline? A FAQ? An archive? Carcharoth (talk) 00:25, 10 May 2008 (UTC)
 * David, you mention BLP issues. On the front page, I noted "Images of identifiable people" (i.e. personality rights). Is that a BLP issue? If so, just a short summary here, pointing to the relevant part of the BLP policy, will do. Does WP:BLP already cover how to handle images of living people? If not, why not? If this needs further discussion, I suggest a subpage, similar to the one I created for the discussion of explicit sexual content. Carcharoth (talk) 00:22, 10 May 2008 (UTC)
 * I mention the vandalism and false information that can languish on BLPs, not images. I simply could not imagine every photograph of a living person needing community consensus to be used. That does not make sense to me, either. Personality rights come into play with the selling of a product. So even though my image of Madonna is licensed GFDL, people could not use it to sell vaginal cream with my photo of Madonna on it - she owns her image. I own an original photograph of her, which I have licensed so others may use it (and a damn good one, I think). We aren't selling products - we are illustrating articles. -- David Shankbone 01:25, 10 May 2008 (UTC)
 * I'm not talking about your pictures, David, I'm talking about pictures of borderline BLPs who didn't want their photograph taken. If a living person asks for a picture of them to be taken down, what should our response be? Carcharoth (talk) 07:36, 10 May 2008 (UTC)
 * If the photo is generally replaceable and not of distinct historical value, I think we should take it down, unless there's a compelling reason not to do so. Yes, technically if you're in a public place you can be photographed without recourse (at least in the United States), but I think this falls under the "why needlessly piss someone off" category. If some average Joe photographed on a bike objects to their photo being used on the Cycling article, for example... well, it's trivial to go get/find another photo of someone on a bike. I don't think this applies to non-borderline biographical pages - generally encyclopedic people should have a photograph with their bios. FCYTravis (talk) 15:27, 10 May 2008 (UTC)


 * Oppose. Contrary to its aims, this suggestion seems to me to only have the possibility of increasing heated debate over content issues.  I think the community handles such questions reasonably well as it is.  Ford MF (talk) 20:52, 10 May 2008 (UTC)
 * I'm a bit torn here, because I think the proposed guideline at the moment does a good job of summarising some image-related stuff that is spread around on different pages, but, having rummaged around various places, I now realise that trying to create a centralised archive of such debates is an endless Sisyphean task. I recently found User:Markaci/Nudity, a collection of nude images that casts quite a broad net, but that is interesting for its demonstration that nudity appears in images in many forms. It is a rather old list, though, and it hasn't been maintained or updated for several years (the user appears to have left). An equivalent list today would be much larger. Carcharoth (talk) 22:10, 10 May 2008 (UTC)

NOT
I've got a serious concern that whatever we put here goes against the spirit of Wikipedia not being censored. We have articles on controversial subjects that some media may censor. That is probably part of what makes us as respected as we are today. Without covering these topics, we wouldn't gather as much free knowledge as possible.

To round out these articles, we need pictures, and sometimes these pictures may go against people's personal views or tolerance levels, but they add to the overall goals of Wikipedia.

Unless an image or other media file is illegal, we shouldn't be stopping its use in the encyclopedia. Ryan Postlethwaite 01:13, 10 May 2008 (UTC)

I agree. We already have guidelines for censorship here, which has always been at NOT#CENSORED. Wikipedia is not censored for good reason. No single group can get a leg up on dominating content here. Once we open up the floodgates of only allowing certain kinds of content and people find out about it, we're going to start having to deal with people who would have us become Conservapedia. Unless it is illegal, what reason is there not to use it in the encyclopedia? Celarnor Talk to me 01:24, 10 May 2008 (UTC)

That said, of course, if someone wants an encyclopedia that doesn't have any depictions of religious figures, they can fork the encyclopedia and change the articles as they see fit, in line with whatever policies they choose to adopt. Similarly, if someone wants to create an encyclopedia that is completely without reference to reproduction, they're free to do so. The main project, however, should never adopt any form of censorship. We aren't subject to the whims of evangelicals, Muslims, or any other group. If they want a Wikipedia that is, they can fork ours and make their own. Celarnor Talk to me 01:59, 10 May 2008 (UTC)

Misunderstandings
I think people have misunderstood the purpose I had in mind when starting this page - it was mainly meant to provide a centralised place for discussions. In other words, this is not about censorship, but about (a) centralising various image discussions where the content was controversial; and (b) extracting and documenting any common or perennial arguments. This guideline could be used to reinforce NOT if so desired, but the current form of that section is rather broad and does recognise that some content is removed, so guidelines on what is normally removed may still be helpful. Failing that, some pointers to what was removed or kept in the past will be helpful, I think. My concern is that the debates swirl around certain causes célèbres and other stuff gets missed. The recent (still ongoing) controversy was about the music album depicting a naked child on the front cover. Another debate, which I copied to here, has links to images that are far more explicit than the naked child one, but that debate has had less input so far (though some people have commented). Would the people commenting here like to look through those (sometimes explicit) images and see where they draw the line? Does the educational value of the pictures outweigh other concerns? Carcharoth (talk) 07:35, 10 May 2008 (UTC)

Previous discussions
Please help out here by providing links to previous discussions. This section will be very incomplete unless people go looking for the previous discussions! Carcharoth (talk) 07:49, 10 May 2008 (UTC)

List
Previous discussions exist for these images, but it is probably not necessary to link to all of them, as the arguments do tend to repeat after a while. If you want to track down every last discussion, then please do, but, overall, breadth of examples is probably better than detail for a single example. Carcharoth (talk) 07:49, 10 May 2008 (UTC)

Last updated: 03:53, 11 October 2008 (UTC)
 * Wikipedia talk:Images and media for deletion/Archive 2
 * Wikipedia talk:Images and media for deletion/Archive 3
 * Talk:Muhammad/images
 * Village pump (proposals)/Archive 21
 * Talk:Virgin Killer/Archive 1
 * Images and media for deletion/2007 November 27
 * Images and media for deletion/2008 May 8
 * Deletion_review/Virgin_Killer
 * Talk:Tomoko Uemura in Her Bath
 * Talk:Autofellatio
 * Talk:Gerri Santoro

More examples
The most recent discussions have thrown up some interesting examples of the depiction of nude children in images and artworks. Here are the articles and the images, some non-free (hosted by Wikipedia) and some free (hosted by Commons). These are only four examples, all available outside of Wikipedia. There are probably more, but this should be enough to demonstrate the range of different contexts in which such images are taken and used. See User:Chzz/virginkiller for further examples, plus (for more general "bad images" and examples of the range of different "nudity" images) MediaWiki:Bad image list and User:Markaci/Nudity. I found the user pages while following "what links here" for some of the images. Carcharoth (talk) 21:41, 10 May 2008 (UTC)
 * L'Amour et Psyché, enfants (image)
 * Phan Thị Kim Phúc (image)
 * Virgin Killer (image)
 * Nevermind (image)

censored

 * This is, frankly, cobblers. Isn't it about time to say that Wikipedia obeys the laws of the place where the servers are located, and that some things will be removed? The list of removable stuff includes, but is not limited to: personal attacks, BLP violations, copyright violations, guro and shock site links (in inappropriate articles), advertising, nonsense, hoax articles (not articles about hoaxes), and child porn and possibly anything that resembles child porn. You can noodle about the meaning of porn as much as you like. English editors have the restriction of the Sexual Offences Act 2003; editors from other nations will have similar legal restrictions. This isn't an unusual censorship - most places have strict anti-CP laws.
 * Important: I'm not suggesting a big change to policy, but I would like to see something a bit clearer that stops idiots from using "not censored" to defend clearly inappropriate material. Dan Beale-Cocks 12:17, 10 May 2008 (UTC)


 * Agreed. The fact is, Wikipedia is censored -- we remove content that is libelous, copyrighted, inaccurate, unproven, biased or "unencyclopedic." Yet whenever it comes to pornography, any reasonable attempt to address the issue is drowned out by people yelling "NOTCENSOREDNOTCENSOREDNOTCENSOREDNOTCENSOREDNOTCENSORED..."
 * The current policy is just way too broad. We need a policy that clearly states what kind of oversight is and is not acceptable. The policy used to read "Wikipedia is not censored for the benefit of minors" but was shortened during the Muhammad cartoons controversy. Although I didn't really like the old wording, I think simply cutting the policy down to four words was a mistake. We need to be clear: No notable subject matter is off-limits for Wikipedia, but this does not mean that all types of images are appropriate for Wikipedia and should not mean that reasonable precautions cannot be taken to avoid shocking readers with unexpected NSFW content.
 * There are two issues here. One is with pornography. As Jimbo Wales has pointed out, pornographic images can get Wikipedia into legal difficulties. Notably, we could wind up subject to the Child Protection and Obscenity Enforcement Act, which requires sites to keep records of their pornographic "actors." I know Jimbo is not a god, but from his background, he should know the laws governing Internet porn.
 * The other issue is with NSFW (not safe for work) content, like the naked woman who used to be at the top of the woman page. This type of material may be appropriate for Wikipedia, but sometimes there are issues regarding the picture's placement or the appropriateness of a particular image for a given article. The "NOT CENSORED" dogma sometimes gets in the way of helpful discussion to resolve the dispute.
 * So while I agree with the concept of "not censored," I think the wording of the policy has to be fine-tuned. -- Mwalcoff (talk) 22:59, 11 May 2008 (UTC)

Here or there?
User:David Shankbone moved the above section from Wikipedia talk:What Wikipedia is not to try to consolidate things. But the above discussion refers specifically to the policy "Wikipedia is not censored," rather than to image content guidelines. Should the discussion, therefore, be here or at WP:NOT? Considering that much of the debate regarding images centers on the meaning of "Wikipedia is not censored," and whether that policy needs to be made more specific, perhaps we should put this page on hold and go back to WP:NOT. -- Mwalcoff (talk) 03:48, 12 May 2008 (UTC)
 * We need to keep the discussion about what we are going to include on Wikipedia together - clearly image content is related to "not censored". The idea behind not having multiple forums is that we don't want a situation where we split the discussion so finely that we end up re-litigating arguments that have only minor degrees of difference. This works in everyone's favor. -- David Shankbone 11:23, 12 May 2008 (UTC)
 * But I have an issue with the broadness of the policy language "Wikipedia is not censored." I'd like to have a discussion about that language specifically without having the conversation moved somewhere else. -- Mwalcoff (talk) 10:31, 15 May 2008 (UTC)

The real question: What is pornography?
I specifically uploaded images of the making of an adult film that I think demystify the process of what goes into the filming of the genre. The photos were taken at the studio of a major adult film company, with major adult film stars, directed by a major adult film director. There are a couple of users in the minority who label these photos 'hard core gay pornography' (take your pick of prejudicial term), but they are not. I would like to point readers to the Stanford University encyclopedia of philosophy to help guide the discussion. The definition of pornography it eventually arrives at is "pornography is sexually explicit material designed to produce sexual arousal in consumers that is bad in a certain way." I don't think the photographs on Pornographic film, Fluffer, Pornography, Gay pornography or any other image actively used on our articles qualify by this standard. They are clearly not meant to elicit sexual arousal, and indeed the presence of so many people in most of the photos removes, for most people, the ability to fantasize about the scenario. The focus in the photographs is on the people surrounding the action, not the sexual acts. In this regard, the photographs are educational and don't appeal to the prurient interest, but seek to demystify and expose the process of adult film making. -- David Shankbone 11:23, 12 May 2008 (UTC)

Wikipedia is not censored, but is it alright to allow users to self-censor?
First, I want to make it clear that I don't think Wikipedia should be censored. I think the topic for discussion is deciding if it would be alright to allow some Wikipedia users the ability to decide for themselves (and their children) not to see certain images on their own computers. To this end, I'd like to hear reactions to the following proposal:
 * 1) Wikipedia continues its current policies for deleting or keeping photos.
 * 2) Any photographs including human genitals should be displayed on article pages via a template. The template would behave differently depending upon a user preference. The default preference (DISPLAY IMAGES WITH HUMAN GENITALIA) would be to display the image in the article. There would also be a setting to create a link to the images (LINK TO IMAGES WITH HUMAN GENITALIA), which would require a click on a link to get to the image page. There would also be a setting to not display these images under any circumstance (DO NOT DISPLAY IMAGES WITH HUMAN GENITALIA). In this case, there would be neither an image nor a link shown on the article page, and if possible, image pages could be flagged so that they would not be displayed.
 * 3) Images in which genitals are not the focus or primary subject matter of the photo AND the genitals are a relatively small detail in the photo AND the photo illustrates humans engaged in non-sexual behavior (similar to the standards of National Geographic Magazine) could be used on any appropriate page without the template.  For example, skinny dipping, the indigenous people of Papua New Guinea, etc...
 * 4) All drawings, paintings and art works in other media could be displayed directly on a page without the template, even if the genitals are a large part of the image, as long as the image is not a photograph or a medium employing photo-realism. If the images are photo-realistic, they should be evaluated using #2 and #3 to decide whether to use the template.
 * 5) If any other technical means is developed to filter images based on user preference, it could be implemented as long as it is functionally equivalent to this proposal.

Could something like this be a workable compromise? If not, why not? -- ☑ SamuelWantman 07:49, 13 May 2008 (UTC)
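The three-way preference in point 2 could be prototyped as a small decision function like the following sketch. The preference names and the "flagged" marker are hypothetical illustrations of the proposal, not an existing MediaWiki feature:

```javascript
// Hypothetical per-user preference values from the proposal above.
const PREF = { DISPLAY: "display", LINK: "link", HIDE: "hide" };

// Decide how a templated image should be rendered for a given user.
// Untemplated (unflagged) images are always shown inline.
function renderMode(imageIsFlagged, userPref) {
  if (!imageIsFlagged) return "inline";
  switch (userPref) {
    case PREF.DISPLAY: return "inline"; // default: show in the article
    case PREF.LINK:    return "link";   // replace with a link to the image page
    case PREF.HIDE:    return "omit";   // neither image nor link
    default:           return "inline"; // unknown preferences fall back to showing
  }
}
```

The key design point is that the default is "inline", so readers who never touch their preferences see exactly what they see today.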


 * That sounds reasonable to me. I suppose everything in Category:Bad images could be handled this way by convention.--Father Goose (talk) 08:32, 13 May 2008 (UTC)
 * I slightly modified #4 to make it a little clearer. -- ☑ SamuelWantman 20:15, 13 May 2008 (UTC)


 * Good, but it requires people to create an account to use it. Mr.Z-man 20:52, 13 May 2008 (UTC)
 * Hm... I wonder what the effect would be of requiring people to create an account to see naked pictures. The default could be no images for anons, and everything shown for accounts. It might be an effective way to control adolescent vandals... -- ☑ SamuelWantman 22:09, 13 May 2008 (UTC)
 * There's already a process by which people can set up their accounts so as not to show images in Category:Bad images, so this discussion is really of no utility. Also, I'm a little disappointed to see this discussion once again devolving into a way to overturn WP:NOTCENSORED. "Not censored" means "not censored". You're not going to gain a consensus for censoring Wikipedia for IPs, so get over it. --Steven J. Anderson (talk) 23:01, 13 May 2008 (UTC)
 * There is already a process? What is it? Perhaps all that is needed is publicizing that process. -- ☑ SamuelWantman 00:57, 14 May 2008 (UTC)
 * I wrote a user script so that images on the Bad image list won't be loaded on any page, including the ones they are specifically allowed on: User:Mr.Z-man/badimages. A similar script could be written to get images from the category, but the bigger the category gets, the bigger the effect on page load time, as it has to compare every image on the page with every image on the list. Mr.Z-man 01:26, 14 May 2008 (UTC)
 * I stand corrected. It's the list, not the category. --Steven J. Anderson (talk) 02:24, 14 May 2008 (UTC)
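The load-time cost described above comes from comparing every image on the page against every entry on the list. A sketch of how a script could reduce that: normalise the list into a Set once, so each image costs one constant-time lookup instead of a scan of the whole list. The file names here are invented for illustration:

```javascript
// Build a lookup set from a list of file titles. MediaWiki treats
// underscores and spaces in titles interchangeably, so normalise both sides.
function buildBlocklist(titles) {
  return new Set(titles.map(t => t.replace(/_/g, " ")));
}

// Return only the images on a page that are NOT on the blocklist.
function filterImages(pageImages, blocklist) {
  return pageImages.filter(t => !blocklist.has(t.replace(/_/g, " ")));
}
```

With this structure, growth of the list (or category) adds only to the one-time build step, not to the per-image check.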

I broadly agree and think that while we should not directly censor Wikipedia (beyond reason, e.g. not putting in profanity for its own sake), we should offer methods of self-censorship. It means that while people in general should see everything, logged-in users might decide to censor themselves from seeing something they would find distasteful. As it affects only them, it would not amount to censoring Wikipedia, but merely helping people out in their personal decisions to avoid particular content.

It seems to me that this would be best achieved through a Gadget and templates on a page - that way a script might run simply by hiding or collapsing the content to be censored, without the use of some particular list, and images which are not specifically "bad images" (e.g. the image of Muhammad, Image:Maome.jpg) could be collapsed or hidden under different subcategories, based on additions to the script that might simply call a censor function to be run on that particular class. Nihiltres { t .l } 17:27, 14 May 2008 (UTC)
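The class-based approach described above could be sketched like this, assuming hypothetical template-applied class names such as "censor-nudity" or "censor-religious" (these names are invented; only the matching logic is shown, with the DOM collapsing omitted):

```javascript
// Decide whether a template-tagged wrapper should be collapsed for this
// user. wrapperClasses: the CSS classes on the wrapper element;
// hiddenCategories: category names the user has opted to hide.
function shouldCollapse(wrapperClasses, hiddenCategories) {
  return wrapperClasses.some(c =>
    c.startsWith("censor-") &&
    hiddenCategories.includes(c.slice("censor-".length)));
}
```

Because the categories live in the templates rather than in a central list, adding a new class requires no change to the script itself, which matches the "call a censor function on that particular class" idea.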


 * While a gadget would make it easier to turn on such a feature, it's still going to do nothing to silence people who think nudity is a sin, and I doubt casual users who might want to blank out such images will even realize the gadget is there. Do programs like Net Nanny have the ability to block sites on a per-page basis?  That might be the most realistic option.--Father Goose (talk) 19:13, 14 May 2008 (UTC)

This is along the same lines as the "self-censorship" mechanism I've proposed. My idea is that any page with potentially NSFW content would say at the top, "This page includes an image of . To view the page without the image, click here." Clicking on the link would bring up exactly the same page but without the image in question. The advantage of this idea is that it would give the user an opportunity to make a decision page by page. This is important, IMO, because NSFW content is often found where users may not expect it, such as at man.

The example I always use is when I was reading the article on Oscar Wilde, which said he was arrested for buggery, a term unknown to my American ears. So I clicked on it and was led to anal sex -- at work! Fortunately, there were no images "above the fold," but it goes to show that NSFW content can be found anywhere. Many users would fail to turn on the "no genitalia" function, thinking that the pages they're going to visit won't have genitalia -- only to find NSFW content in unexpected places.

That's why I support a solution where the decision to self-censor is made on the page visited rather than ahead of time. The user-settings option also may prove somewhat difficult to implement, because we'll have to decide what should be a category, how many categories there should be and which categories a particular image falls into, plus all of the difficulties inherent in a solution based on tagging individual images. Not to pile on Sam's good idea, but it also would be of no help to unregistered users, who I would think make up a good share of the casual readers likely to run into unexpected NSFW content.

On the other hand, my proposal admittedly suffers from the weakness of requiring a policy on what counts as "potentially NSFW" and requires page-top disclaimers, which really rile some people up for some reason. IMO, we can deal with the problem of what counts as potentially NSFW simply by adopting the policy of a respected and appropriate organization. When my workplace had sexual-harassment training (or sexual-harassment-avoidance training, if you will), we were given a definition of "NSFW" that I believe was adapted from Fark.com, the supposed inventor of the NSFW acronym. If the Fark definition was good enough for a firm that advises businesses on workplace policy, it may be good enough for Wikipedia. -- Mwalcoff (talk) 10:50, 15 May 2008 (UTC)


 * Perhaps I'm being misled somehow, but you say that decisions to self-censor should be made on each page rather than ahead of time, on the basis that a link to a new page might be surprisingly NSFW? This seems counterintuitive. Nihiltres { t .l } 21:26, 19 May 2008 (UTC)


 * My suggestion is that NSFW content should be placed "below the fold," so you have to hit "Page Down" to see it on a standard landscape monitor. The warning note would be at the top. -- Mwalcoff (talk) 00:33, 20 May 2008 (UTC)


 * Options to not see an image; No disclaimers in articles. Iunaw 14:25, 28 May 2008 (UTC)

How about this?
I have an idea which I hope may solve (or at least reduce) the ongoing debate about whether or not Wikipedia should be censored (it shouldn't be). If there is an image which may be offensive, why not hide it at first and put a little (show) button next to it, like category boxes (if that's what they're called)? That way, it (technically) won't be censored, but people who don't want a penis (or whatever) staring them in the face when they're trying to read an article don't have to see it. Ilikefood (talk) 22:42, 15 May 2008 (UTC)
 * This is exactly the kind of thing, like the proposal in the section above, that has been repeatedly proposed and rejected by community consensus. --Steven J. Anderson (talk) 23:03, 15 May 2008 (UTC)
 * Well, I think it's time to revisit this "community consensus." It's obviously not a consensus when we've got users chiming in on several pages (Village Pump, Reference Desk, this page and elsewhere) questioning it. I think we have a silent majority of users (not necessarily active editors) who support a reasonable non-censorious solution to NSFW content and an extremely vocal minority of editors who dogmatically try to shout down any compromise. -- Mwalcoff (talk) 23:27, 15 May 2008 (UTC)
 * While I agree that we should provide a means to self-censor, the methods you suggest, Mwalcoff, are out of the question. Disclaimers in articles and content forks are both things which are broadly accepted as not good practice. If we really need a method that anyone (including anons) can use, try thinking outside the box: all users who have access to JavaScript have access to Common.js. Nihiltres { t .l } 02:54, 16 May 2008 (UTC)
 * Not to mention that, Ilikefood, direct censorship isn't the answer as we then are making value judgements for people directly, which is unacceptable. Nihiltres { t .l } 02:57, 16 May 2008 (UTC)
 * I never said to make any kinds of moral judgements, just to have an option to hide whatever kind of images you may not want to see. Perhaps there could be an option to not immediately display images from certain categories? You're absolutely right that we shouldn't make judgements for people. They should choose what to display themselves. Ilikefood (talk) 18:35, 19 May 2008 (UTC)
 * Any solution has to be super-simple for people with limited computer skills to use. I don't consider myself in that category, but I have no idea what "common.js" is. I'm open to all suggestions, but it's got to be simple. -- Mwalcoff (talk) 03:42, 16 May 2008 (UTC)
 * MediaWiki:Common.js is the JavaScript applied to all users. It would mean that no user would have to install any code whatsoever - for people with JavaScript, it would "just work". Nihiltres { t .l } 17:33, 16 May 2008 (UTC)
 * Feel free to suggest it again, but don't be surprised if it gets rejected again. From what I've seen, it's quite a bit more than a vocal minority that rejects any form of censorship on Wikipedia, including what you propose: "opt-in" schemes for content some may find offensive.--Father Goose (talk) 02:06, 20 May 2008 (UTC)
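On the Common.js suggestion above: since MediaWiki:Common.js runs for every reader, logged in or not, a site-wide script could honour a preference stored in a cookie set by a simple on-page toggle, which would also cover anonymous readers. A sketch of reading such a preference from the raw cookie string (the cookie name `hideExplicit` is hypothetical):

```javascript
// Parse a hypothetical hideExplicit cookie out of a document.cookie-style
// string. Returns true only when the cookie is present and set to "1";
// the default, with no cookie, is to show everything.
function readHidePref(cookieString) {
  const m = cookieString.match(/(?:^|;\s*)hideExplicit=([^;]*)/);
  return m ? m[1] === "1" : false;
}
```

A Common.js snippet would then call something like `readHidePref(document.cookie)` on page load and collapse the relevant images only when it returns true, keeping the uncensored view as the default for all readers.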

Opinions requested
I tend to agree that any type of "opt-in" system will not garner support. I would like to hear opinions about the concept of having an "opt-out" system of some sort. I don't want to design it here and now; I'd rather that we discuss this in more abstract terms. For instance, if you object to any form of an "opt-out" system, why? And what features or criteria would an opt-out system have to have for you to find it acceptable? -- ☑ SamuelWantman 05:31, 20 May 2008 (UTC)
 * I'll start this out with my own opinion, which is that I would find an opt-out system acceptable if there were clear guidelines for what could be filtered, such that it won't provoke POV battles to determine what gets included and what doesn't. I'd also want to see those filters aimed at populations that are not determined by political or religious considerations. For instance, I'd personally like a filter for young children that filters out sexually explicit images. It is not because I find them objectionable, but because I spend a bit of time with my 8-year-old godchild and would like to make it easy to set things up so that she could go to any page without me wondering what she might stumble across. Similarly, a filter for NSFW content might be reasonable. I'd like to be able to switch these filters on and off quickly and easily. I don't mind that they wouldn't work for anon editors. I'm fine with presenting the feature as a benefit of creating an account. -- ☑ SamuelWantman 05:31, 20 May 2008 (UTC)
 * I understand your concern and appreciate why you'd like a "filter for young children." However, I think it might be impossible to determine what is "appropriate for children" given the widely varying opinions on that subject. NSFW, on the other hand, is a term invented by Fark.com, or so they say, and seems to have a rather consistent meaning. There's less value judgment involved in deeming something potentially NSFW as opposed to inappropriate for a particular age group, I would fathom. -- Mwalcoff (talk) 05:06, 21 May 2008 (UTC)
 * I would not define it as a filter for young children. That would be too POV to make workable.  Instead there would be clear criteria for certain types of pictures.  Further up the page, I mentioned photography of human genitalia as one such class.  If we define the class well, it would be clear what belongs and what doesn't.  I think emulating the standards of National Geographic would be workable.  It would then be up to the individual to decide if the class was appropriate for children, for work or whatever... -- &#x2611; Sam uelWantman 09:10, 21 May 2008 (UTC)
 * Could you please clarify what you mean by "opt-in" and "opt-out"? By "opt-in" do you mean that no one has images censored except those who decide to have them censored?
 * I'd also like to point out that using MediaWiki:Bad image list as a working list of images to be censored has a fundamental flaw; it was never created for that purpose. It is, by no means, a comprehensive list of objectionable images, never has been, and was never meant to be. The only reason that list exists is to create a mechanism for fighting vandalism by establishing a mechanical way to prevent certain images from being posted to most pages on Wikipedia. It works like this. Suppose someone vandalizes your user page or one of your favorite articles by persistently putting a large picture of a penis on it. You can ask an admin to add the image to the bad image list. If he does that, this prevents the image from being added anywhere except to the articles that are specifically mentioned as acceptable for its use. This (as I understand it) is usually done as a response to vandalism that has actually happened. Sometimes images are removed from the list after the threat of vandalism has passed. It's at least theoretically possible that if someone started persistently vandalizing your page with a picture of, oh say, Bill O'Reilly, that you could get it added to the list. The script that's used to block images on that list puts it to a use unrelated to the purpose for which the list was created and is really just a fast-and-dirty way to give users a way to avoid some of the images that they might object to. --Steven J. Anderson (talk) 06:40, 21 May 2008 (UTC)
 * Opt-in means that the default is that you would not see all images, and you would have to choose to see them. An opt-in system implies that the default is censorship, and this is routinely denounced.  Opt-out means that the default is that you would see everything unless you had chosen not to.  It would be a personal preference that you could set. -- &#x2611; Sam uelWantman 09:10, 21 May 2008 (UTC)
 * Thank you for clarifying. In that case, I am entirely opposed to the plan you propose and have no doubt that it would fail an RfC as every similar proposal has failed. --Steven J. Anderson (talk) 18:08, 21 May 2008 (UTC)
 * I am trying to discuss this issue in general terms, and have asked for comments as to WHY any opt-out system would be objectionable, or what it would take to make it acceptable. Can you elucidate your objections? -- &#x2611; Sam uelWantman 19:22, 21 May 2008 (UTC)
 * Please read WP:NOTCENSORED. The change you advocate would trash this long-standing policy that has served Wikipedia well for years. --Steven J. Anderson (talk) 20:05, 21 May 2008 (UTC)
 * As noted above, I believe the current wording of the censorship policy was adopted in response to the Muhammad cartoons controversy -- it had little to do with NSFW imagery. It is not at all clear what "not censored" really means. After all, Wikipedia clearly is censored in many ways already. That's why I think we need more discussion and a better-written policy on "censorship" before we can really address image content guidelines. -- Mwalcoff (talk) 23:08, 21 May 2008 (UTC)
 * I've read WP:NOTCENSORED several times, and have been part of previous discussions several times over the past 4 years. I am a firm believer that Wikipedia should not be censored.  However, I don't believe that giving individuals an easy option to filter out images that they choose not to see would constitute censorship.  On the contrary, I am advocating an opt-out system as a way of bolstering our non-censorship policy.  With such a filter we would be able to tell people who don't want to see certain images that they have an easy option for filtering them from appearing in their account.  I don't care what people choose not to see for themselves as long as they don't impose their opinions on everyone.  And even though I personally have no problems with ANY image that has ever been in Wikipedia, there are occasions and situations where I might choose to filter out images temporarily.  So I don't see how the "not censored" policy would change.  Why do you think that offering people a way to filter images for themselves is censorship? -- &#x2611; Sam uelWantman 00:38, 22 May 2008 (UTC)
 * First, in response to Mwalcoff, you have gotten the history of the censorship policy exactly backwards. It used to read "Wikipedia is not censored for children." Then, after the controversy surrounding the Muhammed cartoons, the policy was expanded to read simply "Wikipedia is not censored."
 * Second, in response to Sam, I apologize, I think I got turned around by the terminology. As I now understand it (correct me if I'm wrong), what you're advocating is an "opt out" system, meaning that users who wish not to see sexually explicit images can "opt-out" of seeing them. Although, without speaking for anyone else, I find that fundamentally unobjectionable, I'll point out that the best and simplest "opt out" system is to simply stay away from articles like penis and pornography. Also, as I pointed out above, there already exists a crude opt-out system in the form of a script that blocks images on the bad image list. I also pointed out the limitations of that system. Further, in order to get a more thorough system like that to work, someone would have to come up with a list of blockable images, someone would have to write the code to block it, someone would have to maintain the list as new images are posted, someone would have to mediate disputes about what belongs on the list and what doesn't. In short, someone or some group of people would have to become Wikipedia's official censor. Either that, or face endless bickering about what belongs on the list. And probably both. --Steven J. Anderson (talk) 01:30, 22 May 2008 (UTC)
 * Steven -- I knew precisely the history of the "not censored" policy. I meant that the current, four-word policy was a result of the cartoons controversy. Before then, it said, "Wikipedia is not censored for the benefit of minors," a policy that might have allowed for opt-out self-censorship to avoid NSFW images. The current policy is so broad and vague as to lack a clear meaning. -- Mwalcoff (talk) 01:39, 22 May 2008 (UTC)

(outdenting) Steven, Yes, someone would have to write the code, maintain a list, mediate disputes, etc... Isn't that what we already do in every corner of Wikipedia? As for making this workable, that is why I think any standards would have to be objective NPOV criteria that are not based on what is objectionable. What I am certain of is that any such system would inspire many people, like you, to keep an eye on it so that it did not expand beyond being a system for self-censorship. As you say, a crude version of this already exists as a script. All I am advocating is that this become easy enough to use so that it can be turned on and off with a click on a preference page, and that we publicize it as a way of deflecting calls to actually censor. -- &#x2611; Sam uelWantman 05:22, 22 May 2008 (UTC)

Back in a previous discussion about image censorship, I suggested a simple multiple opt-out system. Via the gadget tab of the user preferences page, users could enable specific image opt-out settings. These would each block images belonging to a specific category. This has a number of advantages associated with it. Firstly, by using the existing categories system, setting up a block list would be simple. Secondly, maintenance of the block list would be as simple as adding an image to a category, and could be left to the users themselves. Thirdly, specific block options could exist for specific preferences. Finally, by having this as just yet another user account feature, this could be enabled by request on access by certain IPs, which could help prevent problems with Wikipedia access within libraries and schools. What do people think? LinaMishima (talk) 01:41, 8 December 2008 (UTC)