Commons talk:Sexual content/Archive 4
USC 2257
Do we have to mention that law in the policy or not? --TwoWings * to talk or not to talk... 07:14, 11 May 2010 (UTC)
- 2257 is... messy. Does WMF qualify as a "secondary producer"? If so, the resultant record-keeping requirements would have an overwhelming chilling effect on new contributions, and the penalties are very heavy. However, the definition described in the WP article seems to imply that there's only a requirement for commercial distributors, and only for photographic works. Constitutional challenges to the law have failed so far, despite an initial victory in the Sixth Circuit. What's the deal here? Dcoetzee (talk) 07:41, 11 May 2010 (UTC)
- Well I also thought that WMF couldn't be considered a secondary producer. But given the recent developments, I thought it did. So where are we now? If this law doesn't apply to us, why do we have to change our habits and proceed to massive deletions of photos showing sexuality? --TwoWings * to talk or not to talk... 08:58, 11 May 2010 (UTC)
- Jimbo made references to the "2257" (thanks for the link, it's hard for us non-US people following discussions where codenames are used), but only as a point of reference. Has somebody stated that the law concerns us? I think WMF should raise the issue if necessary, they have the responsibility and (access to) legal expertise. --LPfi (talk) 09:23, 11 May 2010 (UTC)
- AFAIK, nobody has said so, but it's also a bit beside the point. Jimbo made a number of references to 2257, but actually was talking about "images that trigger the 2257 record-keeping requirements". The criteria that determine this triggering and which he put on this page in his rewrite are actually defined in 18 USC 2256, not 2257. The 2256 definitions do not apply to drawings, cartoons, sculpture, and the like. (Compare Senate Report 108-002 and House Report 108-066.) For child pornography, there's an additional definition in 18 USC 1466A. It does cover drawings, paintings, and so on, and covers also depictions of imaginary people, but has an exemption for works of "serious literary, artistic, political, or scientific value". Lupo 14:41, 11 May 2010 (UTC)
Whether or not we are required to do record keeping is still highly disputable, and not something that we as a community can determine. Many people have informed Mike Godwin of the 2257 record-keeping requirements, and he has not once come out and stated that we as a project require such record keeping. Thus until the foundation is sued, or Mike says otherwise, we do not require record keeping. And as LPfi stated, Jimbo did not say that we were required to either; he was just using the requirements of that act. TheDJ (talk) 15:41, 11 May 2010 (UTC)
- I agree that we should not concern ourselves with 2257 without a specific statement from Mike Godwin saying that we ought to, but I really wish he'd say something about it. Mike always says as little as possible - good trait for a lawyer, frustrating for us. Dcoetzee (talk) 23:43, 11 May 2010 (UTC)
The Case for Using USC 2257 on Wikimedia Projects
A few of the advantages of voluntarily adopting USC 2257 record keeping are:
- It's a proven system of record keeping that verifies information like names of subjects, stage names, dates of birth, name of photographer, consent forms, and the location and date the photos were taken.
- The legal responsibility for the accuracy and content of 2257 records remains with the record holder, and personal identifying information of the subjects of the photos (and the legal responsibility) remains off-wiki.
- It helps fulfill the licensing requirements of Creative Commons, which say that our images must be available for commercial use; currently, however, our pornographic images cannot legally be reused in the US for commercial purposes because they lack USC 2257 records. This falls well short of our "free content" ideals (as well as Commons:Licensing).
- All primary producers of pornographic images in the US MUST keep records, even if the images were uploaded to Commons. For this reason, pornography transferred from Flickr without 2257 records should not be allowed.
- I would appreciate help developing this proposed policy. Thank you, Stillwaterising (talk) 06:22, 12 May 2010 (UTC)
- I strongly agree that 2257 documentation is something we should obtain whenever possible - but requiring it for all sexual photographs would just exclude too much content by raising the barrier to contribution too high. Dcoetzee (talk) 07:28, 12 May 2010 (UTC)
- Adopting that policy would mean deleting a lot of existing content. It would also make it more difficult to get some valuable content. If I contribute sexual content I prefer doing so anonymously and not keeping any documents about the participants (other than what I might put on the description page). The procedure would not benefit us at all if the records are not checked by Commons: it is as easy to claim the records are kept somewhere as it is to lie about the ages and consent without that procedure.
- Of course following the official procedure has the benefit that most such content would (probably) be legal to host also in the future (if the record-keeping requirements are followed in practice), but a recommendation to that end would suffice. And most probably people will move without notice and the records will disappear.
- If Commons takes the responsibility of archiving the records, then all the details will be readable by a number of persons here (and by the "Attorney General"). Although I do not mistrust these people in general, there is no guarantee no one would leak sensitive information (under special circumstances).
- --LPfi (talk) 07:45, 12 May 2010 (UTC)
- It's been confidently stated that we are under no legal requirement to implement § 2257. Am I correct in assuming that this is because § 2257 regulates commercial distribution of sexually explicit images, and we are not a commercial site? --JN466 12:48, 13 May 2010 (UTC)
- We distribute images that can be used commercially. Still we are not the end-user here. Esby (talk) 13:05, 13 May 2010 (UTC)
- We are volunteers, not the lawyers of Wikimedia. We should do nothing here without legal advice. There could be unintended consequences of any undertakings. Jehochman (talk) 13:27, 13 May 2010 (UTC)
- As I understand it, Mike Godwin has said it doesn't apply to us. Adam Cuerden (talk) 14:06, 13 May 2010 (UTC)
- User:Mike Godwin needs to put some input into this. There's a difference between required by law and voluntary use. The case for the advantages for voluntary use is strong, although it would be inconvenient. Images proven to be produced before July 3, 1995 would not need this information. - Stillwaterising (talk) 14:24, 13 May 2010 (UTC)
- Where are you getting the 1995 date? The statute text provided in the links above gives a date of November 1, 1990. Still, it is an important point to recognize that the "2257" law does not apply to all images, only to photos/videos and only to those produced after a given date. From my (non-lawyer) reading of the law, it would not apply to the WMF, as they are engaged in "transmission, storage, retrieval, hosting, formatting, or translation ... without selection or alteration of the content", an exempted activity. But it would apply to any US-based editors who upload images, as they are engaged in "inserting on a computer site or service", an activity covered under the law. --RL0919 (talk) 14:44, 13 May 2010 (UTC)
- There are many sources for the July 3, 1995 date, one of which is here. All "pornographic" images (and I do not wish to go into what is or isn't pornography here) also violate COM:PS#Required_licensing_terms, which requires "free reuse for any purpose (including commercial)." Since these images cannot be reused commercially in the US without 2257 information, this essential licensing condition is not met. - Stillwaterising (talk) 16:39, 13 May 2010 (UTC)
- The "Required licensing terms" text is about licensing, not about non-copyright restrictions. That some users in some country are prohibited from using some Commons' content does not mean we should not host it (USA is in no special position regarding reusers). --LPfi (talk) 19:38, 13 May 2010 (UTC)
- I agree with Jehochman here. If the received wisdom that "§ 2257 does not apply to us" turns out to be flawed, note that anyone
- inserting on a computer site or service a digital image of, or otherwise managing the sexually explicit content of a computer site or service that contains a visual depiction of, sexually explicit conduct
- can be imprisoned for up to five years for a first-time offence if they fail to comply with § 2257 record-keeping requirements. --JN466 21:54, 13 May 2010 (UTC)
- I've sent a mail to foundation-l (the Wikimedia Foundation Mailing List) asking for legal help. --JN466 22:32, 13 May 2010 (UTC)
First, I have to officially disclaim any notion that I'm acting here as a lawyer for editors -- I represent the Foundation only. That said, my view is that there is no Foundation or project obligation to keep records pertaining to the models in uploaded photographs. The obligation is generally understood to apply to the producers of such images, and we're not the producers. Obviously, those who actually produce images such as those described by Secs. 2257 and 2257A may have recording obligations, but there is no duty for us to ensure that they do keep such records. MGodwin (talk) 04:10, 14 May 2010 (UTC)
- I propose mentioning here in the policy (and perhaps in the upload instructions, too) that those uploading images they have produced have a duty to keep records per §2257. This might help ensure that the images we have here have appropriate documentation. --JN466 09:54, 14 May 2010 (UTC)
- The EFF cites some specific information at [1]:
- "The regulations imply that secondary producers are limited to those involved in commercial operations. This would seem to limit the recording requirements of secondary producers to material intended for commercial distribution and exclude noncommercial or educational distribution from the regulation. 73 Fed. Reg. at 77,469. "
- "The Attorney General has also stated that the statute is “limited to pornography intended for sale or trade,” 73 Fed. Reg. at 77,456, though the text of the statute does not make this distinction." [the EFF then goes on to say that the opposite interpretation was implied "... in briefing in Free Speech Coalition v. Holder. Defendant’s Reply in Support of Motion to Dismiss at 15., No. 2:09-cv-04607 (E.D. Pa. filed Feb. 22, 2010)."
- Also there is a previous thread where it was suggested that Wikimedia contributors of sexually explicit content are in "no immediate danger": [2]
- While this law seems to have been so poorly written that at every step the interpretation of what it means seems to be at the whim of executive branch regulators, I think we should be very skeptical about any alleged requirements for Wikimedia uploaders. Wnt (talk) 14:08, 3 June 2010 (UTC)
Community standard
Based on some of the conversations above, I added a section "Community standard", in which I say that our community standard is not to prosecute our editors. I believe this is true based on such principles as COM:NOTCENSORED; however, as written those policies are circular, because they don't (can't) defend the right of editors to upload illegal material. So I think it's important to say here that our community standard is to avoid having obscenity prosecutions. Wnt (talk) 17:48, 19 May 2010 (UTC)
- I think the whole section on Community Standard needs to go (or be revised). It says almost nothing useful. The Miller test, in my opinion, is so subjective as to be beyond useless. I prefer the Dost test, as do many courts. - Stillwaterising (talk) 23:26, 19 May 2010 (UTC)
- It seems that's a test of what's lascivious (and even a clothed person being flirtatious can be). I don't see its relevance except for determining whether a picture of a child is porn or not (which is not what was being discussed). --Simonxag (talk) 01:38, 20 May 2010 (UTC)
- I'd hoped I'd been clearer. The objective of the section is not to explain to the reader what the Miller test is. The objective of the section is to say that if local community standards affect whether one of our editors is thrown in prison, then we want our local community standards not to prohibit anything - not in a criminal sense, that is. We want the prosecutors to leave our editors alone. If there's really a great problem with so-called "obscene" pornography, we can settle the issue amongst ourselves perfectly well by our own mild processes of evaluation and deletion. And we do not want our internal processes, used to set the boundaries of an encyclopedia, to be misinterpreted by anyone to be the same as community standards for prosecution. Even if someone spams us with the crudest low-grade porn imaginable and we have to block their account, we still don't want them being jailed for obscenity. It's our problem, a content dispute, not a police matter — or at least, that's how we want it to be.
- I understand that in general, since Miller, few if any communities have actually tried to lay down a community standard for use. Nor is a "Wikipedia community" very likely to be recognized by the courts. But if we have the remotest chance to keep a bunch of goons from locking up one of our editors because some prosecutor singles out his picture from all the other explicit images, or offer him a legal argument that can be used to bargain for a lesser sentence — then by God we should do what we can. Wnt (talk) 02:53, 20 May 2010 (UTC)
- Ugh. I understand what you're trying to do, but I think the section needs to go. The policy we're discussing should be focused specifically on sexual content. Besides, it wouldn't matter either way what our stated policy on "community standards" is - if a prosecutor wants to go after us, they will. In fact, stating that we have a different "community standard" than the norm could be seen as waving a red flag in front of them. Tabercil (talk) 03:31, 20 May 2010 (UTC)
- I have to agree. This section is verbose, opinionated, and aimed at an audience who will never read it. This policy is for Commons users and administrators, not for law enforcement. What we need to emphasize here is the facts: what obscenity law is, why the situation on Commons is so murky, and what users should pay attention to when evaluating images for obscenity. I'll take a stab at this. Dcoetzee (talk) 04:16, 20 May 2010 (UTC)
- I think Wikipedia users deserve better than this. I have at least amended the text you submitted to remove what could be interpreted as voluntary approval of such obscenity standards. I still believe we should make an explicit statement of a community standard favoring freedom. Opposition to censorship has been a fundamental principle of Wikipedia from the day it began. The most censored communities don't hesitate to stand up for their standards — why should we? Wnt (talk) 08:50, 20 May 2010 (UTC)
- Nobody disagrees with obscenity law more strongly than me - I think it unjustly infringes on free speech, should be protected in all cases by the First Amendment, and that Max Hardcore's imprisonment was a grave miscarriage of justice. I likewise disagree with a lot of copyright laws, but we still have to follow them or risk exposing the WMF and content reusers to legal risk. I'm fine with the present wording, as I felt my previous wording was weak anyway. Dcoetzee (talk) 19:02, 20 May 2010 (UTC)
- We should also be mindful that, as pioneers of a new frontier, we are going to get sued by someone for some reason at some point. And when that inevitably happens over our sexual content policy, there's a fair possibility that it's going to make its way to the Supreme Court and become a landmark case with precedent capable of altering the Internet as we know it. Only one thing is for sure: we've got one hell of an important juggling act in our hands right now. — C M B J 10:51, 21 May 2010 (UTC)
- Let's be clear here: as I understand it, none of us wants to see Wikipedia editors prosecuted for "obscene" material. We'd like to have a highly permissive community standard and establish any potential limits of taste by AfD discussions rather than prison sentences. But you're afraid that if we say we'd like a community standard looser than anyone else's, that we'd only increase the risk of actually getting editors prosecuted. Now there's no ambiguity in the other direction - any small-town official can stand up and say he doesn't want pornography in his town and that's a "community standard" that may or may not count in a future court decision. So community standards run only in one direction, toward whatever is most prohibitive, and Wikipedia dare not try to set its own. Is that a valid conclusion to take here? Wnt (talk) 15:57, 23 May 2010 (UTC)
Auto-archiving this page
I think we should set up Miszabot to archive any thread that hasn't been active in 7 days. - Stillwaterising (talk) 23:39, 19 May 2010 (UTC)
- I'd say that 7 d is a bit short - it is annoying if you want to check back on a thread you recently read and already have to search the archive. My proposal is 14 d (and maybe 28 d if things have settled down more), with no hard argument for it. Cheers --Saibo (Δ) 00:16, 20 May 2010 (UTC)
- 14 is ok with me. - Stillwaterising (talk) 01:18, 20 May 2010 (UTC)
- I'd prefer 28. And an indexing bot taking a pass as well, if one can be configured to run here. ++Lar: t/c 05:25, 21 May 2010 (UTC)
- 28 days sounds like a fair proposal to me. Seconded. — C M B J 10:56, 21 May 2010 (UTC)
- I added the code for Miszabot with 28d counter. Could somebody please review and tweak if needed. Thx. - Stillwaterising (talk) 15:20, 21 May 2010 (UTC)
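For reference, a configuration along these lines would look roughly like the following. The parameter names follow the usual MiszaBot convention, but the archive target, counter, and size values shown here are illustrative assumptions rather than the exact code that was added.
{{User:MiszaBot/config
|archive = Commons talk:Sexual content/Archive %(counter)d
|algo = old(28d)
|counter = 1
|maxarchivesize = 150K
|minthreadsleft = 5
|archiveheader = {{Talk archive}}
}}
The old(28d) value is what implements the 28-day threshold agreed above; the remaining parameters only control where archived threads go and how large each archive page may grow.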
Bestiality and other paraphilias
[edit]User:Stillwaterising added this line to "Prohibited content":
- Actual or simulated sexual activity between humans and animals (zoophilia).
He claims this is "illegal in all states." While I can't argue that many (all?) states in the US have laws against the act of bestiality, I don't know of any laws against the depiction of bestiality, whether actual or simulated, and we have an entire category dedicated to illustrations of it, consisting mostly of classical artwork like "Leda and the Swan" (Category:Zoophilia). To say nothing of the many depictions of human-animal sex on television, including among other things a South Park episode that showed a handicapped child being raped by a shark (no, seriously). Dcoetzee (talk) 02:56, 23 May 2010 (UTC)
- Stillwaterising also reverted several small changes with this edit. I object to both the revert itself and the accusatory edit summary. "Paraphilia" is an abstract term which, by its very definition, has the same meaning as the three concrete terms that it replaced. The rationale for substitution cited empirical evidence; because two of the specific examples undermine an illustration and an old artistic depiction already used by Wikipedia articles, a more general term should be used. Moreover, to somehow suggest that such a superfluous literary substitution (e.g., "Dogs" in place of "Border Collie", "German Shepherd", and "Pitbull") could even hypothetically be performed in bad faith is unreasonable. — C M B J 05:09, 23 May 2010 (UTC)
- These sound like questions for Wikimedia's general counsel, Mike Godwin. - Stillwaterising (talk) 05:32, 23 May 2010 (UTC)
- If you're trying to argue artworks are illegal, I'm pretty sure you're wrong. If you're trying to argue about film and photographs, I suspect that, generally speaking, you're right, though there's probably a few exceptions where simulated could slip through as part of a dramatic work, or one of those gross-out comedies. Adam Cuerden (talk) 07:14, 23 May 2010 (UTC)
- Ok, let's separate the act of having sex with animals from photographs of same, and from graphic representations such as artwork/sculpture/computer animations. From what I understand, anime of children with sexual content became *restricted* with the passage of the Walsh Act in 2005. The first person convicted was here, AND this person was also convicted of bestiality, but this individual (understandably) made a plea bargain agreement with the court. Now, zoophilia (bestiality) pornographic images, from what I understand, are considered obscene and often prosecuted under obscenity laws, HOWEVER I cannot yet find any hard proof of that, just unreferenced mentions of such. I would challenge somebody to find a zoophilia pornography commercial website that is based in the United States. However, drawings/written depictions/etc. are covered under the First Amendment. And lastly, the actual act of zoophilia is illegal in some states but not in others. In Florida, where some of WMF's servers are located, there's no specific law. In California, where WMF is headquartered and registered, there is a law which makes the act a crime (misdemeanor). Also, bestiality pornography has been made illegal in the Netherlands, where most of the sites have been hosted, and by this law in the UK, which makes it a crime for a citizen of the UK to possess extreme pornographic images, even if they are living out of the country. Given all of the potential trouble with bestiality images, I recommend we limit to historical images and illustrations. - Stillwaterising (talk) 11:21, 23 May 2010 (UTC)
- I don't know much about the Walsh Act or its implications, but obscenity is already addressed quite thoroughly by the proposed policy, regardless of whether it depicts bestiality or not. In particular, while I rarely encounter actual photographs or film of bestiality, I remain skeptical that publication of such photographs (as opposed to production of them) would be in violation of any law except possibly obscenity law (and then only if it manages to fail the Miller test). Law outside of the US is not a reason for excluding useful content - for example we don't comply with the French Article R645-1 banning the display of Nazi symbols - but may be a reason for adding tags to caution content reusers. Also, I strongly prefer to avoid the term zoophilia, which depending on the speaker may refer only to a passive (unfulfilled) sexual desire for animals. Dcoetzee (talk) 11:33, 23 May 2010 (UTC)
- I agree that hosting of these photographs would not violate any law other than obscenity. My mention of the Adam Walsh Child Protection and Safety Act (which deals with strengthened laws for sex offenders) was inaccurate, as was the link for the case (now changed to this url). The law in question is the PROTECT Act of 2003. BTW, I saw that episode of South Park (Crippled Summer) and I thought it was one of South Park's best episodes. - Stillwaterising (talk) 13:27, 23 May 2010 (UTC)
- Just for information: the distribution (not possession) of porn (! - probably not art) containing zoophilia is also prohibited in Germany (see de:Zoophilie#Rechtliches; according to § 184a Strafgesetzbuch) - a relic of the last millennium. Cheers --Saibo (Δ) 16:41, 23 May 2010 (UTC)
- Possession of bestiality (zoophilia) images is not legal in the UK either (see Commons_talk:Sexual_content#Pornography above and w:Section 63 of the Criminal Justice and Immigration Act 2008). I can't see any reason to allow these images when illustrations are sufficient. - Stillwaterising (talk) 17:37, 24 May 2010 (UTC)
- Policies based on such laws pave the way for a slippery slope. The same rationale could then be used against Tianasquare.jpg, Flag of Nazi Germany (1933-1945).svg, and Jyllands-Posten-pg3-article-in-Sept-30-2005-edition-of-KulturWeekend-entitled-Muhammeds-ansigt.png, which are not legal in China, Germany, and Iran, respectively. — C M B J 21:08, 24 May 2010 (UTC)
- Hmm, must be Godwin's law, but this has nothing to do with political censorship. Looking through Category:Zoophilia I do not see any images that could be considered off-limits, and while I think this collection is sufficient, it could be added to in the future. What I do not think is appropriate is allowing actual images of human-animal intercourse. - Stillwaterising (talk) 21:23, 24 May 2010 (UTC)
- I hardly see how reductio ad Hitlerum relates to contemporary laws in Austria, Brazil, the Czech Republic, France, Germany, Hungary, Poland, and Russia. But even so, the rationale of "X content must be prohibited because it is illegal in X extrajurisdictional place" does have something to do with political censorship, and will be cited as either supportive precedent or hypocritical behavior in such future discussions. Simply put, this is the wrong way to go about prohibiting content, even if it is unequivocally abhorrent to us. — C M B J 23:04, 24 May 2010 (UTC)
- For lack of a US law regarding actual depictions of bestiality, I think the best policy is to do what we feel is most moral. In this case, the question would be whether distributing depictions of actual bestiality promotes or encourages animal abuse (which I think is a difficult question that I don't know the answer to). If we do accept them, we should have a tag indicating the legal danger to people in other countries. It's clear that illustrations are okay. Dcoetzee (talk) 23:22, 24 May 2010 (UTC)
- I find it hard to see very many ways that bestiality photographs and film would improve the encyclopedia in a way that artworks would not do far better. There might be very rare cases where it might be justified - to document a crime or scandal for Wikinews, say, or (theoretically) as part of a collection by a major photographer - but even then, I find it unlikely. Perhaps we should just say something along those lines?
- Our problem here is that Commons needs to be consistent, because that's what protects us against things like the Muhammad controversy. If we will not allow Muslims to censor images of Muhammad, we can't censor things simply for being offensive or shocking to us: We need to try and do it from Commons' core principles, such as scope.
- That said... God, if we ever did host a photo or film of bestiality, we'd better have damn good educational reasons.
- I'm going to wax a bit long here: Fox's attempt to slime us for hosting artworks seems to have mostly blown over, which was pretty predictable, and would likely have happened far faster if it weren't for Jimbo's attempted coverup. I think that, so long as we can point to a compelling educational value in our controversial images, it would be hard to really attack us and make it stick. Indeed, had Jimbo acted sensibly, he'd have made a press release pointing out the educational value of hosting notable works by French Decadent artists, and this would've been more-or-less over with.
- But our defense depends on educational value. If the images don't have a legitimate educational value, if they're only intended to provoke, shock, and push the boundaries, with no artistic, historical, or educational value, then we're in a very bad situation. Competent behaviour by Jimbo would've resulted in a review of the educational value of our images, and I'm sure that we'd have found some which should've been deleted. I haven't really reviewed everything affected: My "dog in this fight" is having seen notable artworks attacked and deleted, including some from my own field of expertise, engravings, which... well, let's put it this way. Gustave Doré's engravings to Dante's Inferno. I only own a very beaten-up copy of an American edition of the work, but the engravings are all in good condition. They're considered some of the best illustrations produced for that work.
- But the Inferno contains nudity and a lot of extreme violence. Doré depicts that.
- Now, imagine my reaction to seeing Jimbo deleting artworks as pornographic, at the same time as advocating for sadomasochistic works to be included in the list, and refusing to talk about where the line's going to be drawn until he's deleted everything - and that immediately after he had seemingly agreed artworks should be protected, and is now deleting artworks.
- That's pretty much how I got involved with this. Artworks I love, which are considered some of the masterworks of engraving, contain some incidental nudity, and some fairly extreme violence, because that's what the book they're illustrating is like.
- Who knew where it was going to end?
- I think that that's why we need to be careful. It's very hard to rebuild a destroyed collection. If we delete, as Jimbo did, Felicien Rops and Franz von Bayros - notable artists with their own articles on English Wikipedia - can we ever get them back? My experience has shown that we don't have a lot of things you'd think we would in engravings, and have very few contributors of new scans of them. Once we decide artworks are pornography based on Jimbo deciding so unilaterally, even if notable, and even if art critics disagree... well! where would it end? Adam Cuerden (talk) 06:51, 25 May 2010 (UTC)
A velvet divorce?
My goodness, there's a lot of words here! If this has already been suggested, please disregard, but this is something that has been in my mind ever since this debate in 2007. This was where Commons hosted (and declined to remove) an unauthorized picture of the Wikipedia mascot Wikipe-tan in a soft-porn kiddie cheesecake pose. This was in my view a hostile act against Wikipedia by Commons. In the course of the debate, the attitude "we are Commons, we don't give a rat's ass about Wikipedia or what happens to it" was generally expressed.
Which, you know, makes sense, I guess. Wikipedia's mission is to make an encyclopedia. Commons's mission is, I gather, to host basically any image or media that it is physically possible to host. These are two very different missions. The Wikipedias have indeed made use of Commons media, and this relationship has been fun and useful, but isn't really critical to the Wikipedias. The various Wikipedias could host their own media, for instance, and while this would perhaps not be as efficient as the current system, it would not be crippling to the Wikipedias. So I put this out as a tentative proposal:
Would it perhaps be best if Wikimedia Commons was (gradually) spun off into an entity entirely separate from the Wikipedias and other entities of the WMF?
If this occurred, Commons would be entirely free to host anything and everything they like without interference from the WMF. I think this would make the core of Commons, its volunteers, quite happy, which is a good thing. At the same time the WMF would no longer have to worry about being responsible what Commons does. And I'll bet this would inspire Commons to advertise its services more broadly to the many non-WMF entities that could use them.
Obviously Commons would have to create its own organizational structure, find its own sources of funding (after of course a transition period), and probably change its name to something like "World Commons" or "Net Commons" or whatever. None of this should be particularly difficult, and I think that if anything Commons would find its (reduced) funding needs easier to fill than the WMF does.
It's extremely probable that the two entities could continue to work well and closely together, to the extent that the current easy protocol for including Commons media in the Wikipedias could be continued. But if not, so be it.
Anyway, just a thought. Herostratus (talk) 17:54, 23 May 2010 (UTC)
- See COM:SCOPE. Commons really isn't a free photo archive to the world. The situation is that it is so difficult to decide what is definitely out of the project's scope, that it often is not done, except for things people tend to complain about. (Which probably is the most workable outcome anyway). In any case, it would be an unfortunate amount of trouble and expense to separate the two projects, when the issues of freedom and censorship are the same for both. Wnt (talk) 01:11, 24 May 2010 (UTC)
- Agreed. At this point, the notion of the WMF hosting Commons is okay as Commons still has heavy overlap with what the various Wikipedia projects mean. If that ever changes, I expect the WMF would be taking action before the bulk of the people involved with Commons would realize there was an issue. Tabercil (talk) 01:25, 24 May 2010 (UTC)
- I think what ultimately ties Commons to Wikipedia is the technical feature that allows Wikipedians to directly transclude Commons images in articles (there is nothing similar for, say, Flickr images). Hosting everything together on the same set of servers is helpful for ensuring, for example, that downtime of one project does not affect the other adversely. Additionally, Commons is in a number of ways tailored to Wikipedia: for example, we're much stricter about copyright issues than almost any other free media repository on the web. Finally, Jimbo's surrendering of privileges makes me hopeful that there won't be a repeat of the image purge, so I don't see a strong reason for disassociation. Dcoetzee (talk) 01:37, 24 May 2010 (UTC)
Proposed Commons:Sexual content/Legal considerations subpage
I would like to propose that a subpage (/Legal considerations) be created and the text of this section moved there. I see the current contents of this section as being more along the lines of suggestions, and they should be called guidelines rather than policy. A link to the subpage can be included on the main page, as well as a brief summary of its contents. - Stillwaterising (talk) 15:27, 26 May 2010 (UTC)
- Done. — C M B J 20:56, 26 May 2010 (UTC)
- I think that any formal guideline would deserve to have at least its own page, rather than a subpage of a policy (which would be confusing). I'm also not sure that anything stated on this page is new policy, rather than interpretation of existing policy (the original Commons:Sexual content containing the Legal Considerations subsection, that is). Can you point to one thing that requires a "policy" status for this document? Wnt (talk) 21:00, 26 May 2010 (UTC)
- Can somebody please cite some kind of documentation on what makes a policy/guideline and how to proceed please? - Stillwaterising (talk) 21:40, 26 May 2010 (UTC)
- The advice to the uploader surely does not need status as a policy. What concerns the community and the admins is media that might be illegal to provide, and that is dealt with in Prohibited content.
- One thing that is useful in the policy or in an official guideline is the recommendation to give information to avoid problems in the future or for certain reusers. I am not sure I am content with how that is dealt with on the page. Anyway it should be clearly separated from info about the uploaders' legal obligations, which may vary depending on local laws, and about which we do not want to give any promises.
- I object to this change. This section contained specific requirements about how obscene material should be handled (e.g. it cannot be speedily deleted, and should be evaluated using the Miller test) which have now been hidden away on a guideline page. This is part of the policy. Dcoetzee (talk) 08:34, 27 May 2010 (UTC)
- Oddly enough, I'm actually not very fond of this idea of saying that obscene material can't be speedily deleted. If you could know that a picture was the one-in-a-billion that would be prosecuted, then speedy deletion might make sense as a protection, and making a policy that it can't be might be seen as one of those things to get Wikimedia in trouble. For some reason you and I were editing that section at about 90 degrees to each other, neither as allies nor true opponents, and we keep missing each other's points somehow. I've made some substantial changes recently, including to the speedy delete section - let's see if we can reach agreement about those points and then we can get back to the disposition of this other section. Wnt (talk) 18:43, 27 May 2010 (UTC)
- We apparently have time for deletion discussions about works that are obvious copyright violations - the penalties for distributing obscenity are (while criminal instead of civil) not really sufficient to justify rapid unilateral action, in my opinion. Dcoetzee (talk) 19:15, 27 May 2010 (UTC)
- But there's COM:SD, which allows for speedy deletions of clear copyright violations. This section is a supplement to COM:SD that evaluates the potential for other legally motivated deletions. Wnt (talk) 17:55, 28 May 2010 (UTC)
"Renamed on sight"
This policy says that files with non-encyclopedic names should be "renamed on sight" without further explanation. I commented that a file needed to be renamed during an undeletion discussion on May 10,[3] proposed a rename on May 12,[4] and it still has the same name, because the rename requests have a huge backlog. Should we direct users to the existing rename request process, and warn them that there's a wait involved, or should we invent a new process (and who will do it?). Wnt (talk) 21:12, 26 May 2010 (UTC)
- An admin can do it very quickly; users, not so much. I'd say encourage admins to rename on sight (reminding them to use CommonsDelinker to move any usages to the new filename); everyone else, use the slow rename process. Adam Cuerden (talk) 22:43, 26 May 2010 (UTC)
- Perhaps the renaming template could be modified to include an optional priority parameter? — C M B J 00:14, 27 May 2010 (UTC)
- One suspects that'd be abused more often than it's used appropriately. Adam Cuerden (talk) 09:01, 27 May 2010 (UTC)
- Possibly yes, but is that a real problem? If some files get renamed unfairly quickly, that is no big deal. The users can be warned and dealt with as in the case of any abuse. On the other hand, as long as the parameter is not abused very much it will get the appropriately prioritised files dealt with much quicker. For some files staying in the backlog for a few years is ok. If priority=low is used as much as priority=high is abused, then there even is no difference for the "normal" renames. A good documentation will help. --LPfi (talk) 09:31, 27 May 2010 (UTC)
- True. If someone wants to write some basic guidelines, I'll gladly add the functionality and announce it. Although perhaps a better plan might be to make a header for Commons:AN that automatically listed information about various admin tasks. Want to try that first? Adam Cuerden (talk) 11:01, 27 May 2010 (UTC)
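To make the proposal concrete: assuming the basic two-parameter form of the rename tag (new filename, then reason), the hypothetical priority parameter discussed above might be invoked roughly like this; neither the parameter nor the example filename exists yet, so this is only a sketch of the idea.
{{rename|Example of sexual content (anatomy).jpg|non-encyclopedic original filename|priority=high}}
Requests without the parameter, or with priority=low, would simply stay in the normal backlog queue.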
- I've started a thread at Commons talk:File renaming#Sexual content, which is an existing guideline that currently lists six accepted reasons for a rename. Sexual content would make the seventh. I've also invited comment on certain aspects of the wording of the guideline there (i.e. is it only new files to be renamed?), and mentioned the priority proposal. Wnt (talk) 15:55, 27 May 2010 (UTC)
- I think this advice should ideally appear in both policies/guidelines with links between the two. This is not just a policy but also a reference of relevant related policy. Dcoetzee (talk) 18:08, 27 May 2010 (UTC)
Media uploaded without consent
The section about content uploaded without the subjects' consent needs improving. One problem is the phrase "will be speedily deleted", which is very unclear. Many people will read that as speedy deletion, while others will not.
I think such content should be speedy deleted on request if there aren't any special considerations. If the file is important we might hope that somebody notices the speedy deletion and asks for undeletion. Vandals asking for deletion of important files can be dealt with by the "special considerations" clause.
On the other hand files that have been on the net for a long time before an editor starts to think that they might have been uploaded without consent should go through the full deletion request process. Otherwise we will have some admin thinking that every anonymous image must go and making a mess.
The other problem with the section is that we have old pictures for which we will not get any statement of consent. Many of those are harmless. Either the subject is dead long ago or the image does not "unreasonably intrude into the subject's private or family life". I rewrote that part but think it still should be improved.
--LPfi (talk) 09:19, 27 May 2010 (UTC)
"Old" works
This part: "Old works and media of historic significance might be kept without the consent of the subjects, in special cases even when the subjects ask for deletion." gives me a bit of the heebie-jeebies. While it's clear from reading Commons:Photographs of identifiable persons that there is a snake's nest of local censorship laws being applied, I think it would be best to leave all of that guideline to itself, and if we were going to pick any particular part of it to repeat on our own, I wouldn't make it the part about keeping sex pictures of an old lady online when she is furiously arguing against it. Wnt (talk) 17:30, 27 May 2010 (UTC)
- I think the main concern here is that if we publish (say) a famous public domain artistic photograph of a nude woman, we shouldn't delete it just because she doesn't want it spread around. Realistically the chances of this occurring are very low and can be dealt with on a case-by-case basis. Dcoetzee (talk) 18:38, 27 May 2010 (UTC)
- I understand this concern, and I agree that such cases exist. After just recently seeing it, I also believe that the current Commons:Photographs of identifiable persons is alarmingly over-restrictive, imposing the laws of three countries and a succession of "moral rights" against the entire database, and if vigorously applied likely could impose far worse censorship (against a much wider variety of things) than any version of this proposal made so far all the way back to the Jimbo ad hoc deletions. But I also don't want to get into a looking-glass dispute where inconsistent guidelines make it easier to justify keeping sexual content than non-sexual content. I want to keep policy and guidelines absolutely modular, with all the rules about a topic in one place, and no running back and forth between policies. I've seen a case of this on Wikipedia between w:WP:BLP and w:WP:Attack page, and it's just infuriating to argue according to one policy that something belongs only to get pointed to another policy that says it doesn't. Wnt (talk) 19:08, 27 May 2010 (UTC)
- I agree that consistency is important - however I think policy should also generally act as a summary of other related policy as it applies to this policy, with links. This is particularly helpful for visitors from local wikis who are not familiar with all our policies. Dcoetzee (talk) 19:12, 27 May 2010 (UTC)
- That's true. But we should always make clear which policy is in the driver's seat. In this case I provided a sort of summary just by listing all the different things they came up with in that other policy, but summarizing it properly would take a tremendous amount of space. And as I said, I wouldn't pick that one bit to be the thing we repeat here. Wnt (talk) 19:19, 27 May 2010 (UTC)
"Pornographic depictions of X"
I've already tinkered recently with the section about categories, removing a hypothetical about "pornographic depictions of Saint Theresa" and sticking with the real example and "Caricatures of Saint Theresa". But I'd like to check whether people would agree to dropping altogether the suggestion to use "Pornographic depictions of X" as a category name.
My concern is that this is a highly contentious and probably personal judgment by an editor regarding the nature of the material, serving also as a direct confession to some that Wikipedia is hosting "pornography" per se.
There may also be some need to figure out what the philosophical goal is here. I think the idea of the "pornographic depictions of X" may have been to warn users of indecent content, whereas the idea of the example given is to avoid confusion with general purpose photos of the subject.
There is a Commons:Categories. Though it is labeled as a help file rather than a guideline, it seems to be where issues of what categories should be made have been handled. I think it may be desirable to get input there about how to handle these issues, since categorization is an art that takes a certain knack to get right. Wnt (talk) 16:13, 27 May 2010 (UTC)
- It's a fair point. I rather quickly rewrote that section basing it on a revised version of a section of Jimbo's, which wasn't that great to start with. Delete anything in there you feel lacks consensus. Adam Cuerden (talk) 17:53, 27 May 2010 (UTC)
- As stated on the Village Pump, I agree that we should not be categorizing anything on the basis of "pornographic depictions of," simply because "pornographic" is a subjective term - we should categorize according to content, not offensiveness. Re your statement "a direct confession that Wikipedia is hosting pornography": we should be (and are) hosting what is generally understood to be pornography, since we have many articles on pornographic topics. Dcoetzee (talk) 18:05, 27 May 2010 (UTC)
- Well, if they're written according to NPOV, they're pornographic according to a source, not according to us. But that's a quibble. ;) In any case I've rewritten and substantially reduced this section. I've also removed the mention of the specific image in favor of the general category, just in case it gets deleted as out of scope and because it may not be technically sexual content as currently defined here. Wnt (talk) 18:50, 27 May 2010 (UTC)
Non-human animals
I just deleted the stuff in the definition about non-human animals. (I'm not speaking of bestiality, which is another issue, but just animal matings) The reason for this is that animals are not covered under any of the three categories of prohibited content. The only point that would apply to them is that you should name the file "Monkeys mating" and not "monkeys screwing", but that is an issue I'd like to see settled under Commons:File renaming, a different guideline, as I described above; and there the proposed seventh category of renamings could have a different scope. Wnt (talk) 17:59, 27 May 2010 (UTC)
- Agreed. In the absence of legal prohibitions, there is no need to exclude content in this area. Issues of description and scope may still apply, as they always do. Dcoetzee (talk) 18:07, 27 May 2010 (UTC)
- Who added non-human animals?
- User:Jmabel at 17:02, 9 May 2010. Dcoetzee (talk) 18:36, 27 May 2010 (UTC)
- Probably a Jimboism. I recall something like that in his draft. Oh, well! =) Adam Cuerden (talk) 18:48, 27 May 2010 (UTC)
- Animals mating should be categorized appropriately, perhaps even mentioned here in some way. There are no issues that I know of and most of it should be within scope. Perhaps COM:NUDE could be applied to help "prune" lower quality images etc. - Stillwaterising (talk) 01:43, 31 May 2010 (UTC)
- I haven't really looked, but 99% of the images I've seen of that type have been things like bees, slugs, and so on. I think that artificial insemination - which involves humans, of course, and particularly the collection phase - is about the only encyclopedic material involving animal sex we'd really need to be careful about categorizing, naming, describing, and so on. Adam Cuerden (talk) 01:57, 31 May 2010 (UTC)
Userpage policy revisited
Despite repeated proposals, Wikipedia consensus has, to the best of my knowledge, thoroughly rejected the idea of prohibiting sexual content on userpages. This subsection should be reconsidered before we put the policy up for a vote. — C M B J 21:03, 26 May 2010 (UTC)
- I agree that this section should stay out. The underlying issue is that Wikipedia has a very strict policy on userpages which (according to previous discussion) is apparently not policy on Commons; in fact I think there is no policy on userpages on Commons. If such a policy exists or is proposed, then this would be a point for that. But the userpage per se contains no images; it contains at most a list of images, and so it is not "Sexual content" as defined in the first section! (See also Commons:Deletion requests/User:Max Rebo Band). Wnt (talk) 21:20, 26 May 2010 (UTC)
- I think what you are arguing is that code which calls for images to be displayed (internal links) is, in some way, protected speech. What the code on the userpage does is call the image to be displayed, which is inserting/displaying images on a userpage, not "text", not "a list" (unless it really is a list instead of a gallery), and not "protected speech". - Stillwaterising (talk) 21:39, 26 May 2010 (UTC)
- I don't understand the relevance of "protected speech" here. Sexual content appearing in a user page gallery is presumably content people have chosen not to delete, so none of it should be anywhere near being illegal. And if it were, the point to address would be the image, not one page in which it is referenced.
- I also don't see your distinction between the list and the gallery. With something like w:WP:POPUPS a list is something like a gallery, as the links expand on mouseover. It's all a matter of software interpretation (a wikitext sketch of the distinction follows this comment). But the pictures shown are only the pictures that Wikipedia serves no matter what document they're in.
- Suppose user page galleries are banned. Then are editors allowed to have one sentence, "See my Wikiporn gallery at my homepage on Encyclopedia Dramatica [link]" - with links back from thumbnails on ED to Wikimedia Commons images? If yes, you've gained nothing but embarrassment; if no, then the policy you're describing is a total ban on links to any "pornographic" site from the userpage. Wnt (talk) 03:36, 27 May 2010 (UTC)
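For readers less familiar with the wikitext being discussed, here is a minimal sketch of the list-versus-gallery distinction (the file name is hypothetical):
 [[File:Example-photo.jpg|thumb]]   <!-- embeds the image so it is displayed directly on the page -->
 [[:File:Example-photo.jpg]]        <!-- leading colon: produces only a text link to the file's description page -->
 <gallery>
 File:Example-photo.jpg|caption
 </gallery>                         <!-- renders the listed files as a grid of thumbnails -->
In all three cases the same hosted file is served; only the presentation on the page differs.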
- I think the problem is getting to disturbing images without warning, like in the case of categories ("People eating"). I think a user should be polite by not upsetting people visiting the user page without a good reason. Sexual content does surely upset a lot of people and I see no good reason to have such a gallery directly on the user page. I see no problem in having the gallery on a subpage, with a link from the user page properly stating its nature.
- There are other categories of upsetting content that might be put on user pages, so a policy about user pages would be good to have, but I think that sexual content on user pages can be handled here until there is such a general policy.
- If someone's going to be upset, won't they be upset on the first image? If someone goes through the whole gallery before getting upset, I find myself skeptical of his aversions. Also, I'm not sure why a subpage is different; nor does the proposed wording say that subpages are OK. Wnt (talk) 16:17, 27 May 2010 (UTC)
- Yes. I do not think it is polite to have even one sexually explicit image directly on the user page. User:Max Rebo Band has only one "tasteful nude" visible (with my settings), which I find borderline. Nudity is not covered by this policy, but you might get my point.
- I think you should be able to click on the signature of any user or otherwise visit their page without having to be upset by shocking pictures. Even if you are a child! (Or from a culture where sex is taboo.) Subpages with suitable names and with suitable links to them are not a problem, as you can choose to avoid them, without harm to your ability to communicate and act on Commons on questions not related to such content.
- The userpage wording should be changed. Allowing excessive sexual content in some cases does not feel right.
I've added a little bit of the discussions above to this. Feel free to delete it (but, if you revert, note that the possessive of "Commons" is "Commons'", not "Common's": We are not Wikimedia Common; Commons is used here in the sense of communal property, which anyone may use to their ends.) I think it's reasonable to suggest good practices, so long as we don't force them on people. Adam Cuerden (talk) 18:44, 27 May 2010 (UTC)
- Since it doesn't sound like I'm getting consensus on this, I've taken a stab at writing up what I think represents the current status quo. Wnt (talk) 21:44, 27 May 2010 (UTC)
- I've removed it, since I'm afraid, for practical purposes, it pretty much boiled down to "Feel free to edit war over it". If we can't come to an agreement on it, we may as well save that debate for a later day, so that we can move forwards with the rest of the policy. Once it's approved, we can discuss user pages as long as it takes, or make a guideline about them. Adam Cuerden (talk) 21:52, 27 May 2010 (UTC)
- I agree with Adam on this. The userpage debate is probably best left as a separate effort. TheDJ (talk) 21:57, 27 May 2010 (UTC)
- I have no disagreement with Adam's recent edits. I will say though that I think that is the status quo, ugly as it is. We have an authentic state of nature here, and interestingly enough so far it has been virtually all John Locke and no Thomas Hobbes. If discord develops, it may lead to a social contract and civil society —— for better or for worse. Wnt (talk) 22:14, 27 May 2010 (UTC)
- While I agree that is the status quo, I think that pointing it out probably won't help the rest of the policy to pass, as it may be seen as an encouragement for such a state to continue. Adam Cuerden (talk) 22:23, 27 May 2010 (UTC)
- I also agree with your points. Simplifying the text will help the discussion. --SJ+ 12:06, 28 May 2010 (UTC)
- Alright, I owe some people here an apology, because there is a fairly long-standing policy on user pages—it's just buried in a subpage of the project scope (Commons:Project scope/Pages, galleries and categories) which somehow I missed before (though I don't think I'm the only one). It says that non-allowable content on userpages includes "Private image or other file collections of no wider educational value". Needless to say I'm not fond of this policy but it does exist. Even so, I think its relevance here is peripheral because it still isn't sexual content itself, only presentation of such; and I think that a user page gallery could fairly easily gain educational value with appropriate commentary. Wnt (talk) 05:44, 30 May 2010 (UTC)
Archiving
I'm going to archive from "In photo description" up. We don't need all the Jimbo incident stuff anymore, and nothing above that's active, as far as I can tell. Just pull any threads you still wanted back out of the archive =) Adam Cuerden (talk) 20:47, 27 May 2010 (UTC)
- Hmmm.. Actually, we agreed on 28d and a bot will archive the page: [5] --Saibo (Δ) 21:45, 27 May 2010 (UTC)
- Well, it was getting excessively long, and nothing from the Jimbo period is really useful to us now. We can use the bot going forwards, though note we really should archive the talk page when we put up the sitenotice to get this made policy. Adam Cuerden (talk) 21:47, 27 May 2010 (UTC)
- The problem with any manual (human) archiving is that there is always the possibility of the human introducing bias into the conversation by archiving selective threads. Some of the threads archived were last modified almost 3 weeks ago, some just 7 days ago and still relevant. I proposed originally that Miszabot be set to 14 days, and I think the manual archiving should be reverted; while the "too long" argument is valid, perhaps a simple tweak or just the passage of time will solve this amicably. - Stillwaterising (talk) 14:40, 28 May 2010 (UTC)
- Well, I tried to avoid bias by archiving everything above a fixed point, but if there are any particularly relevant threads, just pull them back out. Adam Cuerden (talk) 02:02, 31 May 2010 (UTC)
I'm setting the algo to 10d. This is a very involved discussion and this page is too long. Also, the voting page should be a subpage so this discussion can continue. - Stillwaterising (talk) 11:18, 6 June 2010 (UTC)
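For reference, the bot configuration being discussed would look roughly like the following (a sketch assuming MiszaBot-style parameters; the archive target, counter and size values are illustrative only):
 {{User:MiszaBot/config
 |algo            = old(10d)
 |archive         = Commons talk:Sexual content/Archive %(counter)d
 |counter         = 4
 |maxarchivesize  = 150K
 |minthreadsleft  = 5
 }}
Changing the number inside old(...) is the only edit needed to move between the 28-day, 14-day and 10-day retention periods mentioned above.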
Moving forward on adoption
Hi all, I think the proposal as it stands today has stabilized. I think it's about time to solicit wider feedback and possibly to begin a straw poll on whether to adopt this as policy. Would anyone like to voice any remaining concerns before we do so? Dcoetzee (talk) 19:06, 25 May 2010 (UTC)
- The "and we may be compelled to remove some works on this basis." line should probably be rephrased, so as to prevent a policy-justified repeat of what just happened. — C M B J 21:01, 25 May 2010 (UTC)
- I rephrased to emphasize that speedy deletion for obscenity is not permitted - I think the Miller test is too complex for a single administrator to evaluate, and unlike issues of privacy or child pornography, it poses no immediate threat to anyone. Dcoetzee (talk) 21:08, 25 May 2010 (UTC)
- I'm happy to put a notice in the sitenotice, when it's agreed we're ready. Adam Cuerden (talk) 22:52, 25 May 2010 (UTC)
- This looks stable now. Perhaps it would be useful to invite the active contributors to this page over the past few weeks to take another look first. --SJ+ 12:11, 28 May 2010 (UTC)
- I think there are still two major issues that need to be resolved first:
- The text describes a proposed policy. Before people can vote on it, they need to read it as it would actually be written, i.e. not talking about proposals. Also there's considerable question in my mind whether this is a new policy, or a new guideline, or (in my view) a "supplement" of the type described in w:Template:Supplement.
- I still have considerable objections to how my section on our community standard has been rewritten by Dcoetzee. My original intent was to make clear that anyone is free to join the Wikipedia community and comment on the content we keep; that whatever boundaries we choose to draw or not to draw can be done by us by civilized discussion; that none of us want editors prosecuted for what they contribute; and thus that our deletion processes do not represent a "community standard" that someone should be imprisoned for breaking. Every time Dcoetzee messes with the text it sounds like it's saying that Wikipedia will cheerfully accept whatever "community standard" of restrictions the courts throw at it; that we will use our deletion discussions as an opportunity to mock-convict editors of uploading obscene material and render evidence that a community finds it offensive and without redeeming features; and that we have nothing to say about the thought of someone prosecuting our people. I don't think that is the way people here feel at all. I think even those who would approve of censorship by us to make Wikipedia somehow "family friendly" still would recognize that we don't need external prosecutors singling people out for prison time. Local discussion and ordinary deletion is the only effective way to enforce any line in the sand on content, and it is a way that doesn't have anywhere near such a drastic harmful effect on freedom of expression or the willingness of contributors to participate. To give the opposite impression solely out of cowardice - out of the fear that if we say we like freedom someone is going to stomp our people down - that is not the philosophy that Martin Luther King, Jr. taught us. Wnt (talk) 23:48, 25 May 2010 (UTC)
- I think it is best to state that this is a policy and not a supplement - we need this to have some teeth to it in order to help prevent a recurrence of the mess in the first place. As for the Wikipedia community standard versus a court-imposed standard, I'm sorry but the courts will win that battle every time. And do you know what happens to the losers? It's called "jail time". And I'm in agreement with Adam Cuerden in that I'm satisfied with this; the main points which I wanted to see in this policy got put in place a while back so my concerns are met. Tabercil (talk) 03:24, 26 May 2010 (UTC)
- I tried to make it clear that any kind of deletion due to obscenity should be a rare event that is conducted only for works that are actually illegal to distribute, in the jurisdiction of the servers. Nor is it intended to "single out contributors for prison time", but rather to isolate media that may be illegal for WMF to distribute, so we can, you know, cease distributing it. Lots of people upload copyvios but we're not trying to report them and get them arrested either. It's the same deal here. I'll try to reword this to make this clearer. Dcoetzee (talk) 06:13, 26 May 2010 (UTC)
- I'm glad this topic has come up because I do believe this proposed policy/guideline is nearly ready to go. My understanding was that this proposal is to become policy. I support an announcement at MediaWiki:Sitenotice as well as at the Village Pump.
- From what I understand, our responsibility for reporting illegal sexually explicit content is outlined in 18 USC 2258A "Reporting requirements of electronic communication service providers and remote computing service providers" and is limited to apparent violations of child pornography laws. This is not limited to actual images, but under 18 USC 1466A includes "a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting" that depicts a minor engaging in sexually explicit conduct and is obscene. There is an exception for images that aren't "obscene" and have "serious literary, artistic, political, or scientific value." All of this involves legal nuances that are way over my head and are best interpreted by Wikimedia's legal team. I have asked Mike Godwin to review this proposal and he has indicated that he would prefer that such a request be made through consensus decision rather than from a lone editor. - Stillwaterising (talk) 14:33, 26 May 2010 (UTC)
- Who is this "our" you are talking about? I'm neither an "electronic communication service provider" nor a "remote computing service provider", and I doubt anyone else here is, either. As such, my legal responsibility is exactly nil. --Carnildo (talk) 21:44, 26 May 2010 (UTC)
- Legal responsibility for reporting lies with WMF not the user. - Stillwaterising (talk) 03:03, 27 May 2010 (UTC)
- To follow up, I've radically changed much of the first two paragraphs to make it read more like a policy/guideline/supplement than a proposal to create one. I also transwikied (with some loss and breakage) the Template:Supplement from en.wikipedia, allowing this to be labeled as a supplement if so agreed. Wnt (talk) 21:25, 27 May 2010 (UTC)
- Please don't go down the "History of Jimbo's deletion" route for the opening: If we do that, and particularly with the defense of some of the actions, this is going to get rejected before anyone even reads it. This policy has been completely rewritten since those actions. Let's draw a line, and separate ourselves. Adam Cuerden (talk) 22:11, 27 May 2010 (UTC)
- Above is Template:Supplement. My thoughts were that this needs to be policy because it conflicts with existing guidelines. I had also expected support from the Board of Trustees and general counsel. As a guideline, it will likely be watered down, rejected, and/or ignored. As a supplement (essay) it will almost certainly be ignored and the effort spent on this wasted. - Stillwaterising (talk) 00:40, 28 May 2010 (UTC)
- Which part(s) of this document do you view as being in conflict with existing guidelines? — C M B J 00:58, 28 May 2010 (UTC)
- I wouldn't worry about it not being enforced... after all, it was enforced before it was written. And I've seen essays enforced on Wikipedia, whether that makes sense or not. Besides, the main purpose of everything that has gotten consensus has not been to make new policy, but simply to give the users the tools they need to enforce the existing policies. The reason why Jimbo was able to make so many deletions that stuck was that the policies were essentially ignored before this (and there are still quite a few files of every variety but sexual content that still are out of compliance with them). Wnt (talk) 02:33, 28 May 2010 (UTC)
- I personally strongly support adopting this as a full policy - not in order to support deletion where it is warranted, but in order to "outlaw" unilateral deletion of the sort we saw during the purge. It's nice when I'm admonishing someone for abusing the concept of scope to have a clear policy to point them to. Dcoetzee (talk) 04:32, 28 May 2010 (UTC)
- Full policy would be best. TheDJ (talk) 11:00, 28 May 2010 (UTC)
- Since Jimbo Wales came out of that round of speedy deletions with reduced privileges, do you really need to make this a policy just to add a little "oomph"?
- The reason I don't want this to be a full policy is that Commons policies should be designed to let us work productively on anything. That's why I keep trying to split this up - if there's a renaming problem we need a better rename; if there's a userpage problem we should consider whether we need a userpage policy; if the issue is with various weird rights on photos like "personality rights" we need one place to consider that. But once an issue with a Wiki feature or a file type is identified, a good policy should handle it wisely throughout the site. We should not have one policy for sexual content, another for nudity, another for pictures of war atrocities, another for pictures of animals being maltreated etcetera. You could end up with hundreds of policies, and with every type of content being regulated differently there would be all sorts of bureaucratic restrictions and squabbles over what can be shown for what kind of thing. You'd constantly be finding out that three months ago some little discussion led to the prohibition of some kind of photo or other. Wnt (talk) 14:57, 28 May 2010 (UTC)
- First of all, sexual content is unique in that there are a number of legal requirements that apply specifically to sexual content and not other kinds of content, like child pornography and obscenity law and 2257. If someone says "delete this because it doesn't satisfy 2257 record keeping requirements," how are we to indicate that our official stance is that we have no such requirement? That is why I move for the "legal considerations" subsection to be moved back into the main page and for this page to be adopted as official policy. Dcoetzee (talk) 16:33, 28 May 2010 (UTC)
- I don't think there was much consensus to move that section, and I don't object if you move it back. But there's still a lot about it that is hugely confusing (see below). Wnt (talk) 18:27, 28 May 2010 (UTC)
- As I understand it, policies are things that almost never need exceptions, and guidelines things that very rarely need exceptions (not that that's very strictly enforced - en:WP:CIVIL is a policy, when it's really more of a guideline, due to politicking). Which is this? Adam Cuerden (talk) 17:16, 28 May 2010 (UTC)
- I've been away from this discussion for ten days, and, coming back, like what I see. Good work by you guys. --JN466 16:10, 29 May 2010 (UTC)
Should legality be figured by users or by office actions?
I just went through a whole bunch of Commons policies trying to figure out which one concerned censorship laws, so that I could indicate that the section about child pornography etc. was a supplement to that. But I don't actually see such a section. So I have to wonder: is the status quo to leave law to the lawyers, i.e. to call up the WMF directly if you think you see something illegal? Because when I read the "legal issues" subsection as it's been worked over, complete with participants at a deletion discussion trying to work a Miller test, I have to wonder if it is really viable to have random Wikipedians figuring out legalities. There's a whole separate mechanism of w:WP:Office actions that may be better suited to the task, if needed. Should we be trying to be amateur lawyers? Wnt (talk) 18:25, 28 May 2010 (UTC)
- I started a related discussion at the Village Pump regarding a potential Commons:Office actions. [6] We need to know for sure what policies we have before we make new ones. Wnt (talk) 18:52, 28 May 2010 (UTC)
- We routinely put on our amateur lawyer hats to evaluate copyright, which is a great deal more complex than US obscenity law - we can and have asked Mike to comment on interesting ambiguous cases, but he doesn't have the time or resources to look at every case. Child pornography is a special case because of its particular urgency and severe penalties, but I don't see any particular reason why any other legal concerns can't be addressed by the community. Besides all that, the main point of asking for a Miller analysis is to make the person nominating an image for deletion for "omg obscenity" to stop and think about whether it's really obscenity (most likely, it isn't). Dcoetzee (talk) 21:14, 28 May 2010 (UTC)
- Well, I've created COM:OFFICE and put it into the category of Commons policies, based on the text of the linked policy which says it applies to all Wikimedia projects and a handful of comments at the Village Pump. Though it's tenuous at this point, I think we can start revising the sections about legal matters to recognize that WMF does handle these issues. I'm not saying we might not still end up with some "abundance of caution" processes in here, but we should reassure readers that there's a responsible adult somewhere in the building. Wnt (talk) 21:38, 28 May 2010 (UTC)
Speedy or no speedy
As it is written now, it says that images that are out of scope are "prohibited". I think that is the wrong way of saying it, because "prohibited" suggests that it is illegal. It is much worse if a user uploads child porn than if a user uploads images that are out of scope. Scope can be discussed - child porn can not. Therefore I suggest that illegal content is one headline and scope is another one.
Because scope can be discussed, I suggest that the rule is a DR, with speedy deletion as an exception that can only be used in clear cases. History has shown that sexual content can really make users disagree, and therefore it is better to discuss unless it is very clear. No need to risk a new "war" and desysopping of more admins because we allow speedy deletions related to scope. In 6 months or a year, when things have been tested by time, it will perhaps be time to allow speedy deletion.
Depending on the number of images we could perhaps find a way to make sure that images that might be illegal do not end up together with the other DR's and stay on Commons for months before someone finds and closes the DR. --MGA73 (talk) 19:44, 28 May 2010 (UTC)
- To me the word "prohibited" does not imply "illegal." Speedy deletion of uncontroversial out of scope images is routine and essential - it would be a substantial change to standing practice to prohibit this. Dcoetzee (talk) 21:10, 28 May 2010 (UTC)
- COM:SD and Commons:Deletion_policy#Speedy_deletion does not list scope as a valid speedy deletion reason. If you read Commons:Deletion_policy#Regular_deletion you will find this "The file is not realistically useful for an educational purpose." So our policy is very clear. I agree that some admins speedy delete without following our policy but that does not mean that it is ok. If we want scope to be a speedy then we should change policy to match that. --MGA73 (talk) 10:42, 29 May 2010 (UTC)
- To me it looks like scope is listed as a reason in Commons:Deletion policy, since April 2008.[7] Wnt (talk) 17:40, 29 May 2010 (UTC)
- Yes as a Regular deletion. A regular deletion means that you have to nominate the file for deletion and then other users can comment and after 7 (?) days some admin decides. :-) --MGA73 (talk) 20:59, 29 May 2010 (UTC)
- Hmmm... it looks like you're right. I was referring to Commons:Deletion_policy#Speedy_deletion (search for the next instance of "scope"). This is under the speedy deletion section, under the title "A page may be deleted if...", but follows the title "A file may be deleted if...". So apparently files can't be speedily deleted for being out of scope. Wnt (talk) 17:32, 30 May 2010 (UTC)
Let's have some "votes" to indicate opinion on speedy vs normal DR
As you can see above, our policy says that "scope" is not a valid reason to speedy. However, some admins speedy penis-images etc. In the proposed Commons:Sexual content it is suggested that scope IS a valid reason to speedy. I'm not fond of the idea, since history has shown us that scope can be disputed: some admins think we should nuke such images, and later they are restored. Jimbo Wales was even desysopped because of this. Therefore I prefer a DR to avoid new conflicts. Other users have argued that speedy deletions can be ok, for example to avoid too many low res penis-images filling up Commons. I therefore suggest we have some "votes" to indicate opinion on this question:
Should we have an exception to sexual content images that allows speedy deletions?
Support = yes we can speedy and Oppose = no we should follow the general rule to start a DR. --MGA73 (talk) 09:59, 4 June 2010 (UTC)
Deletions because of request to OTRS
I think that it should be made clear what happens after an OTRS deletion request has been received. When can an image be speedily deleted and when do we need a DR? Example: If a girl is photographed on the beach without a top, should we speedy or start a DR? In my opinion OTRS volunteers are not judges, so the decision should be made in a DR. If the subject is underage then it should be speedied - if it is porn. --MGA73 (talk) 19:44, 28 May 2010 (UTC)
- I think the OTRS guys should discuss it among themselves and decide. Think of the ex-boyfriend/ex-girlfriend scenario (someone uploading sexually explicit or nude content featuring a former partner). If the person asks for the image to be taken down, and instead you start a public DR, this increases the harm, and moreover breaks the promise that any such mails will be treated in confidence. --JN466 16:27, 29 May 2010 (UTC)
- COM:OFFICE now redirects toward a policy that has always affected Wikimedia Commons, allowing for a removal by action from Wikimedia. I don't know the details of who within OTRS can initiate removals and whether these always count as office actions according to that policy. Wnt (talk) 17:42, 29 May 2010 (UTC)
- If an image is a copyvio or if it is illegal then there is no reason for a DR. But in other cases a discussion might be good. Private information like name or mail address or whatever should of course not be made public. But it should be possible to start a DR with a message like this: "Image is taken in a public place. The person in the image requested a deletion per <link to OTRS-ticket>."
- As for the office actions I expect them to be rather limited. Jimbo was desysopped for his recent actions - I take that as a clear signal that you should follow policy even if you work for Wikimedia. --MGA73 (talk) 20:54, 29 May 2010 (UTC)
- You are forgetting that the openly available media file itself will already represent an invasion of privacy: so in that sense it is too late to withhold private information. Now, posting at DR that the person (who by definition will be depicted in the nude, or engaged in a sex act, without having wanted the image of themselves to be published) has asked for the image to be deleted, and putting their request up for a public discussion that may remain open for months or weeks before it is closed, would be a Kafkaesque experience for someone who really, really wishes the media file to be gone. It's just not good manners to insist on keeping or debating it when we have no shortage of similar media. --JN466 21:46, 29 May 2010 (UTC)
- I gave one example of images that should be speedied (underage) and one example where a speedy is perhaps not a good idea (images taken in a public place). You say "invasion of privacy" and "we have no shortage of similar media". You can't know that without looking at the image that someone wants deleted. Not all "nude images" are taken in private. Most people have seen this image http://news.bbc.co.uk/olmedia/1235000/images/_1235905_vietnam_ap_150.jpg of the naked girl fleeing a napalm attack in Vietnam. You can't say we have many similar images, and since it is taken in a public place it is not illegal. I agree that we could delete an image like that "to be nice", but should one single OTRS user decide? What I was asking for was some guidelines for when it is ok to speedy and when we should start a DR. If it is a clear case then a DR should not take months. --MGA73 (talk) 22:54, 29 May 2010 (UTC)
- Perhaps we're talking at cross purposes. I am talking about this section. Are you? --JN466 02:14, 30 May 2010 (UTC)
- I'm a bit confused. The policy says that people can contact OTRS to have their case handled confidentially. Now either someone at OTRS goes through an office action, which is a preexisting policy, or they go through Wikimedia Commons procedures, which are only slightly modified and elaborated in the section on speedy deletions. I don't see anything in the policy that creates a third class of procedures for OTRS besides these. For seriously sensitive material only an office action would really be effective — for example, I think we would be foolish to believe that none of several hundred admins with access to deleted material would ever release a naked surreptitious photo of a celebrity, say, if it were sought by paparazzi willing to pay money. Wnt (talk) 03:06, 30 May 2010 (UTC)
- Yes, we are talking about the same section. If someone sends a mail to OTRS saying "This is me having sex and I'm only 17" then I would not call a deletion an "Office action", but never mind what we call it. As I said, underage porn should be speedied, and the example someone gave above of nude images taken somewhere private should also be speedied.
- But let me try to make it more clear: If Madonna is topless on a public beach and does not like the images to be on Commons, then she can start a DR or send a mail to OTRS. Why should the DR be handled differently if a mail is sent to OTRS than if she starts a DR? Or if someone starts a nude protest against something and runs naked around in the streets, why should one single participant be able to say "Hey that is me to the right. You can see one of my tits - please delete the image" and then one single OTRS user decide if the image should go? --MGA73 (talk) 14:39, 30 May 2010 (UTC)
- I understand. We discussed earlier what sort of situations this should apply to. We are primarily thinking of nude or sexually explicit photos taken in a private context. Private settings include nudist beaches, photos taken in someone's home, photos taken at a private club, or photos taken in a private setting in nature, where no other onlookers were present. There was agreement that public nude parades (nude cycling etc.) were not private. Nudity at a pop concert would, to me, be a borderline case. I've tweaked the section to make it clear that we are talking about media files originating from a private, non-public context: [8]. Does this work for you? --JN466 15:53, 30 May 2010 (UTC)
- Looking good :-) If some other examples are found in the future then we can always update the policy! --MGA73 (talk) 15:56, 30 May 2010 (UTC)
- Great! --JN466 20:24, 30 May 2010 (UTC)
Is it going a wee bit further to state that if an image is taken in a private place (loosely as defined by Jayen above) we should, in fact, require an assertion of consent from participants rather than simply offer to remove it if we're contacted? I think that would be better. Whether or not it's in the purview of this page / policy, I believe it would be best to remove images like File:Barceloneta_Girl.jpg without express subject permission. Thoughts? Privatemusings (talk) 23:26, 30 May 2010 (UTC)
- This policy is about sexually explicit content, as we've defined it. The topless and skimpy bathing suit beach photo given as an example is not within this definition, so we should not be making any new policy about it here. (In other words, a person who reads to the end of the first section and sees that it doesn't apply to a given file should be able to stop there) There are some fairly elaborate restrictions already in place in Commons:Photographs of identifiable people, and any proposals to change that policy should be done there. Wnt (talk) 04:02, 31 May 2010 (UTC)
- I'd also like to hear some more feedback about this from someone who is involved in OTRS. As far as I understand, OTRS is simply an open source system for dealing with customer emails, so when we say "OTRS" we simply mean "incoming Wikimedia e-mail". Earlier I added the Wikimedia "foundation:Contact us" page to the section about child pornography, which tells people to call, write, or e-mail info@wikimedia.org. Now the section about photos without permission points at commons-permissions@wikimedia.org. But COM:OTRS makes it sound like this is mostly if not all about copyright issues. What address will get this specific type of complaint to the person who evaluates it most quickly? I assume that all the emails about a certain project and issue are evaluated by the same group of people in the end by a single set of criteria? Wnt (talk) 04:17, 31 May 2010 (UTC)
- I agree further input from OTRS volunteers might be useful. I believe they already handle such cases anyway; George William Herbert gave an example here. In a way, requiring a consent statement for any nude or sexual media recorded in a private context is a separate issue from encouraging people who find nude or sexual images of themselves in Commons to write in. If we were to ask uploaders to provide consent statements, that could be addressed in a separate section. Currently, we are only recommending it, in the "Other legal considerations" section. --JN466 15:57, 31 May 2010 (UTC)
- Note: these issues were continued below under "Several inconsistent recommendations for evaluating whether a Commons upload is child porn". I've changed this section as explained there. Wnt (talk) 22:44, 5 June 2010 (UTC)
Finish up, please
Per the above, I'm quite happy to set up the site notice, but we're going to need to archive this discussion page first, in order to set up the poll, statement, and so on to determine approval.
I don't think there have been any full-scale policy discussions in the, what is it, 5 years I've been on here? So here's what I propose:
1. We set up a poll to determine acceptance.
2. 2/3rds majority is needed to pass.
3. The poll is preceded by a statement advocating for why the policy is needed.
4. We will need to watch the page, and try to deal with the editing that happens during the poll. Try and keep all good changes, but revert any changes that are really drastic, since we need to have a somewhat consistent page for people to vote on.
So, let's take 48 hours, finish up the above discussions, make any final changes, write the statement we'll use to explain our rationale, and get ready. I think we've done an excellent job in turning this into something viable, consistent with other policies and community standards, and have come up with a way to deal with the real issues without compromising our principles. Further, we turned one of the worst disputes I've ever seen on Commons into a collegial discussion where we were all able to work together to get something we could all pretty much agree with. Excellent work, everyone. Adam Cuerden (talk) 16:28, 29 May 2010 (UTC)
Statement
[add proposed statement here - I'm off to dinner - Adam Cuerden (talk) 16:28, 29 May 2010 (UTC)]
- Comment Yes, it has taken a long time. But it is only 3 weeks since Jimbo suggested a new "rule". There are still many comments and edits, so I think we should wait a week or two before we vote. --MGA73 (talk) 20:57, 29 May 2010 (UTC)
- Okay, try for June 6th? Adam Cuerden (talk)
Other language communities on Commons
Please canvass and ask for input from the other language communities on Commons before going live with this; some communities will have different concerns over content and censorship than those expressed by the Anglo-Saxon members of our community. --KTo288 (talk) 14:00, 31 May 2010 (UTC)
concept of a standard - is anything, in fact, excluded under this policy?
In reference to this bit: 'The material is realistically useful for an educational purpose, such as diagrams, illustrations, photographs, high quality images of body parts, or noteworthy sexual acts....' - could someone explain if, in fact, any material is likely to be excluded?
We had a bit of a chat upthread about whether or not videos from 'xhamster' (free porn site) would be a good fit for commons - and 'we' (as a project) are generally rather inconsistent about whether or not things like 'creampie' shots are 'realistically educational' (i.e. see here for a deletion, and here for a 'keep') - it's my view that commons would be improved by tightening standards to exclude the latter image (File:IndoMiMiPussyCream-1.jpg) - in fact, I'd like to see a rather stronger rationale being required generally. Thoughts? Privatemusings (talk) 23:37, 30 May 2010 (UTC)
I think people missed the point of the list, and began shoving in things that were occasionally useful for an educational purpose, in with things that are almost always useful. Seriously, that statement, as written, said photographs were always useful for an educational purpose. The statement now reads:
- The material is realistically useful for an educational purpose, even if it was not created for that purpose, such as File:Masturbating hand.jpg which is used to illustrate the English Wikipedia article on female masturbation. Some categories of material which are generally useful for an educational purpose include: diagrams, illustrations,[1] high quality images of body parts,[2] illustrations of the various styles of erotic art, and medical photographs of diseases.
Adam Cuerden (talk) 04:32, 31 May 2010 (UTC)
- ↑ Many illustrations have been found useful in projects, often in spite of lack of technical quality. A variety of illustrations can allow choice where photographs may be undesirable.
- ↑ These may be labelled, but, as we serve hundreds of languages, we will always need some unlabelled images to make new labeled diagrams.
- To start with, the "...PussyCream..." image would be improved by compliance with the "Descriptions" section here. We should (at least) rephrase this name, as well as things like Category:Cum dripping, into encyclopedic language. I have a hard time seriously arguing that an image like this would be helpful in, say, evaluating the role of posture in female fertility — but someone in the world might imagine a use that I haven't. Wnt (talk) 04:45, 31 May 2010 (UTC)
I've said this before, but let me remind everyone: The criteria listed in that section are the minimum criteria for it to even be considered for Commons - if it doesn't fulfil one of them, it can be speedied. Fulfilling one of them does not mean that it can't go through a DR, if it's borderline.
Do we need to make that explicit? Adam Cuerden (talk) 05:35, 31 May 2010 (UTC)
- Let me remind everyone that COM:SPEEDY is very clear; works that are out of scope should not be speedily deleted; they need to go through a deletion request.--Prosfilaes (talk) 21:48, 31 May 2010 (UTC)
- According to Prosfilaes' logic, suspected images of child pornography should not be speedily deleted either because this reason is "very clearly" not listed in the policy. When it comes to conflicts between guidelines and policies, policy should prevail. Perhaps the logical answer is to include "potentially illegal" as a reason for speedy deletion. - Stillwaterising (talk) 22:03, 31 May 2010 (UTC)
- Under my logic, when a policy, like COM:SPEEDY, says that something should go through a deletion request, it should. COM:SPEEDY says nothing about child pornography. Normally I would say that "suspected images of child pornography" should be speedily deleted, but given your history of wildly accusing images of being child porn with no evidence, it leads me to think that our Puritan members would use that as an excuse to delete perfectly good images. The reason why COM:SPEEDY lists "out of scope" as a reason for a DR, not a speedy deletion, is because such deletions are frequently controversial, and I suspect the whole reason you want these to be speedied is because that will allow admins deleting them to slip under the radar; you don't want to work with consensus.--Prosfilaes (talk) 23:01, 31 May 2010 (UTC)
- This is the main reason why I was pressing to make this a "supplement" rather than a full policy or guideline; I'd like to see the sections of this document matched up with the particular policies they would modify, and then there should be back-and-forth discussion between the two until consistency is reached. I don't think that editors should have to look for rules about speedy deletion outside of the COM:SPEEDY policy; either the text should be moved there, or "authorized" there with this document serving as a named supplement to that one.
Obscenity Guidelines Need Work
I have started a subpage/essay called Commons:Sexual content/Obscenity; however, our current guideline, particularly the part "Because the Miller test is a complex, subjective test, speedy deletion for obscenity is not permitted. If works are nominated for deletion as a precaution against possible distribution of criminal obscenity, the deletion rationale should contain a Miller analysis. The deletion of content for any reason does not indicate any community consensus that the content actually is or should be considered obscene.", needs immediate revision.
Wikimedia is very unlikely ever to be charged with obscenity by the federal government, because the "work as a whole" (all 6+ million images) would need to be considered obscene. Prosecution at the state level is more likely. Obscenity is generally only prosecuted for "extreme sex acts" such as fisting (and other acts likely to cause bodily harm), excretory functions, necrophilia, and bestiality.
Yes, there are community standards that we can apply: our own. We have every right to set our own Commons Community Standards of Obscenity. I think the burden of proof should be on the uploader to establish, through appropriate titles, detailed descriptions, and the talk page (at least 20+ words), what the image shows and what potential literary, artistic, political or scientific value it may have. For example, a picture of a dog ejaculating into a person's mouth with the title "doggy suck" would arguably lack serious literary, artistic, political or scientific value. The same goes for other images that are intended to have shock value, such as many of the images found at http://www.rotten.com/.
We should set some basic standards for speedy deletion, and put the burden of proof on the uploader, not the nominator. A "detailed analysis of the Miller Test" should not be required. - Stillwaterising (talk) 15:26, 31 May 2010 (UTC)
- No, we should not set some basic standards for speedy deletion. The DR system is good. It identifies controversial deletions, and does a reasonably good job of making editors feel that they're working on a community project. These images you want speedy deleted are extremely rare and are very likely to be controversial deletions. Speedy deletion is for common images that overload the DR system and are deleted for uncontroversial reasons. DRs are a huge part of the way we set community standards.
- I get the impression you want speedy deletion so people can't argue against deletion, to force your opinion on others. That's not the way it's designed to work.--Prosfilaes (talk) 23:11, 31 May 2010 (UTC)
- still's 'doggy suck' image isn't a bad illustration (although it's a pretty nasty image ;-) - would you agree that such an image should certainly be speedied, prof? Privatemusings (talk) 23:32, 31 May 2010 (UTC)
- There's a basic problem with DR as applied to images that may violate obscenity laws, and that is that the images will remain visible to the public while the discussion is underway, which can be for several weeks. Meanwhile, thousands of people, including school children and the media, can view the image, as well as the deletion discussion. I propose that images that cleanly fit into the category of "prosecutable obscenity" be speedily deleted and then oversighted. Questionable images should be speedily deleted and then brought up for deletion review. I further propose that a new group of users (with a name such as Reviewers) be created that has the ability to view deleted images. This could be a work-around until some sort of Flagged Revisions system can be devised. All of this is based on precautionary principles, not any personal dislike for said images. - Stillwaterising (talk) 00:44, 1 June 2010 (UTC)
- That's a great job at censorship. Speedy delete it, so nobody can disagree with you, then oversight it, so really no one can disagree with you, and even if they did, the image can't be retrieved. Even the images you consider questionable you speedy delete first. We need to be able to see an image to discuss it. If I trusted my fellow editors to act reasonably, I might find your proposal more reasonable, but you've played at being a modern-day Comstock, and I don't want to give admins with your predilections the excuse to speedy delete images because the woman doesn't have DD breasts and therefore must be a child, or because you think she might be having sexual thoughts. You want to W:WP:STEAMroll, instead of having a respect for process. If you want precautionary principles, why don't you work on the fact that we probably have more copyright-infringing material than nudity, be the latter good, bad or indifferent?--Prosfilaes (talk) 01:22, 1 June 2010 (UTC)
- Wikipedia serves many hugely important purposes, and none of us has suggested that they should deliberately expose themselves to a damaging prosecution, however unjust the obscenity laws are. But there is a way for Wikipedia to protect itself legally, and it involves getting a real legal opinion and actually deleting material — not holding uninformed discussions and doing "speedy deletions" that leave the content available to hundreds of people with admin rights. That's what these "office actions" have been about — and they've been a matter of policy from the start, because Wikipedia's risks of being damaged by prosecution have always been there. Now there's also some room for community discussion, because the community here always has the say in policies... still, practically speaking, any large brouhaha over a deletion is just going to get the same Wikimedia lawyer(s) to come out and comment publicly about the discussion in the end, so it really doesn't matter all that much. Wnt (talk) 01:57, 1 June 2010 (UTC)
- There is more risk of Wikipedia getting destroyed by an asteroid than there is of Wikipedia getting prosecuted for obscenity. I bet there hasn't been an obscenity prosecution in the US in the last decade that didn't have a real charge, like child porn, fraud or tax evasion, attached to it, with obscenity being tossed in to give the prosecutor more room to plea bargain. If there were a wealth of prosecutably obscene pictures hitting Commons I might feel concern, but the entire "pornography" of Commons is about the content of one pornographic magazine sold in every gas station in America.--Prosfilaes (talk) 02:32, 1 June 2010 (UTC)
- (edit conflict) The old method of oversighting, using the oversight extension, was irreversible and could not be used to remove images. In the early part of 2009, the RevisionDelete function (also called suppression) was released, which IS reversible. All potentially illegal media (meaning jail time, not just copyvios) should first be speedily deleted and then referred to an Oversighter, OTRS, Mike Godwin, or whatever the staff dictates. As volunteers, we should not be making legal decisions on behalf of the foundation nor exposing ourselves to potential liabilities. - Stillwaterising (talk) 03:26, 1 June 2010 (UTC)
- Jail time? For non-CP obscenity, but not copyright violation? On what planet do you live? Not mine. To quote from a DoJ propaganda piece[9]: "the 38 adult obscenity convictions adjudicated from 2001-04 involved large and small scale Internet-Web operations, mail-order fulfillments, and common carrier shipments, as well as the operator of several “adult” hard-core pornographic obscenity book + video stores". I'm quite sure that there have been more than 10 criminal copyright convictions a year, and usually at the same levels: large-scale commercial infringement, not single files at Wikimedia Commons. If we shouldn't be making legal decisions, then I would point out that deciding whether a photo is legally obscene is pretty darn hard.--Prosfilaes (talk) 03:56, 1 June 2010 (UTC)
- Federal obscenity trials are occurring every few months. The latest trial, of John Stagliano (AKA Buttman), is set to start July 7th. There are several press releases about this; one of them is here. His indictment is here. Stagliano started a blog, http://www.defendourporn.org/, concerning the trial. Some notable obscenity cases are Max Hardcore (tried in Tampa, Florida, no less than 20 miles from where our servers are located, and sentenced to 3 years and 10 months on 10 federal counts), Seymore Butts (pleaded guilty to a lesser charge on state charges), and JM Productions (acquitted, although a distributor was convicted of one federal count in Phoenix). - Stillwaterising (talk) 04:11, 1 June 2010 (UTC)
- So you're claiming that federal obscenity trials are slowing down from when they were ten a year to one every few months? That's good to know.--Prosfilaes (talk) 05:00, 1 June 2010 (UTC)
- Also, arguments that amount to “we can get away with it” ... run counter to Commons’ aims. from COM:PRP - Stillwaterising (talk) 04:17, 1 June 2010 (UTC)
- I'm not saying that we can get away with it, I'm saying that our actions should not be disproportional for obscene material and copyright violations. We can respond in a reasoned and studied manner, not speedy deleting on sight.--Prosfilaes (talk) 05:00, 1 June 2010 (UTC)
- Also, the idea that any member of the Foundation (WMF) is going to receive criminal charges for copyright infringement is way off base. WMF is paranoid about copyright infringement yet oblivious to obscenity and child pornography laws. The Online Copyright Infringement Liability Limitation Act protects WMF from liability (lawsuits) as long as they "take down" any material identified as copyright infringement. WMF's greatest legal liability is pertaining to suspected child pornography and compliance with related regulations. Knowingly hosting obscene images could also cause major harm to WMF's reputation, and possible prosecution in the state of Florida or California. - Stillwaterising (talk) 04:43, 1 June 2010 (UTC)
- The idea that any member of the Foundation is going to receive criminal charges for obscenity is way off base. No one plans to knowingly host obscene images. It just so happens that the line for obscene is very high, and very few images Commons has ever hosted could even arguably be considered obscene. The issue to me is that I don't want all our images of parrots deleted and oversighted because an admin agrees with you that they're obscene. We act carefully and proportionally, in the same way we'd act for a presumed copyright violation.--Prosfilaes (talk) 05:00, 1 June 2010 (UTC)
- Here's my logic: speedy deletion, being unilateral, takes only a single person's opinion into account, and that person is generally not a lawyer. It should only be used for cases that either place a person at serious risk, place WMF at extreme legal risk, or occur so frequently that deletion requests would be overwhelmed by the sheer volume of requests. The penalties for distribution of obscenity do not justify rapid unilateral action of any sort, particularly in light of how subjective they are to evaluate, and they arise extremely rarely. I will strongly oppose any speedy deletion criteria for criminally obscene content. Dcoetzee (talk) 10:39, 1 June 2010 (UTC)
- An underlying question I have here is: does ruling material obscene make it not obscene? For example, we have a mention of Max Hardcore videos in the draft. Now the way I see it, the moment that some court rules that the videos are obscene, the obvious response of people is to question what it is about them that is so special, and to dispute the laws on a political basis. So the way I see it, no sooner is the material ruled obscene than it gains legitimate political significance. I would further suggest this is true even before the ruling is made on it.
- Even aside from the political question, one can imagine other uses for "obscene" material that were not envisioned in the original case. For example, since the Max Hardcore videos are supposed to be so repulsive, it might be interesting to screen them for some chimps or bonobos to see whether the apes have their own instinctive sense of revulsion. How many volts of electric shock will one tolerate, to avoid watching such a thing...? Wnt (talk) 15:36, 1 June 2010 (UTC)
- Now the importance of this academic point is that if an obscenity trial puts things on an Official RC Banned List and they are obscene evermore for any purpose, then that creates the potential that Wikimedia will be looking for some knee-jerk protection to flee from trammeling orcs. But if we can't say with any certainty that the content will be ruled obscene under other circumstances, then the situation is different. Wnt (talk) 15:36, 1 June 2010 (UTC)
- You've compellingly illustrated yet another reason that obscenity law is stupid. I hope within our lifetimes to see it repealed. At the moment, we should at least be clear about what impact, if any, it should have on Commons content and deletion discussions. I think the answer is, editors who want to go into a legal panic about works being obscene, whether admins or not, should be encouraged to think about it in terms of the Miller test and legitimate social value - and prohibited from engaging in speedy deletion and stifling discussion. Dcoetzee (talk) 15:50, 1 June 2010 (UTC)
- Though I appreciate the sentiment, that isn't actually an answer to the question... Wnt (talk) 23:18, 1 June 2010 (UTC)
- I neglected to do this before, but for future reference I should dispute two of Stillwaterising's initial statements. First, I don't think there's any chance that anyone (federal or state) would consider Wikimedia Commons as a "work as a whole". Reasons include: different pages have different licensing terms; authors aren't liable for pages they didn't upload; Wikimedia claims ECPA service provider status; the pages used to require attribution of all authorship history under the GFDL, but only for the one page; no one reads it all, even the authors; a page might be a copy of a page elsewhere, taken as a standalone work. While there was some appeal in maintaining the pretense that it could be otherwise, that kind of "security through obscurity" is only asking for trouble.
- Second, Commons standards on what content should be maintained are not the same as standards about who should be sent to jail. "Community standards" in the obscenity sense describe when a clerk sells an agent a magazine in a brown wrapper and is busted and hauled off to jail. "Community standards" in the sense used at the start of this section describe when the owner gets fed up with complaints about some trashy magazines he keeps under the counter and has the clerk throw some of them out because they aren't selling. See the difference? Wnt (talk) 21:53, 2 June 2010 (UTC)
poor examples?
[edit]I wonder if some of the examples given are actually helpful? - For example, if I filmed my own personal version of wilde's salome - not only is that material likely to be illegal, but I sincerely doubt that it would be of value to commons. The text, as written, would seem to suggest that it would be welcomed. Further - I don't really agree that the use of a photo of a female masturbating is unambiguously educational - I actually feel that that's a matter for discussion, and I don't think it's at all clear that that image, even in context, is, in fact, a good fit either for commons, or wikipedia. In regard to this policy, perhaps examples which don't really illustrate their intended points would be better removed? - I'm minded to do so pending discussion..... cheers, Privatemusings (talk) 23:30, 31 May 2010 (UTC)
- I'm not opposed to such a change - the examples are more a matter of previous discussions than any shining precedent. I'll try cutting this out and see if it sticks. Wnt (talk) 01:32, 1 June 2010 (UTC)
- Which is exactly why I oppose the speedy deletion rules, given that you think it's not at all clear that this image, which is in use on 27 Wikipedias, is a good fit for Wikipedia. You're wrong; a Wikipedia is, with some rules, what its editors want it to be, and the editors of many Wikipedias have said that they want their Wikipedia to be one that includes that picture.--Prosfilaes (talk) 02:01, 1 June 2010 (UTC)
- Note: although I don't think we need to keep the example, I don't think that a presentation of Salome would be illegal — not unless they actually kill one of the thespians for the performance (and even then it wouldn't be illegal to copy and distribute). There's no law against playing dead. It was mostly for this reason that I disliked the example. Wnt (talk) 02:07, 1 June 2010 (UTC)
- I don't see how Salome would be illegal either. It's out of copyright, and, unless you kill the thespian to get the head, instead of using a simulated head, and simulated death of Salome... Adam Cuerden (talk) 07:51, 1 June 2010 (UTC)
- As far as I know there are no laws against distributing depictions of murder - we don't really want footage of battles, wars, and assassinations (e.g. the Kennedy assassination video) to be suppressed. Dcoetzee (talk) 15:53, 1 June 2010 (UTC)
- Well, yes, but it'd be illegal for him to produce, and we'd have to have him arrested. Speculation aside, I don't see how a good-quality performance of a major play by a major playwright wouldn't be educational and of extremely high value to commons. Adam Cuerden (talk) 16:47, 1 June 2010 (UTC)
xhamster revisited
[edit]I've enjoyed xhamster so much, that I've now signed up, and can send messages to the various folk who upload over there. So far the only person I've heard back from is the uploader of a BDSM Group Sex video which can be seen here. Assuming permission could be granted for release under a free license - I'm curious as to the consensus here as to whether or not it would be realistically educational in commons' terms? (it certainly illustrates the genre rather well, what with the fetish gear 'n all, no?) Privatemusings (talk) 23:55, 31 May 2010 (UTC)
- I'm going to say "no". A quick review of the opposition to WikiPorn proposal 2004 would show several good reasons why (more) pornography should not be allowed. - Stillwaterising (talk) 00:15, 1 June 2010 (UTC)
- You are apparently trying to come up with "extreme" examples of sex acts in order to polarize opinions of people in the policy discussion by way of emotional involvement - please be upfront about your intentions. I watched several minutes of the video you cited and think it would be quite useful for inclusion on Commons, especially if it were a higher resolution version from which we could extract individual frames and/or shorter video segments for illustrative purposes. It's more a depiction of the porn genre than of realistic BDSM acts, but it still serves a legitimate educational purpose. Dcoetzee (talk) 10:53, 1 June 2010 (UTC)
- I hope there's not really too much emotional involvement in the linked clip, dcoet ;-) - It's not my intention at all to polarise, in fact I'm much happier heading in the opposite direction - trying to find commons ground etc. It is however vital that we can really see where everyone is coming from - personally speaking I'd like to try and work out a standard to apply to commons that doesn't include the linked clip - because I don't really believe that, on balance, it's useful for commons to host. You disagree, and to be honest, I think our current discussion / consensus model seems to support your position (to my mind, in some ways this exposes limitations of the model rather than any particularly significant weight to that position) - if the uploader at xhamster confirms that they're ok with a free license, I'll send you their info / permission etc. and you can upload the vid. if you'd like to. Cheers, Privatemusings (talk) 00:31, 2 June 2010 (UTC)
- The clip mentioned might have some educational potential somewhere in it, but it's a stretch. The main issue that comes to mind is that it is a tremendously long, repetitive video. Maybe someone will argue differently, but I don't perceive it as an entity that needs to be preserved intact to maintain its artistic integrity — is there anything notable about the author? Because if there's no particular merit to keeping it intact, then someone could upload a small portion of it (or even a still image) that demonstrates whatever educational point is asserted, and it could then be argued that the longer clip is unneeded. In any case, the video's disposition would go to a standard deletion discussion with or without this policy, and it would be up to the uploader and detractors to make their cases. Wnt (talk) 05:02, 2 June 2010 (UTC)
- I don't think the video itself is useful for illustrating articles, in light of its length - it's primarily useful as a source for creating derivative works. For the same reason we would not include a major motion picture in an article, even if it was public domain, but I still think it would be good to host it on Commons. Dcoetzee (talk) 18:11, 2 June 2010 (UTC)
- Can you think of a good parallel case? For example the fanfic film w:Star Wars: Revelations is freely served online,[10] but I'm not sure they'd go to the extent of making it free for Commons to serve for any purpose. We don't have it currently. I think if we did have it and people started watching a 47-minute feature film a few thousand times a week, we'd be in more discussions, but as it isn't actually here I don't know that for sure. My first impression (could be wrong) is that Commons shouldn't house a large bandwidth-hogging film unless at least one Wikipedia actually thinks it's worth having an article about the film in its own right, and even then we might end up rationing bandwidth (for example, if a commercial film is debuted based on those characters and downloading surges). We don't have any articles about the sexual example cited above as far as I know - I don't even remember if it had a name. Wnt (talk) 19:24, 2 June 2010 (UTC)
- Bottom line: the ability to make derivative works is not an educational use. The ability to make derivative works of a notable film or artwork is an educational use. Wnt (talk) 19:26, 2 June 2010 (UTC)
- That said, documenting your work can be a useful goal. For example, it's standard practice with restorations to upload the original scan. Adam Cuerden (talk) 19:51, 2 June 2010 (UTC)
- Suffice to say, the issue of uploading large videos is an entirely separate issue from the content of those videos. I think we can all agree that high resolution still frames, or well chosen short clips, from this video could be useful for us. Dcoetzee (talk) 23:58, 2 June 2010 (UTC)
< turned out the vid. was a copyvio from 'Private Fetish' - I'm now in touch with them about releasing a high def version! We'll see how it goes.... Privatemusings (talk) 02:27, 17 June 2010 (UTC)
Jay Walsh claims all responsibility on Jimbo's behalf
[edit]http://blog.wikimedia.org/2010/clarifying-recent-coverage-of-wikipedia/
“Jimmy is actively engaged in discussions with other Wikimedia editors about sexually-explicit materials on Wikimedia Commons”
No, he's not. If he was, I suspect a sizeable chunk of us wouldn't be here. He's completely lost the trust of the Commons community by actively lying and misleading us as to his intents and motivations.
“and although the discussions over the past week have been unusually intense, we don’t consider them problematic. Discussion is how Wikimedians work through policy development and policy interpretation: active argument and debate are normal for us — they are how we do our work. The Wikimedia Foundation is grateful for Jimmy’s involvement, and we’re glad he continues to be an important part of the Wikimedia movement.”
Actually, Jimbo was highly disruptive, edit-warred to force artworks to be included in the list of material to delete, threatened people, and no progress was or could be made until he left completely so volunteers could step in.
Adam Cuerden (talk) 06:33, 1 June 2010 (UTC)
- Meh, it's PR. As much as Jimbo's actions were destructive and stupid, he does still need to be the "face of Wikipedia" to the public. Let them do damage control if they want, we know what happened. Dcoetzee (talk) 10:55, 1 June 2010 (UTC)
- Agreed. Let them say what's needed to the public. Let's face it - us admins and editors are pretty much the black gang that keep Wikipedia up and running. Tabercil (talk) 12:12, 1 June 2010 (UTC)
- I've been opposed to Jimbo Wales' particular policy, but I'm not prepared to condemn his actions outright. After all these discussions it has become clear that Commons policies were being widely ignored, leading to a glut of amateur porn content, and (while there is a certain "-m" that I don't understand about that stats.grok.se server, and it's not impossible or even that unlikely that the stats were influenced by connivance of some anti-porn group) I think the degree to which this off-topic material was draining Wikipedia resources was not entirely negligible. Jimbo Wales broke some policies to force the issue and kick-start more deliberate processes. That's his prerogative as Founder and it is not a bad thing per se, provided that the community uses its own prerogative to discuss the issues and preserve all useful content. Wnt (talk) 15:44, 1 June 2010 (UTC)
- Well, AFAIK WP:POINT is some kind of blockable offence for an ordinary user, but some pigs are more equal than others... Kameraad Pjotr 21:05, 1 June 2010 (UTC)
- As I recall it, around sixty images ended up being deleted. That's one thousandth of a percent of Commons files, or put another way, 10 parts per million. I can hardly see that as a glut. Not only that, in my experience as a DR participant, these images were given no more leeway than any other images; the images of the girl in the shower were treated the same way as the images of the girl in the park with the squirrel, for example.--Prosfilaes (talk) 23:00, 1 June 2010 (UTC)
- (edit conflict) We had a situation where Wikipedia was getting beaten up by a hostile news network for material that never belonged on the server in the first place according to our own policies. Jimbo's efforts to fix that were made in good faith. Though he made a few errors about what to delete, I know too well that the ordinary deletion process also makes errors quite often. This was a Western where the marshal comes to town and starts throwing the gunslingers out, and people want to cite him for police brutality. But who would have been ready to hand out such a ticket before he arrived? Both Wales' action and your indignant response have place and merit in a balanced and rational rule by consensus. Wnt (talk) 23:03, 1 June 2010 (UTC)
"A few errors?" Wasn't something like two-thirds of what he deleted undeleted? Adam Cuerden (talk) 01:31, 2 June 2010 (UTC)
- 90% maybe. -mattbuck (Talk) 02:14, 2 June 2010 (UTC)
- It seems to me to have been a surprisingly inept action, but one undoubtedly taken in good faith. The main problems seem to have been (1) he panicked and went off half-cocked without building any consensus and (2) in the process of doing so, he invoked his authority as founder and as a Board member. The latter is what I think alienated the several admins and others who quit over this. - Jmabel ! talk 04:47, 2 June 2010 (UTC)
- According to the hand-made deletion log [11] there is a lot more red than blue, i.e. more than 50% of the deletions stuck. Of course, since we're arguing the policy, that doesn't prove anything. But I think that if you take a small random sample of Jimbo's deletions and look at them, you'll find quite a few that were not really important in educational or artistic terms. Wnt (talk) 05:12, 2 June 2010 (UTC)
- Using that log, and by my count, Jimbo deleted 76, and 39 were undeleted, hence 51%. Which is far too high. Perhaps I shouldn't have brought this up - it's a bit off-topic - it just annoys me to see people claim that Jimbo's actions were entirely for the good. Adam Cuerden (talk) 06:39, 2 June 2010 (UTC)
- If you take a small random sample of Commons works, "you'll find quite a few that [are] not really important in educational or artistic terms." Equal standards.--Prosfilaes (talk) 12:49, 2 June 2010 (UTC)
- Doing a find and replace on the HTML source I count 290 red to 149 blue, or 66.06% that have remained deleted; though I suspect (for example) some of the "Hoden"/"Saline infusion" series should still be undeleted (a proper circus isn't complete without the freak show).
- I've agreed from the beginning that Jimbo Wales' campaign had major flaws,[12] and I'm not saying that his actions were "entirely for the good"; the point is, he took rapid action in response to a campaign against Wikipedia's financial survival, after the rest of us — the people here now, and more so the people still not here — failed to do our part to help prevent Wikimedia bandwidth and reputation from being wasted on completely non-educational files in a way that wasn't really sustainable.
- These projects have a wide streak of anarchy, but we have to remember that anarchy's greatest vulnerability tends to be an excessive conservatism that makes it difficult to respond to changing circumstances; and the remedy for this requires great respect for founders and innovators who have proved their trustworthiness by their works. Wnt (talk) 14:19, 2 June 2010 (UTC)
mini presentation
[edit]I started work on (another) mini video presentation to try and encourage further discussion about some of these issues. I did a voiceover, but I thought it sounded rubbish - does anyone have any interest in working on a voiceover together? - Or you can just use the video, and create your own voiceover if you'd prefer. I think a video presentation is a fun and engaging way of discussing the issues. If you're over the age of 18, you can see the presentation here. Privatemusings (talk) 04:36, 2 June 2010 (UTC)
- This video presentation wastes a lot of space, taking still pictures and slowly panning/expanding them on the screen. (There's no artistic value added here) Also, it doesn't say where the soundtrack music comes from - does it have a Commons-compatible license? I think it's a fair candidate for deletion as a redundant image or as not being educational. (Put it this way - if a Commons gallery listing so-called "porn" images is excluded by the subpage of commons:scope, then the video is surely more egregious). Someone tell me a reason not to propose this for deletion. Wnt (talk) 08:20, 3 June 2010 (UTC)
- The music source (commons) was listed as a file - I've just made that a bit clearer (I agree it was kinda hidden) - and I've added the rationale 'Uploaded to seek collaboration for voiceover script to highlight issues surrounding sexually explicit material on WMF projects' - which I'd hope was clear from the above. The pan and zoom thing is just what my free program (picasa) does for me - I'll see if I can work out how to do it smarter (or could you lend a hand?) - is there any irony in a video intended to raise awareness of issues relating to sexually explicit material with a view to tightening up policy / practice / standards being nominated for deletion ;-) ? cheers, Privatemusings (talk) 08:32, 3 June 2010 (UTC)
- Sorry, I've still nominated it for deletion, because (as I explain [13]) it is in every way inferior to a user page gallery for discussion of these issues, and clearly falls into a category of deletable material.
- In general, I think this discussion tactic has much in common with things like the Proxmire w:Golden Fleece Awards (named after a Senator who publicly humiliated scientists for studying the sex life of the tsetse fly, even though sleeping sickness carried by the flies is 100% lethal and sexual control e.g. by mating with irradiated flies was a valid way to save thousands or even millions of lives). At least here the demagoguery of the tactic is meant to kill only files, but it still isn't fair to round up a handful of examples that will seem strange to those outside the project, without considering each one on its individual merits. Wnt (talk) 14:45, 3 June 2010 (UTC)
Convention on Cybercrime international treaty
[edit]I was reading some conversation on copyright law and noticed mention of the Berne Convention and the Digital Millennium Copyright Act, and wondered if such a treaty existed regarding child pornography. Such a treaty does: it's called the Convention on Cybercrime, and it was adopted in 2001. It was ratified by the United States in 2006 and took effect in 2007. It has also been signed, ratified, and put into effect in most European states (list here). The relevant section is below, and the full text can be found here.
Article 9 – Offences related to child pornography
1. Each Party shall adopt such legislative and other measures as may be necessary to establish as criminal offences under its domestic law, when committed intentionally and without right, the following conduct:
- a. producing child pornography for the purpose of its distribution through a computer system;
- b. offering or making available child pornography through a computer system;
- c. distributing or transmitting child pornography through a computer system;
- d. procuring child pornography through a computer system for oneself or for another person;
- e. possessing child pornography in a computer system or on a computer-data storage medium.
2. For the purpose of paragraph 1 above, the term "child pornography" shall include pornographic material that visually depicts:
- a. a minor engaged in sexually explicit conduct;
- b. a person appearing to be a minor engaged in sexually explicit conduct;
- c. realistic images representing a minor engaged in sexually explicit conduct.
3. For the purpose of paragraph 2 above, the term "minor" shall include all persons under 18 years of age. A Party may, however, require a lower age-limit, which shall be not less than 16 years.
4. Each Party may reserve the right not to apply, in whole or in part, paragraphs 1, sub-paragraphs d. and e, and 2, sub-paragraphs b. and c.
Please note that all English speaking countries that have not ratified this treaty do have comparable laws in effect. - Stillwaterising (talk) 22:30, 2 June 2010 (UTC)
- Note however that Ashcroft v. Free Speech Coalition protected so-called "virtual child pornography" unless it could be ruled obscene by a Miller test. Putting a domestic law into a treaty does not allow the President and Senate to circumvent the Constitution. Wnt (talk) 00:49, 3 June 2010 (UTC)
- That would be section 4, they may choose not to apply section 2.c.--Prosfilaes (talk) 01:38, 3 June 2010 (UTC)
- Actually, according to Legal status of cartoon pornography depicting minors, that ruling has been superseded by 18 U.S.C. 1466A "which specifically criminalizes the possession and distribution of virtual and comic images of apparent children if they are engaged in sexually explicit conduct that is "obscene" under the Miller standard, as directly suggested in the Ashcroft ruling." - Stillwaterising (talk) 02:28, 3 June 2010 (UTC)
- So what? How do laws about child porn have anything to do with the issues under discussion? It has rarely shown up on Wikimedia Commons, and it has been speedily deleted when it has. I have no concern that the Wikimedia Foundation will be treated any differently from YouTube or Flickr if a picture of a 16-year-old who claims to be 18 appears; if the FBI gets notified, we'll be told to take it down now, and we will. It's a red herring.--Prosfilaes (talk) 01:38, 3 June 2010 (UTC)
- The thing we're arguing about isn't child porn, but certain types of drawings or photos of people other than children. At the far extreme you have the more totalitarian countries, like Australia, prosecuting people for distributing cartoons of Homer Simpson characters and banning magazines from showing the breasts of women in their late 20s, if they're small breasts, because it "tends to promote pedophilia". (If the photos are that bad, I don't know what they do with the women, who are surely more damaging to the psyche than their mere images. Maybe brand them across the face and make them wear dog poo for a necklace? I don't think that just publicly decreeing that small-breasted women aren't women really objectifies them enough to satisfy the censors, do you? Well, maybe that's for next year. But I digress...)
- Anyway, the U.S. congress tried to "repair" the damage the Supreme Court did by passing an obscenity law against drawings of apparent children having sex, but the point is, it's still subject to the same test as any other kind of obscenity. So for all practical purposes the various weird Japanese cartoons are apparently still legal in the U.S. Wnt (talk) 08:07, 3 June 2010 (UTC)
- I'm with Wnt here - the law in question regards only material that is legally obscene, which is of course already illegal, and already described in this policy. They only increased penalties. Dcoetzee (talk) 15:09, 3 June 2010 (UTC)
Beware: rules shouldn't lead to the deletion of historical depictions of sexuality!
[edit]It would be an auto-da-fé to delete old illustrations like these with such a pretext. Following the law, why not. But following it blindly and deleting artistic history, no! --TwoWings * to talk or not to talk... 07:08, 3 June 2010 (UTC)
- The bottom line is that if it's illegal it's illegal, and the WMF is going to run into trouble, no matter how idiotic we may feel the law may be. But we should leave such legal decisions to real lawyers at the WMF and proper deletions by office actions and "suppression" or "oversight", because it is clear that many of us (myself included) are not exactly ready for Harvard. Wnt (talk) 08:12, 3 June 2010 (UTC)
- that category and those illustrations trouble me greatly to be honest - though I'm not qualified to say whether or not they're illegal or legal. The legislation I'm vaguely aware of (though getting better informed) would seem to me to indicate that some of that material may well be illegal - I think we'd agree that if that were so, the material should be immediately, and permanently, removed. Privatemusings (talk) 08:13, 3 June 2010 (UTC)
- ps. Would some sort of system for referring inevitable discussions about legality through the appropriate channels be worth establishing / communicating. As I mentioned, I really do wonder if some of those images (the erotic activities involving children) might be illegal (they certainly seem to me to be so in the UK and Australia) - a note about how I (or a future third party about a future matter) go about that might be a good idea. Privatemusings (talk) 08:18, 3 June 2010 (UTC)
- In a way I don't understand why it would be illegal. Murdering is illegal, filming a murder is illegal, drawing a murder is not. Why would it be different with such a subject? It's even more stupid that some of these drawings are depictions of consensual underage sex (including underage couples). Sexuality exists between underage people, and I don't think it is actually illegal for them to have such practices (at least not in France, but I wouldn't be surprised if it were illegal in the US...) --TwoWings * to talk or not to talk... 06:23, 4 June 2010 (UTC)
- Well, the draft guideline still gives two different e-mail addresses to contact Wikimedia with questions. But I don't think this is a questionable case — according to the Supreme Court these old drawings could only be criminalized as obscenity, subject to a Miller test, and they have both artistic and historical significance. Of course, censorship isn't always enforced rationally... Wnt (talk) 08:25, 3 June 2010 (UTC)
- Three of those images depict very young children engaged in sexual acts - one of which is a young girl giving a blow job and one that shows a boy peeing which, although not offensive to me and certainly doesn't raise any excitement in me, could excite a person of paedophilic nature. It is a pretty subjective problem when it comes to underage children with underage children. In the strictest definition of some of the laws from various countries, ALL 15-year-olds would be paedophiles if they fantasise about sex with those of the opposite, or even the same, sex who are in the same age group as themselves.
- IMO the files La Grande Danse macabre des vifs - 29.jpg, Trilogie érotique 02.jpg and La Grande Danse macabre des vifs - 33.jpg are unnecessary and show paedophilic behaviour, and as such should not be present, as they could very easily be shown as such in a court of law. The first does not show the age of the male participant, but the other two clearly show that one participant is clearly old enough to be classed as a paedophile in law. 02:15, 3 July 2010 (UTC)
- "...could excite a person of paedophilic nature..." From what I can tell, a pedophile could get excited over a picture of fully dressed schoolchildren, and certainly over the old Coppertone ad with the girl and the dog. If we have to worry about what could excite a sufficiently perverse person, we are going to end up with the standards of Saudi Arabia... and then some burqa fetishist will come along and we'll be reduced to calligraphy so as not to excite him... - Jmabel ! talk 19:04, 3 July 2010 (UTC)
- I know I can get excited by pictures of fully clothed people, so this does seem a bit wide-ranging to me. -mattbuck (Talk) 19:16, 3 July 2010 (UTC)
Lol - nice: completely miss the point and write about it out of context, as it was in regard to "an underage girl getting a blow job" and "a boy peeing", and I then go on to clearly say that they both could excite a paedophile. I also go on to say that sex between children is a "subjective problem" (as depicted in others of the set).
However, the fact that you have not bothered to address the paedophilic nature of those pics in the last part of my post - "young girl giving a blow job" with "show paedophilic behaviour" - shows that you might be missing the point. If you still haven't quite woken your brains up, that means that the pictures mentioned in the first and second half of my post actually depict paedophilia in action, and as two clearly show that one party in the image is mature, I really think that an image showing people engaged in paedophilia is much worse than anyone who might get turned on by a boy peeing.
As for the other comments I think that most religions erect towers to worship their gods and they would even say that the scribe is "writing in a lewd manner" if he was using pointy or rounded characters :¬)
Chaosdruid (talk) 14:40, 11 July 2010 (UTC)
Overall structure
[edit]While various sections of this have been polished up, we still have trouble with the overall structure and logic. Right now, we lump illegal material with other policies like the so-called "moral rights" in Commons:Photographs of identifiable people. Then we say that various good material shouldn't be speedy deleted, unless it's illegal; then we define obscenity in a section about laws of the U.S. and other countries; then we say that obscenity shouldn't be speedy deleted. While a few of us may be able to follow this logic, more or less, I pity the poor person who would read this as a policy and try to apply it. I think some kind of overall reorganization is needed, and it's not going to be easy. Wnt (talk) 15:31, 3 June 2010 (UTC)
- Yeah I'll give you that. However I'm uncertain about what a better organization would be. Maybe what we want is to have a section for each major type of restricted content (e.g., one for child pornography, one for obscenity, one for out of scope) and in each section to clearly delineate the applicable laws/policies and the suggested actions to take. Then we finish it off with a discussion of material that should generally not be speedy deleted, like art, etc. Dcoetzee (talk) 15:49, 3 June 2010 (UTC)
I'm not quite sure how to do it either, but I think the first step is that the "Definition" section should be changed to "Definitions", with several terms placed under it, so that we can name various classes of content and at least know we are agreeing on what is included in each section. I'm thinking we should define ("define" in a broad sense, including references to other policies or to laws without claiming to fully describe what they cover):
- Sexual content
- Illegal content with a sexual appearance
- Obscenity (including "virtual child pornography")
- Photographs of identifiable persons uploaded without their consent
- Material not within the scope of commons
I think we may also wish to define notability, by saying that Commons doesn't define it, but defers to the Wikipedias and related projects to define it. (e.g. a photographer is notable if there is a Wikipedia article about him; and his freely licensed and legal works should not be deleted period, even if they're, well, Mapplethorpe photos).
By setting definitions first, we have a little more leeway to keep the current convoluted structure, because we can name things by the definition without creating ambiguities. We should also stick to the original design in that this document provides guidance only for sexual content, which we define as files of the types currently specified including the description text, including categories, present on the File: pages.
Note that one peculiar consequence of this definition is that categories are potentially within the scope of this document (though I think we should defer to advice already present in Help:Category), but text articles are not — solely because categories are defined by a Category:Etc. added to the image file. So Category:Gonorrhea is potentially affected (only in terms of what is put into it), but not Gonorrhoea. I'm not really sure how the second type of page is regulated, or how necessary it is, but IMHO it's just not something for this policy to figure out from scratch. Maybe that's very simple-minded, but with so many vaguenesses we need to make some conceptual divisions that are simply not arguable. Wnt (talk) 19:57, 3 June 2010 (UTC)
Sexy teenagers
[edit]What should be our position on pictures like this one: [14] (also shown right)? Per the current draft, there is no reason why an editor shouldn't upload this picture, even though the model clearly seems underage. --JN466 01:45, 4 June 2010 (UTC)
- So now only pictures of women with gray hair and wrinkles are allowed? No thought allowed? There's no reason an editor shouldn't upload this picture, because any editor who looks twice will know that it comes from a reputable porn site that wouldn't upload pictures of underage women.--Prosfilaes (talk) 01:51, 4 June 2010 (UTC)
- Hmm. Original here. The copyright notice has been cropped out in our version. The file may be iffy from a copyright point of view. But copyright considerations aside, what makes you think the model is 18? Looks a few years younger than that to me. The image is not "sexual content" as presently defined in the draft, but I'm sorry, it's too close to paedophilia for me to feel comfortable with our hosting it. --JN466 02:10, 4 June 2010 (UTC)
- I'm sorry, but the image is from a reputable site which publishes the USSC or whatever they are info, and copyright-wise we have OTRS proof that the flickr account belongs to SG, and if they choose to publish them freely that's fine. Your opinion on a model's age is really quite irrelevant. -mattbuck (Talk) 02:23, 4 June 2010 (UTC)
- I'm likewise sorry, but I don't know what you're talking about. What's USSC? And do SG have a statement on the age of their models, and if so, could you link to it? And does the age of the model matter or not? In other words, if the model were 14, would we host the picture? --JN466 02:29, 4 June 2010 (UTC)
- The SG site's FAQ is here; 5.19, for example, states a clear minimum age requirement of 18. The model ("Loretta") and her work are listed on the site. --JN466 02:41, 4 June 2010 (UTC)
- I can't remember what the acronym for the record keeping requirement is, I thought it might be USSC, I guess not. Yes, the age of a model does matter, but I find it ridiculous that you believe a well-established, famous company like SG is actually a child porn site. -mattbuck (Talk) 02:43, 4 June 2010 (UTC)
- Matt, the image is not pornographic; it is not even "sexual content" as defined in Commons:Sexual_content#Definition. Per our present definition, there is nothing in the draft policy to rule out uploading similar pictures of minors. --JN466 02:53, 4 June 2010 (UTC)
- First, in no conceivable way is it pedophilia. Look the word up; she is in no way prepubescent. What makes me think that she is 18 is the fact that SuicideGirls is an established, reputable nude modeling company. I find the concept that we should go around censoring pictures of adult women with small breasts a little disturbing.--Prosfilaes (talk) 02:28, 4 June 2010 (UTC)
- They did that in Australia actually, and now it's apparently very difficult to get porn there with any actresses smaller than a D-cup. -mattbuck (Talk) 02:43, 4 June 2010 (UTC)
- Let's not quibble over words. The question is, if this model were 14, would you host the picture? --JN466 02:44, 4 June 2010 (UTC)
- But words matter, especially when people are throwing around very emotionally charged words without respect for their meaning. If she were 14, then we'd probably be better off using a different image for the same usages, though I wouldn't object to keeping it if someone could give an educational use that only it could fulfill, if for example that were a photo of Edna St. Vincent Millay at 14.--Prosfilaes (talk) 02:51, 4 June 2010 (UTC)
- If you think we shouldn't host images of 12-year-olds in this kind of attire and pose, then please think of a way we could formulate this in the draft policy. --JN466 02:54, 4 June 2010 (UTC)
- I don't have to. Even before this mess, I could have taken a picture like this of a child to DR and said "we aren't going to use this picture, because we have pictures of adults in similar attire and poses that are more appropriate. So it's out of scope and we should delete it." And then I would let it go and most likely people would agree that it should be deleted. You don't need elaborate rules unless you're trying to overrun consensus.--Prosfilaes (talk) 22:31, 4 June 2010 (UTC)
- If we are drafting a policy about sexual content, it should address difficult cases that editors are likely to meet; this is one of them. --JN466 18:31, 5 June 2010 (UTC)
- Human judgment beats dead protocol in difficult cases. We can't and we shouldn't even try to set up a bunch of rules for unprecedented cases; we should let the wisdom and discretion of those discussing the exact case at hand guide them.--Prosfilaes (talk) 12:33, 6 June 2010 (UTC)
- This has to be clear: 1) nudity is not pornography; 2) nude depiction of underage people is not necessarily child pornography; 3) the argument "she looks underage" is irrelevant (many underage girls look older than 18 and many young women look younger than 18); 4) there's no reason to think that such a picture is illegal when it comes from a legal (and more or less famous) company. --TwoWings * to talk or not to talk... 06:34, 4 June 2010 (UTC)
- The topic of underage beauty contests is a significant social issue, whether the contest occurs as a walk-on or via the actions of fashion magazines. It is clear that pedophilia has a wide penumbra in our society — it is treated as if it were normal for porn seekers to be hunting always for "barely legal" models; very few sing the praises even of 25-year-olds who have filled out to basic maturity, let alone of those older. Have you ever seen anyone try to do a steamy sex scene in a movie with people over 50? Yet that is what natural sexuality looks like.
- But that said: preteen beauty contests are held, and they're held in public, and aired on television; and whatever a photo like that may do to incite pedophiles I don't think there are any specific laws about it. So it's not a sexual content issue.
- That said, the expansive Commons:photographs of identifiable people probably does have something that could be applied to it, because someone under 18 (if she is) can't (I think) legally consent to a photograph. I don't want that to automatically mean removal because such a doctrine would apply to just about any picture with a kid in it, regardless of what it was; but a certain amount of consideration should apply. I don't approve of that guideline at all (for example, I think that its precedent of banning a straight picture labeled "an obese girl" is offensive to the obese, blocks Commons use of images, and sets a precedent of editor value judgments over any possible description of a picture). But if I were to adapt the guideline as I see fit, I would say that if the girl pictured in this section made a complaint about the picture, then it would be a "potentially derogatory or demeaning" picture and could be handled in the ways suggested by that guideline.
- Oh, and the point about the psychiatric distinction between pedophilia and hebephilia is recognized; but I don't find it persuasive. The psychiatrists have tried to define the sickness to match the criminal law, but I don't believe it. Wnt (talk) 06:58, 4 June 2010 (UTC)
- Heh, didn't I just see this picture on Wikipedia Review? What a coincidence... TheDJ (talk) 12:10, 4 June 2010 (UTC)
- The difference between pedophilia and hebephilia has nothing to do with criminal law. It has to do with the fact that biologically speaking, fertile females are fertile females, so hebephilia is evolutionarily normal and pedophilia isn't. Psychologically speaking, the treatment and prognosis of the two syndromes is entirely different. On the other side, sex between people over 50 is not normal sexuality; without estrogen, sexuality declines quickly after w:menopause.--Prosfilaes (talk) 22:31, 4 June 2010 (UTC)
- This assumes that the role of sex is purely reproductive; but consider that humanity's closest relatives include bonobos. Wnt (talk) 12:42, 5 June 2010 (UTC)
If we are going to make the scope of this policy include images of nude or half-dressed children (or dressed children in arguably erotic postures), we will not be finished this year. Let us confine ourselves to sexual content. If the above-linked picture is child pornography (by legal definitions) and therefore illegal, it must of course be deleted. If it is out of scope for Commons, then let's delete it because of that. We do not need any discussions here for either.
What we certainly seem to need to discuss is how to handle administrators thinking something is sexual content that is out of scope or illegal. In what cases do we need or want speedy deletions? How can we guard against abuse of that procedure?
--LPfi (talk) 07:14, 4 June 2010 (UTC)
- Comment This shows me how easy it is to say "oh no underage" when the girl is young. In this case we have a good source but often panic strikes when someone says "underage". I bet some admin would have speedied it if {{speedy|underage}} had been added to the image. Therefore I think speedy deletions should not be made. --MGA73 (talk) 09:47, 4 June 2010 (UTC)
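For reference, the tagging MGA73 describes is a single line added to the top of the file description page; the reason text is free-form, and the exact wording below is only an example:

    {{speedy|appears to show an underage model; no age information from the source}}

Saving that edit puts the file into the speedy-deletion backlog, where an admin either deletes it or declines the tag (and may open a regular DR instead if the case is not clear-cut).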
- Fully support LPfi's statements TheDJ (talk) 12:14, 4 June 2010 (UTC)
- If the person in this photo were under 18 (which she isn't) then we should probably delete it as a precaution, as courts have found that sexually suggestive photos or movies can violate child pornography law, even in the absence of nudity. On the other hand this would not preclude images of teenagers who are merely physically attractive, say a cheerleader in a miniskirt, if the photos are not designed to appeal to prurient interests. Issues of scope and copyvio are separate. Dcoetzee (talk) 13:15, 4 June 2010 (UTC)
- We could add something about this to the draft; perhaps in the "Prohibited content" child pornography subsection, or under "Handling of sexual content". I like what you wrote above: "Courts have also found that sexually suggestive photos or movies of minors can violate child pornography law, even in the absence of nudity ...". We could recommend speedying in the absence of traceable model age and consent information. Are you aware of any relevant court decisions that one could stick in a footnote? --JN466 16:03, 4 June 2010 (UTC)
- Please stop this panic mongering. Courts have found that David Hamilton was fine. No need for speedying anything. /Pieter Kuiper (talk) 16:14, 4 June 2010 (UTC)
- I agree that ambiguous cases should be considered via deletion discussion, not with speedy deletion. Dcoetzee (talk) 17:51, 4 June 2010 (UTC)
- Fine. How about: "Courts have also found that sexually suggestive photos or movies of minors can violate child pornography law, even in the absence of nudity. Ambiguous cases should be nominated for deletion and community discussion."
- We'd still need some relevant US court decisions; without that, the statement "Courts have also found ..." lacks verifiability. --JN466 18:46, 4 June 2010 (UTC)
- I think that's a mistake. This page starts off with a definition of sexual content, and is supposed to be about sexual content. So it shouldn't be suggesting rules for non-sexual content deep down in the text. If we can define which non-sexual content, as currently defined (and presumably also not obscene), could nonetheless be child pornography, then we might consider expanding our definition of sexual content; but otherwise any such warning (even if it prescribes no specific action) needs to go in some other, more general policy.
- There are cases like this, [15][16][17] but they are contemptible injustices and I hope they are or will be resolved conclusively as unconstitutional.
- The issue I see here is that when criminal officials try to charge people with "child pornography" for content that doesn't even involve genitalia, it's absolutely impossible to predict what they'll go after. I mean, diaries, "sexting", fantasies... you could no more predict what non-sexual content will be prosecuted as child porn than you could predict which Everybody Draw Muhammad Day artist will be murdered by Islamic fanatics — an action of the same legal and moral legitimacy, and of greater concern to Wikimedia. Wnt (talk) 20:50, 4 June 2010 (UTC)
- (outdent) - http://mydeathspace.com/smf/index.php?topic=14624.0 - Stillwaterising (talk) 00:19, 5 June 2010 (UTC)
- Trust everything you read on the Internet. Of course, the site I linked to is an actual society, and you're linking to a bulletin board where everyone is completely anonymous, short of a subpoena. "IFeelBadAboutThis" says that a friend of a friend (no name or SG pseudonym given) posed when she was 17 for SG. That's a very credible and verifiable source, yes siree.--Prosfilaes (talk) 00:50, 5 June 2010 (UTC)
- Graphic evidence: http://community.livejournal.com/sgirls/507238.html - Stillwaterising (talk) 01:50, 5 June 2010 (UTC)
- Ever heard of Photoshop? Even if the screenshot is real, it's evidence that someone typed that she was 17 into a free-form text box. We have no context at all here; it's at least possible it was a joke or to attract attention. I've seen 9/11 Truthers be more critical about their evidence than you.--Prosfilaes (talk) 02:04, 5 June 2010 (UTC)
- Huffington Post: http://www.huffingtonpost.com/2010/03/19/terry-richardsons-predato_n_505708.html - Stillwaterising (talk) 02:58, 5 June 2010 (UTC)
- Oh, that's cute. The Huffington Post is such a responsible news source that they'll tell you she was 17, but forget to mention what the original source[18] says, that she was "armed with "a perfect fake" ID". I'm not sure what anyone can do about that. In any case, one article, from a barely reputable news source, is hardly overwhelming evidence.--Prosfilaes (talk) 03:15, 5 June 2010 (UTC)
- I have presented a preponderance of the evidence, if not clear and convincing evidence, of underage girls modeling for SG. - Stillwaterising (talk) 03:45, 5 June 2010 (UTC)
- Take the case to court instead. Whether or not SG is a responsible group or not is not relevant for this policy. --LPfi (talk) 06:40, 5 June 2010 (UTC)
- @Stillwaterising: this girl doesn't seem to be naked to me! You have to keep in mind that eroticism isn't porn and that SG doesn't have only nudes; there are also non-nude pictures. Therefore, there could be another rational explanation for this profile, if it's real and not photoshopped: this model is a non-nude model. And that is perfectly legal too. --TwoWings * to talk or not to talk... 07:00, 5 June 2010 (UTC)
- The SG FAQ page I linked to above says you have to be over 18 to model for the site, no exceptions. It also says your application to become a Suicide Girl has to include a fully nude photo. --JN466 18:28, 5 June 2010 (UTC)
- Speedy deletion is (almost) NEVER a tool for that kind of "problem". Speedy should only be used when the case is very clear (obvious copyvio, for instance). And that's not the case with pictures of non-porn nudity. As [[User:Pieter Kuiper|Pieter Kuiper]] pointed out perfectly, David Hamilton's case shows that it's impossible to be strict and to determine that pictures of nude teenagers are child pornography (even if they could be considered erotic in that case). It's the same for Jock Sturges and probably many other similar cases. I understand and agree that such works may be ambiguous, but it perfectly shows that there's some kind of exaggerated paranoia whenever nude children or teenagers are shown! Another recent example of non-porn nude children and teenagers is the current exhibition "Sex: A Tell-All" at the Montreal Science Centre, where there are two things to meditate on: 1) this is a scientific exhibition with no taboos, aimed at people aged 12 and up (which means that it's normal to speak about or even show sexual acts to teenagers); 2) this exhibition features life-size pictures of nude people of many ages, including children and teenagers. Could some people be excited in front of these pictures? Yes, as with any nude picture, if they match the viewer's sexual inclinations. Would that make this museum a supporter of pedophilia? No. --TwoWings * to talk or not to talk... 06:54, 5 June 2010 (UTC)
- I think some of the above strays from the topic a little. From our point of view at Wikimedia Commons, we can't be expected to recognize the true original source of every uploaded photo and track down innuendo about it on web forums and new media. If the photo above were sexually explicit material, then (assuming we agree it is not certain the subject is over 18) the current policy wording says that the person who uploaded it "will be required to clarify, and possibly evidence, the consent and the ages of all involved." Now this seems to me a compromise wording that may end up being taken to mean a variety of different things, but no interpretation really gives either absolute confidence that the subject is actually over 18 or any great assurance that widely disseminated public domain material won't end up being challenged here. I say this because I just don't believe that a photocopy or scan of an identification card is immune to forgery, and because the better-known and more widely distributed an image is, the harder it will be to contact the original creator for special guarantees. Wnt (talk) 12:36, 5 June 2010 (UTC)
- I fully agree with that Wnt. If there is a problem with a specific model, or a persistent problem with a website, then we will get a report and we will deal with the case, just as we do with copyright violations by NASA or flickr. TheDJ (talk) 14:07, 5 June 2010 (UTC)
- I know and I agree, but I thought it was necessary to say true things about that kind of subject (nude images of minors aren't necessarily illegal). I also want to insist that the argument of appearance ("she looks underage") is almost never valid. --TwoWings * to talk or not to talk... 16:36, 5 June 2010 (UTC)
- For reference, we are using a very similar argument ("participants who appear to be under the age of 25 will be required to clarify, and possibly evidence, the consent and the ages of all involved") in the current draft. --JN466 18:25, 5 June 2010 (UTC)
- Well, I'm clearly against that! How do you define the appearance of a 25-year-old person? That doesn't mean anything! And why 25? This is nonsense! --TwoWings * to talk or not to talk... 09:37, 6 June 2010 (UTC)
- Both are value judgements. Though I prefer the "looks underage" one, simply because it is shorter and doesn't try to set bounds that are impossible to define. TheDJ (talk) 12:47, 6 June 2010 (UTC)
- "Agree with "Why 25?". 18 is young. A girl might have small breasts but that doesn't mean she isn't an adult. Some people even like their porn that way so it is expected that girls of age will slap up cut outs of flowers behind them, wear boy shorts, put in pigtails, take out their piercings and so on to fit in that genre. If the only reason is "she looks young", then we shouldn't suspect that the source is not being honest then there should not be a problem. Blacklist certain sources if they get nailed by the press or authorities but requiring every image of models that look young to pass extra criteria is lame not necessary and puts us on the wrong path.Cptnono (talk) 00:01, 26 June 2010 (UTC)
Several inconsistent recommendations for evaluating whether a Commons upload is child porn
[edit]One problem I see in the current draft is that we tell people a lot of different things that they can do if they find child pornography on Commons, but we don't really say whether all should be done at once, or one then another, or whether it depends on the degree of concern; nor do we say for sure whether the standards that (sort of) define each option apply to the others.
Admittedly I created much of this problem personally by adding the stuff about how a "genuine" case (which someone changed to "suspected") should be reported to the WMF, and that a report that could benefit "exploited children" should go to the NCMEC. I wasn't eager to overload the WMF with reports about anime and such, nor does it seem right to bother NCMEC (whose report page and very name speak of exploited children) with reports of 19th-century photos that obviously can't lead to any rescues.
Whatever the reasons, we now have a clash between telling people to report suspected child porn (i.e. where subjects may or may not be over 18) to the WMF, to NCMEC, and proposing it for speedy or non-speedy deletion (currently there is some debate about which).
First, I don't think that we can ever responsibly tell people not to contact the NCMEC or any other law enforcement, because by doing so we could put Wikimedia at risk of being treated like the Catholic church. Everything on this project is public, and the police are part of the public. So the NCMEC link should at least be offered as a first-line reporting option - we can't tell people to ask the WMF and wait before contacting NCMEC, so it should never be listed under a "2)".
Next I think a major concern we should have is that a proposal for deletion could publicize a piece of child porn. For example, if photos of a high school student end up being uploaded here by an ex-lover or an ill-considered prankster, it is possible that Wikimedia Commons might have the only copy on the internet sitting unnoticed on our server. If we act discreetly we might be able to keep it from becoming further distributed. So for any real child porn I think we may be best off telling people to contact WMF and wait for a response before proposing any kind of deletion.
I think this logic affects the deletion/speedy deletion debate for these items. I think those proposing speedy deletion picture it as a first-line mechanism to cover up the content. The problem is that we have all kinds of people watching the speedy requests, perhaps including groups opposed to our organization as a whole, and perhaps including some who actually want to collect child porn. Like rabies breaking out of an endosome, the content could spread from our very attempt to delete it.
The remaining question is what to do with the "looks under 25" stuff. Where did this 25 come from? Does Wikimedia want reports of stuff that looks under 25? If they don't, who is going to demand or accept statements about age? If we do it ourselves, it could involve publicly putting a name to photos that may be of isolated body parts of people who don't want the recognition. Wnt (talk) 16:55, 5 June 2010 (UTC)
- These are good points. In theory, discreet action via WMF or OTRS would be preferable to an 8-week deletion discussion on whether and why something is child pornography or not. However, the COM:OTRS page that we link to in the draft says, "OTRS is currently backlogged and may take a month before someone responds." --JN466 18:40, 5 June 2010 (UTC)
- Hmmm. Is that also true about the contact information page I added? They have phone numbers and snail mail addresses also - are they backlogged? If there's no way for someone to contact WMF about child porn and get an answer this month then we have problems, because someone else might go in the building and turn off the power for them. Wnt (talk) 21:32, 5 June 2010 (UTC)
- The permissions list (which is the one that matters for most Commons work) is hideously backlogged most of the time (over 500 pending at present). There are a number of OTRS queues; if there isn't an appropriate queue for this issue, one could be created, I suppose. info-en may be suitable, but as most people on that queue are en.wp admins and not Commons ones, they can't handle it directly.
- If there are actual legal problems, there is an email system for that too.--Nilfanion (talk) 21:51, 5 June 2010 (UTC)
- I see that urgent-en is a subqueue of info-en. I'm going to take that to mean that the main e-mail address in the Wikimedia contact page, info@wikimedia.org, is the one that people should contact for urgent matters, with some e-receptionist getting the e-mails sorted that direction pretty fast. I am persuaded now that the "permissions" address currently listed for people complaining about photos of themselves should be replaced with this info@ address, since otherwise it's going to the wrong place and it could be a month until someone knows this. Wnt (talk) 22:05, 5 June 2010 (UTC)
- Is it not possible to use a template similar to the Wikipedia copyvio one, which blanks the page until someone checks it out?
- It would be in some ways self-regulating, as someone who puts them onto lots of images would be seen on the list as having an agenda and so be looked at. Chaosdruid (talk) 15:02, 11 July 2010 (UTC)
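A very rough sketch of what Chaosdruid suggests, with a hypothetical template name and tracking category invented here for illustration (en.wiki's copyvio template works along these lines for article text). Note that on a File: page such a template could only replace the description wikitext and add a tracking category; the uploaded image itself would remain visible until an admin acts:

    {{age-concern|reason=model may be under 18|date=~~~~~}}

The hypothetical template could expand to a notice plus a category, something like:

    <div class="boilerplate">The description of this file has been blanked pending review of the subject's age and consent. See [[Commons:Sexual content]].</div>
    [[Category:Files pending age review]]

The tracking category would also provide the self-regulating list mentioned above: anyone tagging large numbers of files would be visible there.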
Last sentence of current draft
[edit]The last sentence of the current draft reads, "The deletion of content for any reason does not indicate any community consensus that the content actually is or should be considered obscene." Can we delete or rephrase the sentence? It is not clear to the reader whether we are talking about the reference community in the Miller test, or about the Commons community. If it's the former, it's nonsensical; if it's the latter, then we are implying that our standards are different from those of the general population: neither is desirable. --JN466 19:08, 5 June 2010 (UTC)
- I think our standards are different from the general population. Wikimedia contributors are generally well-read and literate. Consider the difference in point of view between the w:American Library Association and the American legislature, for example. But that's not really relevant; we should just make clear that just because we don't want to or are afraid to host material on the server, that doesn't mean that we're saying it ought to be illegal. Wnt (talk) 21:29, 5 June 2010 (UTC)
- [19]. Does that read okay to you? --JN466 20:39, 7 June 2010 (UTC)
- Well, I'd still leave out "necessarily", because I really don't think that it would ever be an indication that people think something should be obscene, and because in any case it isn't necessary to use it when you're just saying "does not indicate". At least we're down to one word. Wnt (talk) 21:30, 7 June 2010 (UTC)
Spamming NCMEC?
[edit]I ran into mention of an external thread at w:WP:Content noticeboard, located at [20], where a participant named as stillwaterising posted,
...
- "Here's what I encourage people to do:
- 1. Read and participate developing the Commons:Sexual content guidelines. Policies set by Commons affect ALL projects, not just Commons.
- 2. Report any (on or off-wiki) images of apparent (no proof needed) child pornography/erotica to the NCMEC. This included drawing/illustrations/and cartoons regardless of date of manufacture.
- 3. Report any images found on WMF to the Foundation (here)
- 4. Contact the lawmakers and policy makers (OPTF) and help get the laws changed.
- 5. Last -- stop attacking the messenger! If you want to "go after" somebody go after the person(s) rescpondible for Cock and ball torture (T-H-L-K-D)."
...
Personally, I think that it is a really questionable idea to hit NCMEC with reports that don't actually pertain to "exploited children" who can be identified, investigated, and returned, one hopes alive, to their families. I didn't much dispute the expansion of the wording before because I think that most people think carefully before actually initiating a police report, but stuff like this gets me worried. If Wikipedia Review starts hitting NCMEC with repetitive complaints about drawings and 19th-century pictures of naked children and so on, they risk diverting police resources and leaving real children unprotected. Meanwhile it is possible that Wikimedia Commons could end up being blamed for this waste of resources, leading either to some inappropriate action by irate prosecutors, or real complaints being lost in the noise if true child pornography actually turns up. Wnt (talk) 03:27, 11 June 2010 (UTC)
- Honestly, if WR spams NCMEC to the detriment of actual children, they're soon going to find themselves on the wrong end of some legal penalties, and that's their responsibility. If serious problematic images turn up, then I'd hope WR would inform us so that we can remove them, but it is their prerogative to report the issue directly to NCMEC if they want (who, in turn, will tell us to remove them, so it just ends up wasting more people's time than it otherwise would have). One cannot base policies on the fear that an irrational third party will take destructive action - that is the very essence of e.g. the Muhammad cartoon controversy. Don't let WR intimidate us. Dcoetzee (talk) 08:23, 11 June 2010 (UTC)
- NCMEC is a clearinghouse accepting reports pertaining to many types of violations of child safety laws. This includes drawings, illustrations, and cartoons regardless of date of manufacture. There's no requirement of real children (or teens) being involved. Apprehension of the perpetrators can often prevent future victimization. - Stillwaterising (talk) 01:35, 12 June 2010 (UTC)
- Drawings, illustrations and cartoons are very likely to have artistic value that stops them from being legally obscene, particularly older ones. Wasting their time on completely legal images is likely to piss them off--I hope it does. If you're worried about child safety, I'd go after some of those big hard-backed collections of cartoons; they could kill a small child if the child was hit hard enough with them. I don't know what perpetrator you're planning on stopping by telling them about some 100-year old French drawing.--Prosfilaes (talk) 03:03, 12 June 2010 (UTC)
- Don't you know that even reading Lolita functions like a gateway drug and inevitably leads impressionable Americans to kidnap their neighbor's children for nefarious purposes? Why, the connection could hardly be more direct. Just like readers of C.S. Lewis so often die by foolishly trying to befriend lions.
- There really are children out there being abused (sexually and otherwise). And if NCMEC is wasting its time on this sort of thing instead of on what might actually help them, and if people encourage them to go farther in that direction, how tragic for the actual victims. - Jmabel ! talk 16:07, 12 June 2010 (UTC)
Desysop of admins that speedy images because of scope
[edit]I have made comments above regarding the speedy deletions because of scope. Now I see that the text includes this:
"Admins making such speedy deletions when it is clear that they are disputed, or restoring content without consensus which they should know to be illegal for Wikimedia to host, are subject to censure at Commons:Administrators' noticeboard/User problems or even Commons:Administrators/De-adminship."
With this addition I can accept speedy deletions, because it makes it very clear that if an admin deletes images that were not out of scope too many times, then they will be desysopped. I think we should make a note of that on COM:AN because it may come as a surprise to some admins that wrong speedy deletions will result in a desysop. --MGA73 (talk) 17:50, 11 June 2010 (UTC)
- It won't come as a surprise to me; it will come as a resignation. If I see a load of bad penis images, I delete them - end of story. Assuming that the edit has not been approved by the community, it should be removed. --Herby talk thyme 17:57, 11 June 2010 (UTC)
- Well, has the community approved speedy deletions of nude images because of scope? Official policy is that scope deletions are done in DRs. So I think it is good to have this warning. If the penis images really ARE bad then I doubt we will have many sad faces, but if some admin deletes ALL new penis images without checking whether they are special or better than existing ones, then I think we should consider a desysop. --MGA73 (talk) 18:43, 11 June 2010 (UTC)
- I agree with MGA73. Kameraad Pjotr 18:57, 11 June 2010 (UTC)
- The community only has to say the word and I'll go. The job is a "cleaner" - I've cleaned up around 25,000+ times so far - some I will have got right, some wrong - I am human. --Herby talk thyme 19:19, 11 June 2010 (UTC)
- I see this section more as a safeguard against an admin going "rogue", i.e. deleting controversial images ("the grey area"), where a DR should really be required. Kameraad Pjotr 07:09, 12 June 2010 (UTC)
- There is or should be an equivalent of Ignore All Rules at Commons - i.e., the rules say that images that are out of scope never should be speedy deleted, but if an admin deletes something that would have been deleted anyway I wouldn't necessarily impose sanctions on them. The important thing is that they accept responsibility and possible sanctions if it turns out the image shouldn't have been deleted. Dcoetzee (talk) 23:27, 11 June 2010 (UTC)
- I don't think that people should necessarily be desysopped for it - given a smack with the fish of their choice, sure, but speedying stuff which was clearly going to be deleted anyway isn't a bad thing. -mattbuck (Talk) 23:37, 11 June 2010 (UTC)
- I originally wrote up this section as a bold idea of how to keep the peace, not really expecting it to go unaltered for this long. I agree that before this can become true policy, it needs to be reconciled with COM:SPEEDY. For example, I was envisioning this document eventually becoming a "policy supplement" overall, so this section would become part of policy by virtue of being referenced in COM:SPEEDY. The overall question of whether this is to become policy, guideline, or supplement remains open so far as I know. Earlier discussion mentioned the possibility of a sitewide notice and vote once this proposal is in shape for it, so admins should learn about it in any case. Wnt (talk) 06:19, 12 June 2010 (UTC)
Must be removed until it becomes policy then. Put it in a proposal. Otherwise I am with Dcoetzee --Herby talk thyme 07:34, 12 June 2010 (UTC)
- But this page is a proposal, and will probably still see considerable revision before it goes to a vote. Wnt (talk) 12:38, 12 June 2010 (UTC)
- One thing that does concern me is that deletion discussions sometimes drag on for 4 months before they are closed. We cannot discuss every penis picture for that amount of time; if someone like Herby deletes those that are clearly crap without the interminable bureaucracy, I am all for it. You will always get someone arguing that a picture should be kept; so asking that a deletion should be "undisputed" is perhaps too strong a standard. If an admin really does go "rogue", an RfC would probably do; and if the community consensus is that they have been too enthusiastic about deleting, they can be asked to abide by that RfC's decision, on pain of desysop. --JN466 16:06, 12 June 2010 (UTC)
- There are, in fact, two reasons why DRs stay open for ages. The first and most important is a lack of admins willing to close them. The second is that some deletions are controversial, or have huge consequences. Speedying might do something about the first, but to some users it will look like you want to prevent them from being discussed, which will lead to flames and the like. Regarding the second reason, there is not much you can do about that, and speedying is certainly not an option for that kind of image. I would suggest being on the safe side and always opening a DR (except in really obvious cases, "admin's discretion" (with severe consequences if abused)). Kameraad Pjotr 21:28, 12 June 2010 (UTC)
- I understand what you're saying. I have no idea what to do about the first problem. As for the second, I would thank any admin who speedily deletes something like File:A American Lady in the park .JPG (it is deleted already, but I assume you can still see it). --JN466 02:06, 13 June 2010 (UTC)
- I must not have written this very clearly. The section that mentions loss of adminship was supposed to refer only to admins who on their own prerogative delete material speedily because they think it is illegal, even though it falls into the criteria that were laid out above for material that shouldn't be speedily deleted, and only if they do so after it becomes apparent that there isn't consensus for the type of deletions they're making. Likewise the de-adminship of people restoring content unilaterally was only if they "should have known it was illegal", which is meant to be a pretty tight standard also. This was meant to keep the dispute within bounds if another dispute like the one in May broke out, but not much more. Is there a way to make this clearer? Wnt (talk) 02:30, 13 June 2010 (UTC)
- JN, either you gave us a bad link or it was oversighted. -mattbuck (Talk) 02:39, 13 June 2010 (UTC)
- Perhaps this will work better: [21]. The DR is here. As it happens, it was closed quickly. --JN466 02:49, 13 June 2010 (UTC)
- Comment I doubt a DR for a low-quality penis image will be open for 4 months. If there is nothing controversial in a DR, it is closed shortly after 7 days. Should a penis image stay up for 2 weeks or even 4 months, it is no disaster. Copyvios and illegal stuff are much worse, and if we really care about the best for Commons we should do something about that. We have a lot of bad stuff in Category:Media needing categories and Category:Unidentified logos. So I see no reason why a penis image is much more important than copyvios. So even if we send all penis images to a DR and stop speedy deleting them, it would not result in chaos.
- As for the desysop of admins, I think there is no reason to "make it more clear". The rule is "You can speedy if it is illegal or a copyvio, but don't speedy because of scope". IF an admin thinks (s)he is so good at judging when a DR will end as delete that (s)he chooses to speedy anyway, then (s)he must also accept being desysopped if the judgement proves bad in more than a few cases. We should not keep admins who time after time make bad judgements. If an admin is not 100% sure, it is very easy to start a DR. There is NO reason to speedy images when in doubt. Just start a DR and you will be sure. --MGA73 (talk) 17:48, 13 June 2010 (UTC)
- I disagree, sometimes speedy out of scope is perfectly warranted. There's no point letting crap hang around, clutter up the DRs, create more work for other people and cause the DR page to reach transclusion limit. -mattbuck (Talk) 18:21, 13 June 2010 (UTC)
- I say "There is no reason to speedy when in doubt and you say you disagree. That does not make sense to me. Why should admins speedy if they are not sure? --MGA73 (talk) 16:20, 16 June 2010 (UTC)
- I'm sure there are backlogs in other, equally (or more) important areas as well. But as for penis pictures, see this deletion request for an unused penis picture, which has been open for more than two months and shows no sign of being closed, and the two just after that, one of which User:Bastique clearly felt should have been speedied (I have no reason to doubt his judgment); yet each attracted more Keep than Delete votes. I'm just saying ... --JN466 21:41, 13 June 2010 (UTC)
- I think this is open for several reasons. The first reason is the recent "war" on Commons; I doubt many admins are happy to close DRs on anything nude at the moment. The second reason is that the rationale for the nomination is not very good. There are no links to possible replacements. The third reason is this comment added in the DR: "I see only two similar photos of penis (same angle, papules, uncircumcised)", which tells us that there is no reason to delete and certainly not to speedy. --MGA73 (talk) 21:54, 13 June 2010 (UTC)
- We have an excellent picture of an uncircumcised penis featuring hirsuties papillaris (File:Hirsuties papillaris coronae glandis.jpg), which was a featured image candidate and is used in a dozen projects, and does not feature the photographer's hand clasping his penis ... and as for the other two deletion discussions I pointed to, they were apparently a waste of everyone's time, as the closing admin took no notice of the votes. --JN466 22:19, 13 June 2010 (UTC)
- So the fact that some admins don't give a damn what the community thinks and force through their own will justifies the use of speedy deletions? These two DRs were closed during "the war", when Jimbo said "desysop admins that undelete images". The community did not agree that it was junk. It just happens that no one wanted to start a new war to get these two images undeleted. Please note that Jimbo was later desysopped. I can also find examples of images that were speedied and later undeleted, and I can also find examples of images that were kept after a long discussion. So you cannot say that DRs are a waste of time. IF the image is really bad, it will be deleted soon.
- And yes, we have one or two other images that could be used instead. But since when do we have a rule that says we can have only one or two images? This issue is all about the fact that some users want to delete as many images of penises as possible, or perhaps even get them banned. I say we do not need special rules for penises. I did a search for "Bill Clinton" on Google and it gave 15,700,000 hits. The word "Penis" gave 50,900,000 hits. The top scorer, "sex", gave 757,000,000 hits. If we use that as an indicator, it shows that we should have many more images of penises than of Bill Clinton, and even more of sex. So even if we have 1,000 different images of penises, I do not think it is a problem as long as they are not "web cam shots".
- As you might notice if you read the policy, speedy deletion because of scope is not allowed. What we are discussing here is whether it should be allowed for some types of images. The fact that admins do not follow the existing policy tells me that we should be careful not to allow even more deletions. Some admins even deleted images in use as "not in scope". I argued above that we should not allow speedy deletions for perhaps six months, to allow a practice to be established. In six months or so, when a "standard" on what to keep and what not to keep has hopefully formed, we could perhaps allow speedy deletions. --MGA73 (talk) 16:16, 16 June 2010 (UTC)
This thread has diverged from the meaning of the text as it stands (particularly in light of Wnt's comment above). The paragraph in question is firstly about how admins should proceed in cases where they think the work is illegal (i.e. the subject is underage, etc.): they should delete as a precaution and then post it for review. That is an appropriate course of action, as it means the suspected "child porn" is deleted until we confirm it's OK. It also tells the admin to stop if their "OMG child porn" instinct is clearly paranoid. It probably should be copy-edited to make that part of it clearer.
The remaining parts of the paragraph, and the substance of this thread, are nothing new: If admins ignore rules and get it right, so what? If they make mistakes, give them a slap (and revert the errors). If they keep on getting it wrong, despite repeated slaps, consider a de-sysop. (This, and the text on the page, is very different from "wrong speedy deletions will result in a desysop"). In the event of a truly rogue admin, I'd imagine there would be an emergency desysop, possibly with stewards warned via IRC, and once the potential for harm is gone it would be reviewed (and the desysopping endorsed or reverted).--Nilfanion (talk) 18:14, 13 June 2010 (UTC)
Brabson v. Florida and certain Commons images
[edit]Commons currently maintains a number of photographs (mostly titled "Körper des Kindes" i.e. bodies of children, filed under Wilhelm von Plueschow) that were taken by a photographer, w:Guglielmo Plüschow, during the more relaxed sexual atmosphere of the Victorian era. Plüschow is apparently rather famous, the topic of some ten books indexed in his Wikipedia article, and his works appear in a public museum and are apparently regarded as having artistic importance.
According to the Wikipedia article he was also a convicted child molester, but in accordance with the laid-back sexual mores of the Victorian era he was sentenced to only eight months in jail for "common procuration" and "seduction of minors" before continuing on his merry way through Europe.
The photos he produced have survived a large number of deletion proposals, including a brief deletion in the May 2010 purge followed by undeletion. The discussions have generally favored retention.[22] Besides the artistic importance, the photos depict the innocent conduct of Victorian children swimming in the nude, as was typical of the time — you might fairly classify them as "ethnographic nudity".
The issue that comes up in the Brabson v. Florida case, as explained in briefs linked from w:Dost test, is whether a cameraman being a pedophile can make his work child pornography. In that case, a swim coach (oddly enough) devised a plan to take his girls to a special changing area for individual evaluation, in which he had planted a videocamera. The idea seems to be that a film of children changing per se might be "innocent conduct" rather than child pornography, but the fact of him being a pedophile turns it into a "lascivious exhibition of their genitalia" (or not).
This case is currently outstanding in Florida, the jurisdiction where Wikimedia keeps its servers.
Now I recall that earlier Mike Godwin offered to give legal advice where community consensus supported it. I think this may be an example of a case where such advice may be needed. Wnt (talk) 17:30, 13 June 2010 (UTC)
- This is interesting, but I'm not sure what action you're recommending. I'm sure you don't want to delete the work of Plüschow (again). Are you suggesting a change to this proposed policy based on Brabson v. Florida? In my experience, Mike Godwin will comment on particular images but is very reluctant to make meaningful policy recommendations, particularly in areas of legal uncertainty such as this. Dcoetzee (talk) 20:39, 13 June 2010 (UTC)
- I don't want to delete the works of Pluschow, but I don't want Wikimedia to actually get in legal trouble for hosting child pornography either. I would hope that Mr. Godwin will give us some standard we can use to say that this is protected freedom of the press; or barring that, that it is relevant to the upcoming decision and perhaps something that he could file an amicus curiae brief about to try to protect Wikimedia Commons in the decision, citing this example; but at worst, as the Mormons once said, it is better to lose a doctrine than to lose the whole church. Though since we can almost never really know the motives of those taking pictures to put in the public domain, the loss could be substantial. Wnt (talk) 21:13, 13 June 2010 (UTC)
- To be clear, I wasn't suggesting that Godwin comment on anything other than the legal status of the Plueschow images, and whether it could be affected by the Brabson v. Florida decision. Wnt (talk) 16:13, 14 June 2010 (UTC)
- I can understand the need to protect Commons and Wikimedia, of course, but I'm not sure I understand the logic of this. If the argument is that the photographs were taken by a child molester and are therefore, for that reason alone and no other, deemed to be "child pornography", does that mean that pictures such as this one by Pluschow would have to be deleted as well? How can a picture which is clearly not pornographic be judged to be pornographic simply because of the person who took it? This seems illogical, to say the least. And in any case these are historical pictures. Let's say it becomes proven that a famous painter of the Renaissance was actually a child molester - say some old court records turn up - and that his work includes pictures of children; do we have to delete copies of them, if we have them? Anatiomaros (talk) 23:18, 14 June 2010 (UTC)
- I think it's absurd, legally. The main logic behind it is that they want to punish this coach, and either do not have laws on surreptitious changing room filming, or they don't think they're strong enough. (Given that, I doubt this will have any practical effect on historical images.) I think the argument would be that the author of a work changes what it is, a theory that you can load literary crit scholars up on both sides of. I certainly look at these pictures differently knowing that about Plüschow. In any case, I can't see this affecting any work that one couldn't argue that that knowledge pushes it over the line into child porn.--Prosfilaes (talk) 00:07, 15 June 2010 (UTC)
- Even so, according to the defendant's brief,[23] the Second Circuit court in Florida overturned the trial judge's decision to throw the charges out: "It held that the 'lewdness' necessary to make an exhibition of the genitals 'sexual conduct' under 827.071 'may be satisfied by the intent of the person promoting' the child's performance." Earlier I thought this was from December 2009, but I see this brief was actually from January, so perhaps the verdict (case SC09-136) is already in and we can move on. I couldn't find it by simple web search, but I don't have access to the elite subscription services - can you? Wnt (talk) 04:04, 15 June 2010 (UTC)
Shall we take the plunge?
Perhaps we should present this draft to the community now. While I am sure it is not perfect, no one has edited it for a week or so, and it could probably do with fresh eyes looking at it. Thoughts? --JN466 11:41, 16 June 2010 (UTC)
- I was game for this to be up for public review weeks ago. Tabercil (talk) 12:12, 16 June 2010 (UTC)
- Agreed with that, although I reasonably expect new people to ask for new changes. Dcoetzee (talk) 20:47, 16 June 2010 (UTC)
- I'd expect that too; it might be a fruitful discussion to have at this point. --JN466 21:47, 16 June 2010 (UTC)
- One problem that remains is that it still isn't clear what this is being proposed as. The draft says only "policy, guideline, or process", and the last time it was discussed ("Moving forward on adoption") I don't think there was much agreement as to which. If you call the community in to vote, they'd better have something that they can say yes or no to, or at least choose from a defined list of options. In the meanwhile, perhaps it would be better to recruit people from Village Pump or other forums, or from Wikipedia's Village Pump? Wnt (talk) 00:39, 17 June 2010 (UTC)
- Hmmm... I'd want this to have as many teeth as possible just to try and prevent a repeat of what caused this mess in the first place, so the higher up that chain we go the happier I am. I'd say announce this on the Commons' Village Pump for certain. As for possibly announcing on similar locations on the busiest sister projects (e.g., EN, DE, ES), I'd say do it if we reckon they'd bring more light than heat to the discussion. Tabercil (talk) 12:24, 17 June 2010 (UTC)
- I think that a simple pointer to this draft at the Village Pump might draw less attendance (it's been mentioned there before, after all) than a proposal for adoption as Commons policy. (I think it has to be a policy, because of the legal ramifications involved.) The announcement should be posted to the Foundation List as well as the Village Pump. --JN466 15:22, 17 June 2010 (UTC)
- Shouldn't we get this translated, at least into a couple major languages, first?
- Absolutely. - Jmabel ! talk 16:11, 17 June 2010 (UTC)
- Is that customary practice for policy proposals in Commons? --JN466 07:54, 18 June 2010 (UTC)
- I've certainly seen it done before; I'm not sure how customary, but looks like good practice to me. - Jmabel ! talk 17:03, 18 June 2010 (UTC)
- It's just under 3,000 words to translate, which is about 12 hours of work for each language. I don't think that is a good investment, especially as the draft will almost certainly change as soon as it is beginning to be discussed. I'd think it would be more efficient to answer questions about any particular passages that foreign-language editors may be unclear about, or provide translations of such specific passages when called upon. --JN466 00:00, 19 June 2010 (UTC)
- I have to agree with JN here - the policy is not yet stable enough to translate. Machine translation will suffice for now. And yes, it should definitely be proposed as a policy. Dcoetzee (talk) 00:07, 19 June 2010 (UTC)
(outdent) We need to go ahead with this as soon as possible. - Stillwaterising (talk) 16:39, 25 June 2010 (UTC)
- Agreed, with WMF's investigator on the prowl we need to have policy in place, soon, not just proposed. I will take the liberty of inviting further comment in multiple locations on Commons and En - and I'll ask others to recruit interested parties on their own local wikis for review. Dcoetzee (talk) 22:52, 25 June 2010 (UTC)
- I've dropped a mail to the foundation list as well. --JN466 11:54, 27 June 2010 (UTC)
- What about a mail on commons-l and otrs-l as well? That'll get to a broad cross-section of people. Gnangarra 13:03, 27 June 2010 (UTC)
- Good idea. Could you send one out? --JN466 13:47, 27 June 2010 (UTC)
'Sexual content uploaded without the consent of the participants'
How do we confirm consent? Apologies if this is asked and answered elsewhere - but I think it's vital to clarify in the proposal / policy page as one of the central changes proposed. Privatemusings (talk) 02:26, 17 June 2010 (UTC)
- The usual OTRS process, I would presume. But, really, it seems most important to me that photographers have proof of consent to provide if challenged. - Jmabel ! talk 16:12, 17 June 2010 (UTC)
- I added text to this effect, to clarify. Dcoetzee (talk) 21:54, 17 June 2010 (UTC)
- I would support the policy requiring evidence (or assertion at a minimum) of consent at time of upload - if we're going to require such assertion or evidence only 'if challenged' then we need to clarify whether or not such a challenge requires any basis (if I ask for confirmation, will 'we have no reason to believe consent wasn't given' or 'looks like they give consent to me' be valid arguments against requiring confirmation or assertion of consent?) - if a challenge requires no basis (which seems much better to me, I feel the onus should be on the uploader) then it's my feeling that it's a better idea to mandate this at the point of upload - I feel we'd be creating a drama engine otherwise. Privatemusings (talk) 05:21, 18 June 2010 (UTC)
- I added a phrase about that. I wonder however about "demonstration" of consent: should an OTRS mail from a hotmail account qualify? As said earlier, we cannot prove that consent is given, unless we are ready to verify signatures on possibly photoshopped consent forms against possibly false ID documents from Anywherestan. I suppose we have to trust the uploader when there is no reason for doubt, given (s)he clearly says consent is given. --LPfi (talk) 05:39, 18 June 2010 (UTC)
- My feeling is that we should not diverge from Commons:Photographs of identifiable people on this point. There are many potentially embarrassing personal photos that can be taken other than explicit demonstrations of sexual anatomy. Reading that policy, I'm really not at all sure what they have in mind, but I don't want to change that status quo, whatever it is, only for sexual content - that would be unnecessarily complicated and confusing. The way I would guessingly interpret that policy is that we take the upload on good faith; but if someone has a complaint they can contact WMF as explained at COM:OFFICE and generally get the photo suppressed, unless it was clearly taken in a public place. Wnt (talk) 22:39, 18 June 2010 (UTC)
- Thought I would throw an example out there that I came across. There was an image uploaded to commons from flickr that showed a girl sucking on her partner's testicles that could have worked for the Wikipedia article on Teabagging. However, the original uploader (to flickr not commons) was a dead account, and the question arose whether it could have just been an ex-boyfriend slapping the creative commons license on so everyone could see her in action. There was no reason to assume the best or the worst, but it was a big enough concern that I did not feel completely comfortable using it after some thought. There was no way I could see to track down whether there was consent or not. Would this be an appropriate instance to require something demonstrating consent? Depending on the answer, does the current wording convey that? Cptnono (talk) 00:32, 26 June 2010 (UTC)
I don't think Privatemusings' queries have been adequately addressed yet. 99of9 (talk) 12:35, 4 July 2010 (UTC)
FPC etc
There is discussion at the FPC talk page with respect to sexually explicit content, specifically File:Futanari.png, which has been nominated for FPC, QIC, VIC and (German) de:WP:KEB. All 4 forums have received the image poorly due to the nature of the content. From that it appears (and this is my personal opinion too) that whilst Commons may host this type of material, it should not showcase it and it should be excluded from these processes. It may be appropriate to write something along those lines into this guideline.--Nilfanion (talk) 08:52, 19 June 2010 (UTC)
- But how will other pictures be handled? I uploaded the image File:Hadako-tan.png a while ago and did not include it in any article, since I was not yet finished with it. But since then it has found its way into many articles (especially Hentai), without me doing anything. Now we could ask the questions: Isn't it valuable? Does it have no quality? Why can't it be promoted? Why are we showcasing war machinery in FP, where people might be dying while the image was taken? Many questions will need an answer. --Niabot (talk) 09:10, 19 June 2010 (UTC)
- The inability of review processes to give due consideration to a sexual image represents a knee-jerk reaction to sexual content by the people who operate them, and is a flaw in those processes that should be corrected. However, that might be a long road. In the meantime, I think people should be free to nominate such images - who knows, maybe one will be good enough to get past the negative initial reaction and get reviewed. Dcoetzee (talk) 10:50, 19 June 2010 (UTC)
- If FPC wants to exclude sexual content, then let it; but I think we're better off letting the individual groups that run these things handle it, instead of writing it in stone here.--Prosfilaes (talk) 16:35, 19 June 2010 (UTC)
- In any event, I'd want to keep these off the front page. This sort of image could get someone in a lot of trouble if it came up on their computer at work, in school, or in a library. This is not the sort of image anyone should stumble upon accidentally. - Jmabel ! talk 17:08, 19 June 2010 (UTC)
- I disagree with this sentiment. If a school, workplace, or library has a bizarre policy allowing a person to read open forums, such as FPC, but then penalizes them if some sexual image pops up, then they are walking around with a wick trailing behind just waiting for someone to light it. It's not reasonable for every site on the internet to censor itself to some unknown standard in order to avert such a hypothetical penalty. If we go down that road, for example, then should we ban the proposal for FPC of swastika-bearing historical images that are banned in Germany? If not, it is a deliberate affront to discriminate against German located viewers; but if we accept this principle, then we end up censoring not only according to the laws of every country where a reader might reside - but according to their unspoken mores and all the things that "might get someone in trouble". This idea is generally at odds with all precedent, which has consistently opposed censorship and games involving hidden images. Wnt (talk) 01:49, 22 June 2010 (UTC)
- Certainly not every image banned along these lines in Germany is inappropriate for the front page, but I would hesitate, for example, to run the famous "Deutschland Erwacht!" poster on the front page even though a good reproduction might merit featured status. - Jmabel ! talk 05:05, 22 June 2010 (UTC)
- It isn't quite that simple. Germany does forbid these symbols by law, but there are exceptions for educational purposes. As long as a symbol isn't used euphemistically and is used in the right context, it poses no problem in Germany either. The same goes for pornographic or brutal illustrations/artwork/photographs/etc. --Niabot (talk) 07:27, 22 June 2010 (UTC)
- We may as well consider that the inability to understand the use of self-censorship is a knee-jerk reaction that should be corrected. In the present case, we believe that the deference and respect due to FPC users should prevail over Wikimedia's abstract 'freedom of content'. FPC has a strong educational component and is supposed (until now, at least) to be 'family safe' and 'work safe'. Yes, I agree that this issue should be handled by the FPC regulars. -- Alvesgaspar (talk) 17:37, 19 June 2010 (UTC)
- One possible compromise is to have an alternate process for featuring high-quality sexual content, with a separate portal where it is displayed. FPC is certainly entitled to use any criteria it likes in its selection - contributors do have plenty of other motivations besides getting featured. Dcoetzee (talk) 23:12, 19 June 2010 (UTC)
- Any idea that involves us segregating "adult content" in any way on Commons -- whether that means having a separate Commons site for adult content, or just separate FPC processes for adult content -- puts us between a rock and a hard place. The advantage is that adult content would be easy to block for those who wish to do so, providing an easy path to making Commons family-safe. The downside is that a separate FPC process for adult content would be viewed as "Wikimedia's porn competition", and that a separate Wikimedia site for adult content would become "Wikimedia's shock site and porn server" and attract a corresponding audience; and it seems likely that the scope for such a separate undertaking would gradually become more inclusive than that presently defined for Commons. --JN466 12:42, 23 June 2010 (UTC)
- My honest thought is this: If we're going to uphold Commons:Commons is not censored, then we should uphold it. There's nothing wrong with allowing basic courtesies, like putting sexual images in such fashion:
[Collapsed-box demonstration: a box whose header reads "Click for image: Image contains sexual content", with the image itself hidden until the reader expands the box.]
- ...But I think they ought to be allowed to go through processes. We can discuss the mainpage issue separately, but forbidding them entirely seems against Commons' principles.
- Also, note that it would not be difficult to work in a hide-the-image option to PotD. Adam Cuerden (talk) 12:52, 23 June 2010 (UTC)
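A minimal wikitext sketch of the idea, assuming the standard "collapsible"/"collapsed" table classes available on Wikimedia wikis; the header text, file name and sizing below are placeholders for illustration, not an existing template:

<!-- illustrative collapsed-box sketch; not an existing template -->
{| class="wikitable collapsible collapsed" style="width:300px;"
! Click for image: contains sexual content
|-
| [[File:Example.jpg|280px|center]]
|}

In principle, a hide-the-image option for PotD could wrap the day's image in the same kind of markup through a template parameter.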
- If you are interested in the option of collapsing images, this was recently discussed (in the context of en:WP) here and here. --JN466 13:43, 23 June 2010 (UTC)
- First, since the image appears to portray minors engaged in sexual activity, it should be speedy deleted, oversighted, and reported to NCMEC. Doing so makes WMF immune to prosecution. Failure to do so is "Failure to Report Sexual Images of Apparent Minors" and will result in a fine of up to $150,000 (for a first offense). Intentional promotion or resale of the image further enhances the crime into a federal misdemeanor. Another legal defense is to claim that the page is under the control of the uploader and the foundation was unaware of it. This is the same defense that could keep WMF from being considered a secondary producer and therefore not responsible for keeping 2257 records. The only other argument, that the image has serious artistic value, may or may not be accepted by a jury; it could still easily be prosecuted and result in large legal costs and enormous negative publicity. This would be accompanied by a loss of donations as well. Under the precautionary principle, we shouldn't be taking such risks. No sane webmaster would allow such an image, why should we? - Stillwaterising (talk) 05:02, 24 June 2010 (UTC)
- I've said it multiple times to you. You are not a lawyer. Your interpretation is not more valid than that of another user. Your crusade against "porn" is becoming quite laughable. Kameraad Pjotr 09:28, 24 June 2010 (UTC)
- I have no idea how on Earth a person is supposed to decide that Futanari.png involves "apparent minors". They have breasts... and penises... beyond that, what can you measure? Maybe you think their hairstyles follow a traditional "schoolgirl" fashion, but they're following a Japanese-ish style, and what do I know of how women wear their hair in Japan? And am I supposed to accept that women can't wear their hair in a schoolgirl fashion if they feel like it? It's all very silly.
- What I can say for sure is that even a very brief and somewhat nervous journey through the /b/ section of the widely known Web site w:4chan.org will lead to encounters with things that seem a lot more at risk of stepping over this and several other blurry lines than anything here. I'm a bit surprised that they haven't run into trouble, since they have a large number and controversial assortment of unmoderated "porn" uploads directly visible to readers. But fortunately they haven't, and the fact of their existence provides some evidence that Wikimedia Commons is nowhere near the boundary of what can go on in the U.S. Wnt (talk) 19:49, 24 June 2010 (UTC)
- 4chan is a notice board. The owners are not responsible for what users post. They ARE, however, responsible for removing all content that has been reported as child porn and for reporting it to NCMEC, as are we. Failure to report carries a fine not to exceed $150,000 for a first offense and $300,000 for subsequent offences. Recently, Tagged.com has been sued for violating this provision. - Stillwaterising (talk) 12:28, 25 June 2010 (UTC)
- Your statement bears no resemblance to the law as it is written or as it is interpreted by the courts. As such, it should be ignored as irrelevant. --Carnildo (talk) 20:03, 25 June 2010 (UTC)
- This edit shows that this user has no knowledge at all, but speaks as if he were the final authority. ;-) --Niabot (talk) 21:14, 25 June 2010 (UTC)
- Carnildo, child porn law is a three-way interaction between Congress (legislative), law enforcement (executive) and the courts (judicial branch). This is complicated by the fact that the DOJ (executive branch) has been granted authority by Congress to write statutes as they see fit, like 2257A. Claiming that a user is delusional without solid proof is a typical troll tactic. Where's your proof? - Stillwaterising (talk) 22:21, 25 June 2010 (UTC)
- You can't prove it, we can't prove it, as usual. But as far as I know, any court has a better understanding of this case than you have shown us so far. You always cry that anything is child porn. But as you can see in your own references (real children, 5 years old, ...), it is far away from anything that is currently available on Commons and is used in a very different context. --Niabot (talk) 22:54, 25 June 2010 (UTC)
- Regarding one of your recent edits: what made you so sure that File:Anime Girl.png had to be categorized as lolicon? Hundreds, if not thousands, of people have seen this image, and none of them ever felt the need to mark it as child porn or lolicon. Why would you do it? Any good reason? --Niabot (talk) 23:01, 25 June 2010 (UTC)
- The Huffington Post is a pretty darn unreliable source, and you'll note that besides the header, even they don't say that Tagged.com was sued for violating this provision; the article says "The office notified Tagged.com that it would sue the site if the problems were not resolved within 5 days". This after finding explicit photos of children under 5 engaged in sexual acts.--Prosfilaes (talk) 23:04, 25 June 2010 (UTC)
- This thread has wildly diverged from its original topic (which was not all that important in the first place). Let's not fight - our focus here is to come together with a policy we all agree on that will help direct our inclusion and handling of sexual content, and it's important that we do so soon with the Board starting to take action of its own accord. I've solicited opinions in several locations and we'll hopefully get some fresh feedback soon. Dcoetzee (talk) 23:51, 25 June 2010 (UTC)
A tag for 2257?
The present understanding is that, according to Mike Godwin, WMF has no obligation to comply with 2257. As usual, Mike's job is just to look out for WMF's interests, not those of our content reusers. We want to be careful to warn commercial content reusers about their possible 2257 obligations as a secondary producer, particularly in the case where we have no 2257 documentation on file. This could be done with a tag, similar to the {{Personality}} tag, called perhaps {{2257}}. What do you guys think? Dcoetzee (talk) 11:00, 19 June 2010 (UTC)
- Sounds like a good idea to me. Likely it won't end up everywhere it should (just like {{Personality}}), but anything's a start. VernoWhitney (talk) 13:55, 19 June 2010 (UTC)
- Ditto. My only concern would be that there might be liability issues if some images got it and others that needed it didn't, (and hopefully it's an invalid concern!) but that's a secondary consideration. Someone want to take a shot at the template? ++Lar: t/c 01:00, 23 June 2010 (UTC)
- I've now created {{2257}}. Please review it. If you guys are okay with it, we should probably start applying it to images right away. Dcoetzee (talk) 22:00, 24 June 2010 (UTC)
- I like it. Well done. We might perhaps consider adding something saying that we do not guarantee (nor have a legal obligation to ensure) that the original uploader is able or available to provide these records, but that reusers are always welcome to ask uploaders. I don't know, it's a bit technical I guess. Perhaps just link to a page that explains the process for reusers in more detail? TheDJ (talk) 00:37, 25 June 2010 (UTC)
- Nice work. One nit... change "actual human beings" to "one or more actual human beings" as I don't think there is a requirement that 2 people be in the picture, is there? Thoughts? ++Lar: t/c 01:58, 25 June 2010 (UTC)
- One of the quirks of the English language is that it is correct to use the plural form when the number of subjects is unknown. --Carnildo (talk) 09:07, 25 June 2010 (UTC)
- It's a good start. I had advocated a tag be created in the April 2010 proposal. I think another tag should be made for images created before Nov 1, 1990, and another, similar to Template:Creator, made to contain record-holder information (the name and address of the record holder) to supplement the 2257 tag. Will the plural of "human beings" cause confusion when translated? Should it be changed to "human being(s)" instead? - Stillwaterising (talk) 12:07, 25 June 2010 (UTC)
- I've identified 70 images so far that this tag applies to, shown at Category:Child Protection and Obscenity Enforcement Act warning. Feel free to double-check me on these. @Stillwaterising, I do like the idea of having a (optional) tag containing contact information for the recordholder - this will save anyone interested in reusing these images a lot of trouble tracking down the records. Might possibly make it a parameter of the {{2257}} tag. I don't see the need however for a tag for images created before Nov 1, 1990; to my knowledge there are no legal restrictions on distribution of these images in the US, am I missing something? Dcoetzee (talk) 15:11, 25 June 2010 (UTC)
- Is there a need to have a category for this? It basically creates a huge gallery of sexually explicit images, which in other contexts people have complained about, and the "What links here" for the template would seem to do the same thing. Wnt (talk) 15:19, 25 June 2010 (UTC)
- It is not necessary; but it is conventional for warning templates to have associated categories. As you say, the "what links here" does the same thing, so it doesn't really hide anything to omit the category. It is worth noting that a large number of these images have showed up at Requests for undeletion. Dcoetzee (talk) 15:23, 25 June 2010 (UTC)
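For illustration only, a minimal sketch of how a warning tag of this kind could be put together in wikitext, combining the warning text, the optional record-holder parameter suggested above, and the maintenance category; the wording and the parameter name "recordholder" are hypothetical and are not the actual contents of {{2257}}:

<!-- illustrative sketch only, not the real Template:2257 -->
<div style="border:1px solid #aaa; background:#f9f9f9; padding:0.5em;">
'''18 U.S.C. § 2257 notice:''' This file depicts one or more actual human beings engaged in sexually explicit conduct. Commercial redistributors in the United States may have record-keeping obligations as "secondary producers"; the Wikimedia Foundation does not hold such records.
{{#if: {{{recordholder|}}} | Age and identity records for this work are reported to be held by: {{{recordholder}}}. }}
</div><includeonly>[[Category:Child Protection and Obscenity Enforcement Act warning]]</includeonly>

A file page would then transclude it as {{2257}}, or as {{2257|recordholder=contact details}} where the uploader wants to point reusers at the records.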
- The pre-2257 tag would say that this image was proven to have been created before Nov 1, 1990 and is therefore exempt from 2257 record-keeping requirements. The best solution to the problems related to 2257, as suggested to me in an in-depth conversation with notable attorney Jeffrey Davis (he represented Max Hardcore and JM Productions in their obscenity trials), was to either prohibit all explicit images or only allow explicit images made before the 1990 date. The community seems to strongly reject outright prohibition, however the latter seems like a reasonable compromise. - Stillwaterising (talk) 16:15, 25 June 2010 (UTC)
- [Fixed a couple typos in your reply] The community also strongly rejects the idea of rejecting all explicit images created after the 1990 date. I'm still pondering whether a tag indicating an explicit image older than the 1990 date would be useful - it would be a bit like the Freedom of panorama or de minimis templates, indicating a lack of a restriction where one ordinarily would be found. Maybe it could be called {{No-2257}} or {{Not-2257}}? Dcoetzee (talk) 17:45, 25 June 2010 (UTC)
- It serves a useful function. The potential reuser can see the tag and know the image is safe for reuse. 2257 still applies, and the reuser would need to include a statement that the image is older than the provision and provide proof if inspected by an assistant attorney general. I think {{pre-2257}} or {{pre2257}} would be appropriate. - Stillwaterising (talk) 22:33, 25 June 2010 (UTC)
Board directs WMF to conduct a study about objectionable content
Moved to Commons:Village pump#Board_directs_WMF_to_conduct_a_study_about_objectionable_content
Fisting, urination, and vomiting
Should there be a line clarifying that depictions of "Fisting, urination, and vomiting" are not necessarily obscene? I know this sounds silly, but sometimes those requesting deletion latch on to particular phrases while ignoring the rest of the passage. For example, this is not "hardcore" according to my understanding of the term (I believe masturbation is softcore but am not sure). The creator chose not to portray two people (maybe an attempt to soften it?); see w:User talk:Seedfeeder#Fisting illustration. Also, some people find images extremely useful in learning, depending on their style, which might be considered "some redeeming social value". It still may not be acceptable, but "not necessarily" would be a nice addition to a clarifying line, since I assume there will be cases decided on a case-by-case basis. Cptnono (talk) 23:52, 25 June 2010 (UTC)
- Sure, I'd agree with that. Something to the effect of "this should not be taken to indicate that any depiction of these topics would be considered obscene." Dcoetzee (talk) 00:08, 26 June 2010 (UTC)
Reformatting
I've made some minor reformatting changes that I hope are noncontroversial - mostly involving cleaning up the messy section on "other legal considerations" and moving the notice about contacting the WMF for permanent deletion higher up. Let me know if there are any issues with this. Dcoetzee (talk) 00:10, 26 June 2010 (UTC)
speedy deletion criteria
Sexual content may generally be speedy deleted only if it is clear that it does not fall into at least one of the following classifications and is not excluded by section above
- 6. The specific content or the content's creator is notable.
A piece that has failed the Miller test isn't excluded by the "section above" wording, as obscene works aren't covered until the end of the page. I think this criterion needs a qualifier, because one could argue that the author of a work which has been classed as obscene under the Miller test meets notability. One could also argue that the work itself is a notable piece, and then at the very least we end up with two deletion discussions while the work is still accessible. Alternatively, failing a Miller test needs to be included in the Prohibited Content section. Gnangarra 05:03, 26 June 2010 (UTC)
- Well, yes, if a work has been ruled obscene, then it could well be notable enough for us to keep a copy around. Obscenity is neither a simple yes-or-no binary nor an unchangeable status: if an artwork ruled "obscene" has a Wikipedia article, then it is possible that it's no longer obscene under the Miller test. That's why we need to have a deletion discussion about it. --Carnildo (talk) 08:44, 26 June 2010 (UTC)
- The thing is, if a piece has been found to be in violation of [US] federal obscenity law, we would have to delete it as soon as we became aware that it violates US federal law. Gnangarra 15:05, 26 June 2010 (UTC)
- Comment I tried to make it more clear that we can always speedy if copyvio or illegal. In addition to that, a speedy may be accepted if the file is blatantly out of scope. --MGA73 (talk) 16:38, 26 June 2010 (UTC)
- The material might have subsequently become of literary, artistic, political, or scientific value. (As I understand it, this exemption seems pretty clear in the US law.) DGG (talk) 19:37, 27 June 2010 (UTC)
Is it really a good idea to send documents to OTRS?
I see now that the suggestion to file documents has grown to: "Although not required, it is good practice to forward age and identity documentation for sexual content, date of photography, and statement(s) of consent by the person(s) depicted, to our mail response team (permissions-commons-at-wikimedia.org) to help avoid any later problems.[16]"
What worries me is that OTRS is not exactly Fort Knox - information like this could get out, especially in some controversial case or if the model is just very attractive. We could end up with privacy and so-called "identity theft" (fraud) issues thrown into the mix. And if the information isn't actually public, how much use (if any) is it to a would-be commercial reuser? Should Wikimedia Commons be going through such trouble and risk just to make it easier for an American website to reuse our educational sexual content in a plain porn collection?
I think we should keep our hands off this information altogether. Perhaps say something more like: "Although not required, it is a good practice for American photographers who would like to encourage the commercial reuse of their work in the United States to provide a link or contact information where reusers can obtain age and identity documentation for sexual content, date of photography, and statement(s) of consent by the person(s) depicted."
Wnt (talk) 17:48, 26 June 2010 (UTC)
- One rule fits all contributors: anyone who publishes can be investigated by US authorities, who share information with other jurisdictions. The photographer is required to hold this information and make it available to the appropriate authorities as and when requested. What would be better is to require the photographer to identify themselves to the Foundation, like we do for arbitrators, OTRS agents, checkusers etc. Suggest something like
- Uploaders are advised to familiarise themselves with their legal obligations before uploading sexual content. Commons is under no obligation to keep records on the age and identity of models shown in media depicting sexually explicit conduct.[15] However, editors who have produced such media may have record-keeping obligations.[14] It is recommended that uploaders of the content identify themselves to the Foundation by email. This recommendation applies only to works made after November 1, 1990 that depict actual human beings engaged in sexually explicit conduct; documentation is unnecessary for illustrations or for old photographs proven to be produced before this date. With respect to "depictions of actual sexually explicit conduct consisting of only lascivious exhibition or depictions of simulated sexually explicit conduct," 18 U.S.C. § 2257A record keeping regulations apply only to works originally produced after March 18, 2009.[17]
- That keeps the whole paragraph in the same language, retains the privacy of the model, but clearly places the onus on the photographer to keep such records. Where a reuser needs to have the information, they would contact the Foundation, who can then refer them to the uploader. Gnangarra 01:57, 27 June 2010 (UTC)
- Correct me if I'm wrong, but I don't think that a foreign porn website is in any way required to keep 2257 records even though Americans view it. If such a photographer then uploaded material to Commons, it would simply be a copy of a foreign photograph, and there still should be no expectation that such a record exists. And even an American photographer... well... I'm not sure that I can "recommend" for him to file an e-mail address if the purpose is to help send a crew of people to his door to inspect his records... Wnt (talk) 16:51, 27 June 2010 (UTC)
In my opinion the whole section is useless, because I have serious doubts that anyone is actually ever gonna do this. Making requests that no one is gonna listen to is a waste of time, I think. TheDJ (talk) 18:05, 27 June 2010 (UTC)
- I've made a fairly drastic change here as a counter-proposal, which reflects my opinion that we're best off not touching these documents. We're not (we hope) required to keep them, we're not (by intent at least) a pornography site, and the producers (if they made such records and if they are actually subject to this) are required to keep them available for inspection anyway. Wnt (talk) 02:53, 28 June 2010 (UTC)
- Good work Wnt, I think this is much more useful. TheDJ (talk) 15:03, 28 June 2010 (UTC)
- Despite some initial surprise, I think this is a positive change. We don't want or need to manage these records ourselves - we just need to get content reusers in contact with record keepers. Dcoetzee (talk) 15:17, 28 June 2010 (UTC)
- Exactly. I understand why people were considering it, but the nature of privacy in OTRS (it ain't a vault) and the simple impracticality of doing this make it unrealistic. The other argument was "what if we change this in the future". Well, I guess then we are screwed and will lose a lot of content, but so be it, because I doubt it would have worked, so we would lose most of that content anyways. TheDJ (talk) 16:02, 28 June 2010 (UTC)
- One serious concern I have is that if the contributor refuses to provide records, the effect is an implicit noncommercial license in the US, since their contributions become unusable in the US for commercial purposes. Less sinisterly, some contributors are bound to vanish and become impossible to contact. The question I'd ask in this case is, should media with 2257 requirements whose contributor will not or cannot provide records be deleted? Dcoetzee (talk) 18:48, 28 June 2010 (UTC)
- The same is true of most identifiable pictures subject to "personality rights". I think that all that WMF can do is try to get the copyright obstacle out of the way by having free copyright licenses - and unfortunately, we should know that as copyright becomes less effective as a means for controlling who gets to contribute to the public discourse of nations, those in positions of greater power will inevitably invent many other mechanisms like this to reassert their control. We can only do what we can when we can. Wnt (talk) 21:23, 28 June 2010 (UTC)
- And almost all trademarked work. In general, the reuse requirement that we have has only been applied in relation to copyright restrictions, not to restrictions from other legal limitations. Defamation is also not legal in some contexts. We have to draw a line somewhere. We have chosen to draw it at copyright, and our content warning templates are for cases where other problems might exist. TheDJ (talk) 21:35, 28 June 2010 (UTC)
Artwork?
Forgive me if this has been discussed - I was directed here from the Village Pump at en:WP. The page says that a file should not be speedied if "The material is an artwork, including, but not limited to, paintings, engravings, etchings, lithographs, needlework, and sculpture." The question of determining what is and is not a work of art is pretty much unanswerable. Our policies and guidelines should not require editors to agree on what is or is not an artwork. "Art" certainly includes photographs and films, which seem to be the focus of this policy.
Perhaps what is meant is something like "The material is a historical artwork", which would (somewhat) prevent people from using the "art" label as a way around the policy. Staecker (talk) 14:03, 27 June 2010 (UTC)
- The intent was that that should cover everything that wasn't a photograph of a real person, I believe. Exact replication by means of silver emulsion and CCDs can be speedied, but things painted, sewed, or hacked out of stone can't.--Prosfilaes (talk) 14:42, 27 June 2010 (UTC)
- Then it should just say "The material is not a photograph". Using the word "artwork" to mean non-photograph is confusing and inaccurate. I still think it's a bogus distinction, though. What about photo-realistic CG or other media? What about digitally manipulated photographs? There can be no meaningful distinction between photographs and other types of visual media. Staecker (talk) 17:09, 27 June 2010 (UTC)
- The non-photograph thing was certainly not my interpretation - there are plenty of sexual artistic photographs on Commons, and not all of them are even historical. A notable one is File:Horny (nude by Peter Klashorst).jpg. There is the concern that virtually any work can be considered artwork, but precisely the same problem exists in law (e.g. for obscenity law, determining whether a work is art is part of the Miller test). I think this is one of those "just use your best judgement" things. Dcoetzee (talk) 20:18, 27 June 2010 (UTC)
- Fully agree with you on that, Dcoetzee. The important thing is that we don't speedy it. It still might be deleted in the end, of course. Also note that some random editor saying "art" does not make it so; notable references do. TheDJ (talk) 15:58, 28 June 2010 (UTC)
- If it doesn't mean non-photograph, then why enumerate the mediums here, "paintings, engravings", etc. especially when you're leaving out what is likely the most common medium on Commons, photography?--Prosfilaes (talk) 16:12, 28 June 2010 (UTC)
- Because in the area of sexual artwork, the large majority is presently non-photographic, so it simply wasn't mentioned. I'll add it just to be clear. Dcoetzee (talk) 18:50, 28 June 2010 (UTC)
Audio?
Should the policy mention audio files somewhere? They can certainly be inappropriate, but the Definition section only deals with media "depicting" various things, which is a word that doesn't apply very well to audio. Staecker (talk) 14:05, 27 June 2010 (UTC)
- I don't see why it should. I've never heard of a court case covering audio, and USC 2257 specifically mentions visual representations. Not only that, I don't recall a single inappropriate audio file being uploaded to Commons, and I don't like making rules for hypothetical situations. An audio file can be deleted for being out of scope just like any other work; I don't see why we need special rules for it.--Prosfilaes (talk) 14:49, 27 June 2010 (UTC)
- Surely you've heard of 2 Live Crew? I'm not sure that's the kind of court case you're looking for. In any case I agree about not making rules where no problems exist. Staecker (talk) 17:10, 27 June 2010 (UTC)
- 2 Live Crew got prosecuted (and exonerated) under obscenity law, which we do cover - I'll add a small note that obscenity applies to all types of media. Dcoetzee (talk) 20:13, 27 June 2010 (UTC)
Blatantly outside of scope?
What does "blatantly outside of scope" mean? It is defined nowhere on the page, yet it is being used as a criterion that is enough to speedy delete an image. And yes, I do think that things that are blatantly outside of scope should be deleted, but apparently the scope that is thought of here is much narrower than what I would call our scope. Because why else would we need all those exceptions? To me, anything falling under those exceptions would be in scope, or at most borderline outside scope. But here they are as exceptions to the rule that something blatantly out of scope can be speedily deleted. If artworks, material of scientific value, things realistically useful for educational purpose, things actually used for educational purposes on a project or things that are of themselves notable all can be blatantly out of scope, then I do not agree with speedy deleting other things that are blatantly outside of scope, because then there are undoubtedly things blatantly outside of scope that would be blatantly in scope in my definition. - Andre Engels (talk) 16:07, 27 June 2010 (UTC)
- The language here requires clarification. Those things are not "exceptions" so much as they are "a list of things that are in scope that some people seem to forget are in scope when the media in question is sexual in nature." I took a stab at it, let me know what you think. Dcoetzee (talk) 15:39, 28 June 2010 (UTC)
- I still think this is too narrow - it now says that anything not in these categories can be deleted for being out of scope. In my opinion, any image that could reasonably be used to illustrate a Wikipedia article or Wikibooks book should be considered in scope, although it can still be deleted if there are other alternatives which are at least as good. At the moment these are only excluded from scope considerations if they are actually in use. What I mean is this: a low-quality picture of a copulating couple, which many people would consider obscene, is one I would like to save - and thus definitely not make eligible for speedy deletion - if it happens to be our only photograph of people having sex doggy style. If on the other hand we already have high-quality, less objectionable imagery depicting in essence the same scene, I have no problem with removing it. - Andre Engels (talk) 06:42, 3 July 2010 (UTC)
Good work, policy-vs-advice, and content-neutral scope
[edit]Some excellent work here, thanks to all who've tried to sort the mess out.
Most of what's here is covered by existing policy and is thus non-controversial--
- "No images that are illegal in FL, US" derives from our existing "legal content only" policies. No child porn, no copyrighted material, etc. Totally non-controversial.
- "No non-notable images of people taken in private who haven't consented and have asked us to take it down" derives from our existing privacy policies. Non-controversial.
The potentially controversial parts are the 'legal advice' sections and the 'scope' sections. My thinking:
- If we want to provide some legal advice to our editors, cool-- but we shouldn't do so on a policy page. Making our own home-grown legal advice on the policy page blurs the line between law and policy. Our POLICY is straightforward: 'follow the law'. Our legal summaries, opinions & advice are neither law nor policy. Ultimately only OFFICE can competently make those calls. In the unlikely case that an otherwise-legal image has an Obscenity/Miller/2257 concern, just let OFFICE deal with it.
- Mixing concerns of 'scope' and concerns of sexuality is controversial. After all, if we're truly not censored, then the taboo nature of the content of the image is irrelevant. A useless picture is a useless picture, whether it's sexualized or not.
Lastly, I think in the future, nearly all legal free-license images will be considered sufficiently educational as to be in scope. Right now, we aren't Flickr because that wouldn't be a wise use of our limited computing resources.
But in time, image space will be treated much like we treat space for text-- essentially unlimited. Currently, we have lots of text on talk and elsewhere that may be of very limited cultural or educational value-- but in practice, we don't bother to stress over it. So it should be, one day, with images, once the bytes have gotten sufficiently inexpensive.
--Alecmconroy (talk) 14:47, 28 June 2010 (UTC)
- I think it's important to discuss relevant legal restrictions in the policy because I've seen them more than once used as justifications for deletion, e.g. "delete because we don't have 2257 records" or "delete because obscene", and making clear our stance on how we deal with these issues is important (for example, we aren't a 2257 secondary producer, and obscenity is too subjective for speedy deletion). It would get annoying to have to continually refute these kinds of claims. This isn't so much legal advice as "how do these laws affect content inclusion and processes on Commons?"
- Scope is also important to discuss, not because I think scope is more important as applied to sexual content, but because it is frequently applied too broadly to sexual content, and it's important to clarify that, e.g., images in use aren't excluded from scope just because they're sexual, and historic art isn't out of scope. This should be obvious, but apparently not. It's also just a discussion of a relevant policy that's frequently applied to deletion of sexual content in practice - documenting practice is useful. I do agree with you that the scope of Commons is expanding over time.
- That said I would support clarifying these points somehow in the text. Dcoetzee (talk) 15:13, 28 June 2010 (UTC)
- edit conflict
- legal advice: I have pondered whether this part should perhaps be split onto a separate page, but I think that would make the information harder to find, and it really is difficult to find all of that on your own. I think we should simply keep it very isolated from the rest of the content, perhaps even more so than we do now. TheDJ (talk) 15:22, 28 June 2010 (UTC)
- The big problem is that by splitting up scope, morals and law, you lose the overview. Many of the problems we have had in the past stem from telling people "don't upload stuff that is illegal, don't upload stuff that is out of scope", while the finer points of that in the context of sexual content were too complicated for people to work out themselves. Much of this entire 'policy' is already policy or practice now, I think; as such you could argue we don't need it as a policy in the first place, and that argument has been raised several times. I think, however, that whatever 'mark' this page is bestowed with is not really relevant. Sometimes targeted guidelines for a single subject are simply needed. If some folks think that doesn't deserve to be called a policy then that is fine with me as well; we can call it a guideline or whatever, as long as almost all of us agree that the information is good and proper. TheDJ (talk) 15:22, 28 June 2010 (UTC)
- Good point, OP, about info vs. policy. This had been discussed before, and I restored some of the older structure along with the info tag. Please leave it as-is for a few days, or until a consensus forms as to whether we want to split the page; I fully support doing so. Whether we keep or delete seemingly useless files does of course have a practical bearing on server resources, but there also seems to be a kind of pathological hoarding on Commons, not allowed on Wikipedia, that should be limited. Robert Pirsig's discussions on the Metaphysics of Quality come to mind. Do we want a quality (well organized, useful, relevant, etc.) collection of freely-licensed images, or do we want to simply have the largest collection possible? - Stillwaterising (talk) 19:05, 28 June 2010 (UTC)
- We have one video of fellatio. One. File:Cynopterus sphinx fellatio.ogg. As you can see, it's of bats. The current collection of sexually explicit material is massively, grossly insufficient to support one good Wikibook on sex, the equivalent of w:The Joy of Sex. I find the constant ranting on "hoarding" by those that attack only sexually explicit material to be practically dishonest.--Prosfilaes (talk) 20:32, 28 June 2010 (UTC)
Legal Considerations section
[edit]I removed the section heading "other legal considerations" (leaving the content) and the tag proposing to split it onto a subpage; Stillwaterising restored it as "Legal Considerations" with the proposed split tag. I don't like this section header because: 1. earlier in the policy we discuss legal considerations (child pornography) so they're not all contained in this section; 2. making a section for "other" legal considerations just seems like bad style. I'm not sure, but I think we also have consensus against a split - a couple people like this idea but I think it's best to have all relevant concerns on one page. Additionally, these sections contain normative policy statements about the {{2257}} tag and speedy deletion of obscene images, among other things, so they're not just "advice." Even if we do decide to make these sections non-normative, I still think they should be described as such, but kept on this page, in one place - putting them on a subpage is a great way to make sure nobody ever reads them. I'm proposing removing this section header and proposed split tag again. Alternatively, I would be willing to settle for replacing the split proposal tag with a notice that "the statements in this section are non-normative." Dcoetzee (talk) 19:02, 28 June 2010 (UTC)
- I do think the information should be split but I don't care what the subpage is named. - Stillwaterising (talk) 20:09, 28 June 2010 (UTC)
- I really don't like having policy subpages. It's too easy to miss them, and people end up arguing in ignorance of previous decisions and wasting a whole lot of time and effort. Also, the two components of this section are fundamentally different - there's the 2257 part, which amounts to putting a template and maybe a link in some pictures, and the obscenity bit, which comments on the criteria for prohibited material at the beginning. They have absolutely nothing to do with each other. Wnt (talk) 22:04, 28 June 2010 (UTC)
- I don't like subpages either. It's not helping anyone in my opinion. TheDJ (talk) 22:04, 28 June 2010 (UTC)
- Here's another reason to split: easier to translate. Translation can focus on the policy section and the information section can be autotranslated. I don't think it will be too difficult to find with a "for further information link" and a short summary. I think the split should happen soon, before voting begins. - Stillwaterising (talk) 22:57, 28 June 2010 (UTC)
- I still don't like referring to this as a "further information" section. It's not advice and recommendations, it describes actions that we expect users to follow. In any case, do we really not have the ability to translate some sections and not others? Dcoetzee (talk) 00:15, 29 June 2010 (UTC)
Section: Disagreement on borderline cases
[edit]I don't support the requirement to immediately list all sexual content speedy deletions at Commons:Undeletion requests. Admins should be free to make decisions based on policy without justifying such decisions every single time they do it. Why do we elect people to be admins if we don't trust them with the tools? Kaldari (talk) 21:45, 28 June 2010 (UTC)
- The intention here seems to be that the rare speedy deletions against policy (presumably due to urgent legal risk) should be brought to the notice of the community for further consideration. I have no idea how to convey this accurately. Dcoetzee (talk) 22:53, 28 June 2010 (UTC)
- JayWalsh said here that if we "believe material is illegal, it will normally be immediately deleted." I think this policy should reflect our public policy and not vice versa. - Stillwaterising (talk) 23:10, 28 June 2010 (UTC)
- If that is the intention of that section, it is completely unclear. Is that section only talking about legal issues or both scope and legal issues? Kaldari (talk) 23:41, 28 June 2010 (UTC)
- It seems to be describing an "emergency" process for legal issues only. Yes, the intent is completely unclear. I'd personally support removing this section altogether, as I don't really believe there are laws which will cause us to face severe penalties if we don't respond within seconds. If we do keep it we might rename it "emergency deletion" or some such thing. Dcoetzee (talk) 00:17, 29 June 2010 (UTC)
- I renamed the thread because I was unclear about what was being discussed initially. Agreed that the section is poorly written. I understand the intent and agree with it. I think it will either get defined now or come up again later. I propose "urgent deletion". - Stillwaterising (talk) 03:05, 29 June 2010 (UTC)
- I've rearranged the content in the speedy delete section so that related information is presented together. This should make it a lot easier to parse what is being talked about. Kaldari (talk) 20:08, 29 June 2010 (UTC)
- I'm okay with this editing. Dcoetzee (talk) 02:45, 30 June 2010 (UTC)
Again about the "age of 25"
[edit]First sorry for my english.
The proposed policy say "Uploaders of images which feature participants who appear to be under the age of 25 will be required to clarify, and possibly evidence, the consent and the ages of all involved." IMO this can cause some problem and inconsistencies with the other text:
- We say that Commons is not obligated to keep (and verify via OTRS) records about the identity and age of the people that appear on the photos. but so how the uploader can prove that the subject isn't a minor (we are taking of "age of 25", so isn't the case of pictures with clearly child, but with 16/17 years old people that can seem more older)? If it is enough that the uploader say "belive me", and we then Assume good faith, why we can't simple assume if from the beginning (from the upload) and don't dispute the age of the subject? and if we have doubt (maybe the user don't seem to be reliable) him words would be enough?
- If we ask for Identity document of the model(s) to prove that they are real 18-25+ years old peoples, we must process and verify it, and we must limit the upload of this kind of pictures from nation where the photographer is obliged to keep these record (and can freely share them without violating the law, IE in European nations the photographer must verify the age, but many privacy law prohibiting him from spreading these personal data to third parties).
- Why 25? there are peoples of 30 years old that clearly aren't (and don't' appear) minor, but can "appear to be under the age of 25". We must ask also for these pictures? How far can go the limit for this age?
- If the upload isn't longer active on wikipedia/take the pictures from an other source (ie flickr), and don't'/can't answer what's happen? Speedy deletion or normal deletion?
If the previus sentence, with the "not all participants appear to be at least 18 years of age" is simple and can help, the period with the "age of 25" IMHO put only further confusion in this policy.--Yoggysot (talk) 02:28, 29 June 2010 (UTC)
- The original idea was to introduce a degree of conservatism that would allow us to more reliably (or more frequently) identify and exclude photos of teenagers who appear to be 18 or older. The cutoff was chosen arbitrarily (based on the policy some bars use). I'm not sure how to do this in a reliable manner, or if we need to. Dcoetzee (talk) 03:23, 29 June 2010 (UTC)
- The reason for the inconsistencies is that this document has many authors and we don't agree on this issue. It is very difficult to tell what age a young adult is from a photograph with beyond-a-reasonable-doubt accuracy, due to the large variations in human anatomy and physiology. Without implementing strict 2257 policies, I think there could be a system where, if two or more established contributors think the subject looks too young, the image is speedily deleted without bias, then reported to an oversighter. The only way the image could come back into play is if the legal "department" says it's ok and restores the image. - Stillwaterising (talk) 13:23, 29 June 2010 (UTC)
- Well here's a suggestion based on current practice. If someone believes a subject may be underage, they can nominate it for deletion for that reason; persons in the deletion discussion can gather evidence based on the image source and their own subjective impressions, and the result is defined by consensus. Dcoetzee (talk) 14:01, 29 June 2010 (UTC)
- The problem with a "two established contributors" standard is that it lets anybody purge Commons' entire collection of nude images: spend six months getting a pair of sockpuppets set up, then raise doubts about the age of the subjects. --Carnildo (talk) 21:54, 29 June 2010 (UTC)
- No need for socks. You only need two editors who agree that nude pics are all wrong. --Enric Naval (talk) 09:08, 30 June 2010 (UTC)
- I've revised the contentious sentence to instead read: "users in the deletion discussion will gather evidence of the subject's age and conservatively evaluate whether there is a significant risk that a minor is depicted." I think this matches current practice pretty well and is fairly safe. Dcoetzee (talk) 13:58, 30 June 2010 (UTC)
Mention in Signpost
[edit]This proposal got a brief mention yesterday in the Signpost story "Objectionable material". Maybe will help get more eyes reviewing it. Dcoetzee (talk) 02:48, 30 June 2010 (UTC)
As if there wasn't enough controversy....
[edit]- Moved to COM:AN/B. -mattbuck (Talk) 12:02, 2 July 2010 (UTC)
consent again....
[edit]I've changed the paragraph to require an assertion of consent of participants at the time of upload - to my mind this is a lot clearer, has advantages per Ottava above (possibly requiring an editor to break the law in order to break our rules), and is far, far easier to manage / apply. Privatemusings (talk) 00:44, 6 July 2010 (UTC)
- further - the paragraph currently states that 'Such content is subject to Commons:Photographs of identifiable people...' - is this intended to mean that sexual content is subject to the measures in that guideline (including, for example, the 'moral issues' mentioned) regardless of whether or not the subjects are identifiable (I would support this) - in which case it could be more clearly put. Otherwise, the sentence is of course a wee bit redundant, and possibly a bit misleading (not all sexual content would be subject to the guidelines - only sexual content where participants are identifiable, which in the past has meant the full face is pictured clearly with no obstructions) - I would support adding at least an 'if the participants' faces are pictured, then they are covered under....' etc. Privatemusings (talk) 02:01, 6 July 2010 (UTC)
- I've revised this again to read as follows - I think this is a strong statement, while still addressing the concerns of people who are worried about sexual images being spuriously deleted merely because the uploader forgot to include a consent statement:
- "For media containing identifiable persons, consent must be asserted at the time of upload in the image description, and if this consent statement is questioned in a deletion request, additional evidence may be sought. Sexual content without consent statements may be marked with the {{Noconsent}} tag and deleted after a period of three days (existing images uploaded before July 2010 are excluded)."
- Comments? Dcoetzee (talk) 05:49, 6 July 2010 (UTC)
- I like it - except I believe we need to drop the 'for media containing identifiable persons' bit - all sexual content should be subject to this clause in my view. Privatemusings (talk) 06:22, 6 July 2010 (UTC)
- Hrm, I added that part due to your above statement - maybe I misunderstood it? Dcoetzee (talk) 07:26, 6 July 2010 (UTC)
- I agree with this. I have tweaked it slightly: from
- "for sexual content, consent must be asserted at the time of upload ..."
- to
- "for sexual content, consent to the upload by the persons shown must be asserted at the time of upload ..."
- This is just to make clear that people must have consented to the upload, rather than to having their picture taken.
- I think identifiability is immaterial here -- if the person is not identifiable, consent should be easier for the uploader to obtain, and we have to bear in mind that personal names are sometimes included in the filename or description, and that the uploader obviously knows who the person is and may disseminate that information, in breach of personality rights. --JN466 09:38, 6 July 2010 (UTC)
- Thinking about it, I think this means that Flickr transfers will be out. Are you all okay with this? --JN466 09:44, 6 July 2010 (UTC)
- Not entirely, for instance, it is quite clear that Suicide Girl models are consenting adults, aware of the publication of their photographs. TheDJ (talk) 10:11, 6 July 2010 (UTC)
- If someone (you?) is willing to state that they *have* consented to this image being uploaded, then you can note it on the images, and they will be retained (if you can then back it up a DR if required). This is a clear test of how sure we actually are. --99of9 (talk) 10:19, 6 July 2010 (UTC)
- Yes, if it is a named outfit, like Suicide Girls, and they have released the image on Flickr under a suitable licence, then model consent can be assumed. --JN466 10:50, 6 July 2010 (UTC)
- Well done on the recent changes, they are a clear improvement. I modified the 2010 exemption to just be an extended grace period since the consent rules have always existed - it's the conscientiousness of the labelling that is changing. --99of9 (talk) 10:16, 6 July 2010 (UTC)
- Am I correct in assuming that the noconsent template still needs to be created? It is currently a red link. --JN466 15:35, 6 July 2010 (UTC)
- Yes, I was speculating about the future creation of such a template for this purpose. I'm okay with change regarding older images. Dcoetzee (talk) 18:02, 6 July 2010 (UTC)
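(Illustration only, not part of the proposal: since {{Noconsent}} is still a red link, here is a minimal sketch of what such a maintenance tag's wikitext could look like. The wording, the optional date parameter and the tracking category are all assumptions for discussion, not an existing template.)
<!-- Hypothetical sketch of a {{Noconsent}} tag; wording, "date" parameter and category are illustrative assumptions -->
<div class="messagebox" style="border: 1px solid #aa0000; background: #fff3f3; padding: 0.5em;">
'''No consent statement''' – this file appears to be sexual content uploaded without an assertion that the depicted person(s) consented to publication.
Unless consent is asserted on this page (or confirmed via [[Commons:OTRS|OTRS]]), the file may be nominated for deletion three days after {{{date|the date of tagging}}}.
</div><includeonly>[[Category:Sexual content missing consent statements]]</includeonly>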
- I think this is a step in the wrong direction. If we can assume that photos of "named" principals (SuicideGirls???) are uploaded to Flickr with consent, then why not make the same assumption of good faith for uploaders to Wikimedia Commons? (if we don't make the first assumption, then it means that just about any CC-licensed photos we find and want to use here are banned by this policy) Consent is manufactured simply by the uploader adding a statement that yes, this photo really has consent. Also, the policy as written has the bizarre contradiction that a person can't upload photos of sexual content taken in a public place without making a statement of consent - but we won't take down such photos even if the person depicted writes the WMF and pleads for them to be deleted.
- I think we should simply say that sexual content requires consent, but not require any boilerplate statement or template. Wnt (talk) 03:50, 13 July 2010 (UTC)
- Because for suicidegirls we have an OTRS ticket that confirms that the suicidegirls flickr account is owned by suicidegirls.com TheDJ (talk) 15:55, 13 July 2010 (UTC)
Poll for promotion to policy
[edit]Now that we've solicited feedback from a variety of sources and made revisions, I feel like the policy has converged on something we can all agree with. This is a poll to adopt Commons:Sexual content in its current form as a Wikimedia Commons policy. I feel like this is an important step towards answering many common questions and concerns that we receive about this type of content, and providing a systematic way of dealing with it on a day-to-day basis. Please give your opinion below with {{Support}}, {{Oppose}}, or {{Neutral}}, with a brief explanation. (I'll advertise this poll shortly unless someone here feels like it's premature.) Dcoetzee (talk) 19:42, 2 July 2010 (UTC)
Note: this poll has not achieved consensus, and a new poll will begin once the primary outstanding issues are addressed.
- Support as nominator. Dcoetzee (talk) 19:42, 2 July 2010 (UTC)
- Oppose it is far too deep in lawyer territory, fixes a problem that does not exist. Kameraad Pjotr 20:52, 2 July 2010 (UTC)
- There is a long and proven history of such problems, and your statement does not match the reality of the situation. Ottava Rima (talk) 23:06, 5 July 2010 (UTC)
(Collapsed discussion: Editors calling each other out on unrelated issues)
- You have also made statements against blocking those who promote pedophilia and the rest, suggesting a fringe point of view and unwillingness to accept our policies. Ottava Rima (talk) 23:06, 5 July 2010 (UTC)
- Support It is not perfect and will evolve, like all policies, but it is a good place to start, and gives a basic overview of some of the legal issues involved -- which I actually consider vital in this case. --JN466 02:56, 3 July 2010 (UTC)
- Support, absolutely. The editors here that do not believe there is a problem have serious problems. The policy sufficiently describes what should and must be done to preserve project decorum. Sincerely, Blurpeace 04:06, 3 July 2010 (UTC)
- Support if and only if "This does not imply that consent must be verified at the time of upload, but the uploader should be able to demonstrate consent if challenged and should add a comment about consent when uploading." is amended to say "This does imply that consent must be verified at the time of upload", otherwise the policy is meaningless. Whenever there is a challenge, the person challenging will come under attack, as has been proven before. Therefore, it needs to be automatic to remove the harassment. Ottava Rima (talk) 04:09, 3 July 2010 (UTC)
- Also, the statement by the EFF should be removed as completely not grounded within law and violating the idea that we do not provide legal counsel, especially second hand by those who are clearly partisan. Ottava Rima (talk) 04:12, 3 July 2010 (UTC)
- I agree that the section on 2257 provides too much legal advice to producers and I've tried to pare it down, although I don't want to change the page too much during the poll. I disagree regarding your other point, as I think requiring proof of consent at the time of upload would create too large a barrier to contribution. Dcoetzee (talk) 04:54, 3 July 2010 (UTC)
- Irrespective of Ottava's suggestion, the present sentence "This does not imply that consent must be verified at the time of upload" is ambiguous (verified by whom?) and should be reworded. The uploader should certainly verify that he or she has the consent of the person who is depicted before uploading an image. --JN466 05:10, 3 July 2010 (UTC)
- Agreed, I revised it. Dcoetzee (talk) 05:33, 3 July 2010 (UTC)
- The only barrier will be against those who steal pictures of others. Am I the only one who remembers Poetlister? Wikipedia should not be a means for identity theft and the rest. Minimal protections are required at the minimum. Ottava Rima (talk) 05:14, 3 July 2010 (UTC)
- You may be, so here I found the links: [24][25] In that case it would have been best for the girlfriend to make the complaint (for all I know she might have) just to avoid the risk that the boyfriend wasn't telling the truth; but it looks like the "puppetry" issues preempted the rest. Wnt (talk) 18:15, 3 July 2010 (UTC)
- IMO it's crazy to wait for the subject themselves to find a nude image of themselves on Wikipedia before we are willing to delete it. We already have some policy on this Commons:Project scope/Evidence, and the burden of proof is on the uploader, not the humiliated girl. The only question is when they should be asked to prove it. Personally I agree with Ottava that it's cleanest if done at the point of upload. --99of9 (talk) 03:48, 4 July 2010 (UTC)
- How? I don't see any feasible way of making sure the uploader has the consent of the girl at the time of the upload.--Prosfilaes (talk) 11:22, 4 July 2010 (UTC)
- Why would it be any harder at the point of upload? 99of9 (talk) 12:07, 4 July 2010 (UTC)
- The "evidence" for public domain status, as cited from the Project Scope subpage, usually consists of an uploader saying "I took this photograph myself". It's not exactly stand-up-in-court kind of evidence. If what you're asking is that the uploader likewise write that "I took this photograph with the subject's consent" on the upload, then it would seem mostly harmless, except that we would lose a lot of old photographs where the uploader never dreamed he should need to say such a thing. But if that's not good enough, then what evidence would be? If we get an e-mail from the "subject" authorizing the photo - how do we know that's really the subject?
- By contrast, if we wait to get an e-mail from the subject objecting to a photo, then this is rather rare, and can be examined closely to try to get the sense of whether it's a legitimate complaint; or people could just delete the image if they decide it doesn't matter much. Wnt (talk) 14:19, 4 July 2010 (UTC)
- Porn websites are able to verify that the person is legitimate, and having records would allow people who falsify said records to be tried in court under identity theft laws, which would allow substantial recourse for those who had Wikimedia used to steal identity/promote naked pictures against their will. Ottava Rima (talk) 23:05, 5 July 2010 (UTC)
- Support Not perfect but yes. Jamesofur (talk) 07:37, 3 July 2010 (UTC)
- Oppose per Kameraad Pjotr; the whole page reads as a scaremongering exaggeration of the legal situation in the USA, with accessing photos of clothed minors carrying mandatory minimum sentences of 5 years in prison. /Pieter Kuiper (talk) 08:36, 3 July 2010 (UTC)
- I don't understand this opinion - you seem to be misinterpreting the entire intent of the proposed policy. The point is to prevent the overzealous deletions of sexual content that have occurred in the past by specifically addressing the policy and legal concerns involved and noting that they (for the most part) are not a concern. In particular, it says that 2257 law does not apply to us, that obscenity law rarely applies at Commons, and that consent does not need to be documented at the time of upload. Some users (Ottava) think it doesn't go far enough. I realise that US law can be frustratingly conservative sometimes but changing the law really isn't on the table here, and we are under US jurisdiction. What would you want to change here? Dcoetzee (talk) 09:11, 3 July 2010 (UTC)
- This page is not balanced. It just presents reasons for endless rows of challenges and for overzealous deletions, "to be on the safe side". But as far as I know, court decisions allowing the sale of David Hamilton girl photography books or Robert Mapplethorpe's boy photography are still perfectly valid in the US. There is no real problem. /Pieter Kuiper (talk) 09:34, 3 July 2010 (UTC)
- It does say at the beginning that sexual content has to be sexually explicit and not mere (full or partial) nudity, of the sort you would see in these books of photography - would it help if this were emphasized more clearly? Dcoetzee (talk) 10:03, 3 July 2010 (UTC)
- Hamilton and Mapplethorpe go beyond "mere nudity" - their photography includes distinctly erotic images of minors. And that is ok, even in the present climate in the US. The policy page I regard as beyond fixing - too much lawyerspeak already, and too much ammunition for censorship activists. /Pieter Kuiper (talk) 17:34, 3 July 2010 (UTC)
- There's been some disagreement involving the appropriate degree of legal paranoia. The problem is that many laws and regulations are so vague, the court tests so subjective, that the pattern of prosecution appears more like terrorism than law enforcement, with no one really able to predict what will or won't be taken to court. If you read up on the ACLU cases you'll see that people have been charged with child pornography for "sexting" on cell phones or writing in diaries, and it's hard to tell just where the one-off crank prosecutions end and where serious law enforcement begins. But I don't think any of us want Wikipedia to be dragged into court if we don't have broad agreement at least that it ought to win. I hope you'll look at the proposal again and share with us where you see the greatest flaws. Wnt (talk) 18:05, 3 July 2010 (UTC)
- Oppose per Kameraad Pjotr. Multichill (talk) 08:51, 3 July 2010 (UTC)
- Support Definitely. --High Contrast (talk) 09:16, 3 July 2010 (UTC)
- Support I think this is a good thing. I suggest that others who do not like this consider starting a server outside of US territory. TheDJ (talk) 10:06, 3 July 2010 (UTC)
- Comment I support finalizing a policy, but I don't think this is ready just yet. A couple of issues:
- The "Obviously outside of scope" section only seems to contain a list of reasons why things are NOT obviously outside of scope. Are we meant to assume that these are comprehensive, and that if an image does not fit in any of those categories, it should be speedied? If so, it's an awkward way to explain it.
- "Sexual content uploaded without the consent of the participants" talks about the uploader being able to demonstrate consent. Can we clarify what this means for images sourced from other online repositories, and uploaded to commons by a bot? If we're saying that the original Flikr user needs to be able to demonstrate consent, what happens when they go AWOL or get deleted? This is quite important for the classic case of ex-boyfriend-pretends-to-be-naked-girl uploads. (Personally I think it would be simpler to require OTRS proof of consent at the time of upload, but I'm not too fussed as long as we have some way of ensuring consent, even long after the upload.)
- --99of9 (talk) 13:31, 3 July 2010 (UTC)
- I think that requiring OTRS tickets would use a lot of resources (in light of the current backlog) without really being very effective. The ex-boyfriend can pretend to be the naked girl when sending e-mails too. The demonstration of consent should only be needed if there is a complaint from the subject, and while we might lose a few pictures from people who change their mind that way, I don't think it's really worth fighting them about it, unless there's some third-party photographer involved who is professional enough to do stuff like keep records of consent and be reachable. Wnt (talk) 17:49, 3 July 2010 (UTC)
- Support. I was all ready to oppose this, but was pleasantly surprised to find that its current condition is quite good. There are quibbles to be made, but this is not a vote to freeze the document in its current state. Powers (talk) 15:58, 3 July 2010 (UTC)
- Support Tabercil (talk) 21:52, 3 July 2010 (UTC)
- Neutral As our legal counsel has pointed out, Commons is not covered by the rules covering professional pornographers, and we do a pretty good job of removing material that needs to go (either because it is illegal, or it compromises someone's privacy, or whatever). The WMF projects are much more heavily policed than most websites (along with the concept that anyone can edit comes the concept that anyone can police), and I'm sure that's kept us out of any trouble up to now. I personally don't see the need to have this page, Commons:nudity, and COM:SCOPE all touch on the same topic; I personally think this should just be a sub-section of scope. Just because I don't see the need for it I won't oppose it though. It's spot on mostly, but it doesn't need to be so legally heavy handed. I think the deletion policies should be beefed up instead to empower administrators to delete more of these images on sight, and talk later, much the way a strong BLP was implemented on en.wiki. -Nard (Hablemonos)(Let's talk) 22:55, 3 July 2010 (UTC)
- Neutral (for the time being). This still needs work. There is too much legal jargon which confuses rather than clarifies the issue for the lay person (include me in that category). Also, there is far too much emphasis on this "Miller test". The "first prong" is just ludicrous: "Whether the average person, applying contemporary community standards, would find that the work, taken as a whole, appeals to the prurient interest." Who is "average" and what are "contemporary community standards"? How do we interpret that? Is that just shorthand for "average American..."? Has anyone ever defined that in the US? I doubt it. This is purely subjective and leaves us open to constant argument and (re)interpretation. I'm also confused by this statement: "In legal proceedings, obscenity is frequently identified using the three-prong Miller test". Frequently? So that implies that this is not always the case (in the US presumably). What alternative methods are used, if any? If we are to comply with US and Florida law we should perhaps take them into account or at least mention them. Anatiomaros (talk) 23:47, 3 July 2010 (UTC)
- The three prongs are quoted from the influential Miller decision - if you think they are highly subjective and difficult to interpret, it's because they actually are. I would love to fix this by getting obscenity law repealed, but that's entirely outside of our power. As for "frequently" - that's a nod to the fact that the Miller test is a strong legal precedent used in almost all obscenity cases, but technically courts are not bound by law to use it (I'm not aware of any case since Miller in which they did not). Finally, as noted, issues of obscenity arise rarely on Commons - I don't believe any file has ever been deleted for this reason.
- If there is confusing legal jargon though I would like to help clarify it - could you help point out which parts you're referring to? Dcoetzee (talk) 00:11, 4 July 2010 (UTC)
- Sorry for the delay in responding to you, Dcoetzee, but commitments at my main project (cy.) and non-virtual life have limited my time here over the last few days. Let me say first of all that I appreciate the work by yourself and others on this and realise it will never be perfect. However - and perhaps this belongs below but I'll say it here anyway - one of the difficulties here is that we are supposed to vote on a policy that keeps being changed. There have been 9 edits to the text over the last 24 hours or so. I have strong doubts about how this policy/guideline may be misused, although I realise of course that is not your intention, so I find it impossible to give even my hesitant and conditional support to a document that may look substantially different in a few days time. One of my main concerns remains the prominence of the farcical "Miller Test" (which has been raised below). The present wording makes it clear that we are not bound by that but this could be changed quite easily. Editing a few words here and there could also produce significant changes of emphasis. Until the document is stable I don't see how I can support it. Anatiomaros (talk) 23:35, 7 July 2010 (UTC)
- I understand - I just wanted to ensure that your primary concerns were raised so that they could be addressed for the next draft for a subsequent poll. I think the obscenity section can be revisited, but this will require some time. Thanks for your feedback! Dcoetzee (talk) 00:08, 8 July 2010 (UTC)
- Support for a start --Mattes (talk) 22:19, 4 July 2010 (UTC)
- Oppose Not yet. This page is still unstable. Promoting it to policy effectively freezes it, while there are a lot of good ideas to improve it in the comments above. It seems that the debate went deep into the wording without clearly establishing a consensus on either the purpose or the expected outcome of a policy or guideline. We should wait until we have a stable, concise and clear proposal before a poll for promotion. --InfantGorilla (talk) 14:16, 5 July 2010 (UTC)
- I agree that some important new concerns have been raised in this poll and the proposal should be revised. Dcoetzee (talk) 05:43, 6 July 2010 (UTC)
- I've revised the proposal to add a purpose section to help clearly indicate what I believe is the intended purpose of the policy. I think this will help address your concerns and focus discussion. Dcoetzee (talk) 12:42, 7 July 2010 (UTC)
- Comment: are we intending to promote this to policy or to a guideline? - Jmabel ! talk 17:47, 5 July 2010 (UTC)
- A policy. Although some parts of the proposal just reiterate related policies as they apply to sexual content (which generally falls into the guideline realm), other parts regard legal obligations specific to this type of content which are not covered in other policies, and are not amenable to exceptions. I also feel like those who criticise Commons for lack of regulation in this area want to see us take a strong, clear position. Dcoetzee (talk) 05:43, 6 July 2010 (UTC)
- Comment: This may not be ready yet, but at some point we are going to need to approve some policy. Those objecting need to make their objections actionable, that is, what would need to change so they would support. And "there is no change that would get my support" is a non starter. ++Lar: t/c 00:36, 7 July 2010 (UTC)
- You've got the right idea to prevent a nonsensical filibuster here, but no one is obligated to remain silent if they object to the fundamental premise of a popular proposal. The idea that any single proposal must eventually be passed is antithetical to the very concept of consensus building. — C M B J 11:11, 9 July 2010 (UTC)
- Support My concerns were addressed. 99of9 (talk) 12:15, 7 July 2010 (UTC)
- Oppose per Kameraad Pjotr and on the basis of equality. The day that an international collaborative project begins shaping its policies based on contemporary Western views, while unabashedly disregarding the standards of other cultures, is the day that it begins its descent down a slippery slope of bigotry and hypocrisy. I understand that considerable time and well-intended effort has gone into this proposal, and that such a policy would effectively protect some sexual content from future deletion tirades, but it is my opinion that an independent policy is unnecessary to achieve the desired result and would have eventual repercussions in other controversial areas of the project. We should instead (1) clarify COM:SCOPE#A word on some areas of particular concern based on what we have learned here, and (2) incorporate the legal obligations expressed in this proposal into a single unified legal policy that explains genuine legal obligations of the Wikimedia Foundation as an organization under U.S. jurisdiction, as well as basic precautionary information to protect our end-users from known legal landmines involving blasphemy, confidentiality, endangerment, government classification, intellectual property, libel, personality rights, pornography, subversion, trade secrets, et cetera, when uploading from or reusing content in their places of residence. — C M B J 06:44, 9 July 2010 (UTC)
- Support Like all policies this one will continue to evolve, but it has now reached a point where it actually can be called a policy. In spite of the scepticism expressed by CMBJ, I think this policy is not an example of cultural imposition. It reaffirms our existing policies (like lawfulness and scope) as well as promoting the common value of consent, which I hope most cultures share. Adoption of this policy is a rejection of, for example, a theoretical alternative which imposes the narrow rules of a single culture on all the content. This policy isn't perfect — I think it's still too heavy on the arm-chair lawyering, but some of that was recently removed and it looks quite workable. Adopting this policy does not preclude improving the legal-related recommendations in other policies. Having a topic-area policy is also helpful to people working with images: if I have an image of type A, which issues are the most significant for this image type? --Gmaxwell (talk) 06:51, 9 July 2010 (UTC)
- Oppose The guidelines fail in many respects. They discriminate against legal sexual content (categorization -> FPC/QI/VI). --Niabot (talk) 09:54, 9 July 2010 (UTC)
- Niabot, there is nothing in this policy that prevents you from nominating your Futanari manga image for featured image, or would give anyone in the project grounds to disallow such a nomination if the image is deemed to be within scope. The FPC/QI/VI process is nowhere mentioned in this policy. I don't see how FPC/QI/VI is a valid reason for opposing. --JN466 11:06, 9 July 2010 (UTC)
- Then you should read the first topic in Commons talk:Featured picture candidates. The fact that such images are categorized into narrow categories is used as an argument to censor them in FPC/QI/VI. Currently they simply get removed [26] or hidden using template:hidden in COM:QIC. Don't tell me that this isn't discrimination. In this case the guideline fails completely, since it doesn't address this aspect. --Niabot (talk) 11:14, 9 July 2010 (UTC)
- No one there has cited the wording of this proposed policy as a reason why your FP nomination should not be allowed. That is because this policy proposal makes no comment whatsoever about the FP process. You will have to fight this battle elsewhere. --JN466 11:27, 9 July 2010 (UTC)
- This shows me that this proposal isn't developed enough. The other stated "oppose" arguments apply as well. Respect my opinion. --Niabot (talk) 11:31, 9 July 2010 (UTC)
- It is not helpful, or actionable, if you complain about something that is not even part of this proposal. If you have a specific criticism or improvement suggestion for this proposed text, we'd be grateful if you could articulate it. --JN466 11:48, 9 July 2010 (UTC)
- First you should take care of the criticisms already stated. Right now I don't feel like stating further issues when not even the issues above have been addressed. --Niabot (talk) 12:28, 9 July 2010 (UTC)
- Support. Agree with poll statement, above, by Dcoetzee (talk · contribs), and with wording of the page Commons:Sexual content. -- Cirt (talk) 18:10, 9 July 2010 (UTC)
No consensus. As noted above, the proposal at the time of the poll failed to gain consensus due to new input. I'd like to address the remaining issues and begin a new poll at a future time. Please leave any new comments in a new section! Dcoetzee (talk) 18:24, 9 July 2010 (UTC)