Full Version: Mature content filter?
Lifebaka
As I brought up here, one thing that would help a lot with Wikipedia's public image over the whole "mature content" issue is some sort of filter that would auto-hide pictures or articles that probably aren't suitable for minors (despite their own insistence to the contrary). It could of course be manually overridden on a per-view basis. What I'm envisioning is a message saying something like "This <image, page, section, etc.> is not displayed because it contains mature content. Click [here] to display the <image, page, section, etc.> anyway." On the technical side, something like Template:Hidden could easily be used to actually hide the content.

Ideally, this would work as an extension of the MediaWiki software, so that the filter can be manually turned off in the user preferences for registered users (yes, I'm aware that some users are also minors, but the internet can't check that). It'd be fairly easy to implement without a software fix, though, simply by adding {{hidden}} where needed.

I'd like to see some discussion on this or similar ideas before I take it on-wiki, so that I'll have a better proposal to start off with there.
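For what it's worth, the hide-with-override behaviour described above can be sketched in a few lines. This is only an illustration of the logic: the function and parameter names are invented here, and a real MediaWiki extension would be written in PHP against MediaWiki's own hooks.

```python
# Sketch of the proposed filter logic (illustrative names only).
def render_content(body: str, is_mature: bool,
                   filter_enabled: bool, override: bool) -> str:
    """Return the content, or a click-through notice hiding it.

    Mature content is hidden unless the viewer has disabled the filter
    in their preferences or clicked through on this particular view.
    """
    if is_mature and filter_enabled and not override:
        return ("This content is not displayed because it contains "
                "mature content. Click [here] to display it anyway.")
    return body

# A registered user who turned the filter off sees everything:
print(render_content("<img src='Example.jpg'>", is_mature=True,
                     filter_enabled=False, override=False))
```

The same check covers the per-view override: passing override=True (the reader clicked through) shows the content even with the filter still enabled.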
Kato
People have no problems with such filters appearing in every other walk of life. And they are present on other prominent sites like Youtube and Flickr, neither of which makes any play on its site about "educating children", unlike Wikipedia.

But the moment such a concept gets rolled into Wikipedia's Village Idiot market place - the squawking mob invariably freaks out, and starts blogging furiously that you want to censor Botticelli's Venus.

In fact, the Venus defense should be added to Wikipedia's many ludicrous straw man arguments used to defend wanton irresponsibility, alongside the notorious Charlie Manson defense - which gets trotted out whenever BLP opt-out is mentioned, and some Wiki-goon gets the idea that Charlie Manson (?!) is going to contact the WMF and ask for his article to be removed.

It is easy to develop a drop down box to cover really nasty images. And before Wikipediots start blubbing WP:NOT CENSORED. That is complete garbage. Wikipedia rightfully removed images of naked mutilated dead children in Sri Lanka without a thought. And "censorship" of all manner of written or visual material is rife on WP. It's called editorial restraint.
Son of a Yeti
QUOTE(Kato @ Thu 11th December 2008, 11:53pm) *

It is easy to develop a drop down box to cover really nasty images. And before Wikipediots start blubbing WP:NOT CENSORED. That is complete garbage. Wikipedia rightfully removed images of naked mutilated dead children in Sri Lanka without a thought. And "censorship" of all manner of written or visual material is rife on WP. It's called editorial restraint.


An adult filter is worthless if it simply asks me to say whether I'm 18 or not. But it may cover WP's ass in the legal sense (I'm not a lawyer, so don't quote me).

However, I do not wish to be ambushed with images of mutilated bodies of any age. It's not only a problem of porn, much less only of underage porn. There are many more classes of disturbing images one may want to opt out of.

In my case it would be:

porn - Yes, mutilated bodies - No.
LessHorrid vanU
I wouldn't be averse to some sort of filtering system, although since it is an encyclopedia I would prefer it to be an opt-in system; school IPs can opt in for image discretion, those averse to certain representations (I recall Armed Blowfish choosing text only to avoid some things that she preferred not to see) can opt in, and even some subjects may come with an opt-in filter. I would prefer that the default be open - in that WP is supposed to be editable by anyone, and removing that by requiring boxes to be ticked would interfere with it. Also, once you have a system by which content can be viewed by choosing to view it, it could far more easily be pressurised into making viewing requirements more stringent; if one made images of Mohammed viewable by choice, certain Muslim parties would concentrate on making that choice more difficult rather than on the lost cause of removing such images. However, I would support a "Muslim filter" (not readable by WP or the ISP) that could be enabled by anyone not wishing to offend their own sensibilities.

The trouble with such a filtering system is, yet again, where would you draw the line? Filters for sexual content, for religious reasons, and a couple more fairly easily agreed subjects/situations would be fine - but you may have scientifically orientated editors demanding filters for fringe/pseudoscience articles (and vice versa), or political/national zealots pushing for filters of opposing interests. To be honest, it is unlikely that the community would ever be able to agree on the basics of more than the two specific areas I note above, and there would likely be much teddy bear throwing and dramatic exits over where the line is drawn even then.
dogbiscuit
QUOTE(LessHorrid vanU @ Fri 12th December 2008, 8:33pm) *

I wouldn't be averse to some sort of filtering system, although since it is an encyclopedia I would prefer it to be an opt-in system; school IPs can opt in for image discretion, those averse to certain representations (I recall Armed Blowfish choosing text only to avoid some things that she preferred not to see) can opt in, and even some subjects may come with an opt-in filter. I would prefer that the default be open - in that WP is supposed to be editable by anyone, and removing that by requiring boxes to be ticked would interfere with it. Also, once you have a system by which content can be viewed by choosing to view it, it could far more easily be pressurised into making viewing requirements more stringent; if one made images of Mohammed viewable by choice, certain Muslim parties would concentrate on making that choice more difficult rather than on the lost cause of removing such images. However, I would support a "Muslim filter" (not readable by WP or the ISP) that could be enabled by anyone not wishing to offend their own sensibilities.

The trouble with such a filtering system is, yet again, where would you draw the line? Filters for sexual content, for religious reasons, and a couple more fairly easily agreed subjects/situations would be fine - but you may have scientifically orientated editors demanding filters for fringe/pseudoscience articles (and vice versa), or political/national zealots pushing for filters of opposing interests. To be honest, it is unlikely that the community would ever be able to agree on the basics of more than the two specific areas I note above, and there would likely be much teddy bear throwing and dramatic exits over where the line is drawn even then.

I find it amazing how limited people's thinking is on this issue. Categorise the information and you can let the readership draw any line they wish on their own terms, rather than have the Wikipedia "fuck your sensibilities, and I'll ram it (whatever it may be) down your throat" idiocy that I see you display here. I am stunned how blind Wikipedians are on this issue: a total failure to get the difference between data gathering and publication.

It's only a database, you know. You can do things with databases.
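dogbiscuit's point can be made concrete: once images carry category tags in the database, each reader's line is just a filter applied at view time. The categories and function below are invented for illustration, not an actual MediaWiki schema.

```python
# Sketch of category-based filtering at view time (illustrative names only).
images = [
    {"title": "Venus.jpg", "categories": {"nudity", "art"}},
    {"title": "Casualty.jpg", "categories": {"graphic-violence"}},
    {"title": "Kitten.jpg", "categories": set()},
]

def visible_to(reader_optouts: set[str], images: list[dict]) -> list[str]:
    """Keep only images sharing no category with the reader's opt-out set."""
    return [img["title"] for img in images
            if not (img["categories"] & reader_optouts)]

# A reader who opts out of graphic violence but not nudity:
print(visible_to({"graphic-violence"}, images))  # Venus.jpg and Kitten.jpg remain
```

Son of a Yeti's "porn - Yes, mutilated bodies - No" is then nothing more than a per-reader opt-out set; no global editorial line has to be agreed at all.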
JoseClutch
Dogbiscuit,

The standard Wikipedian's answer to this is already known: "{{sofixit}}".

Anyone can reproduce the database and add whatever content filters they like. There are plenty of mirrors already, and as long as they halfway comply with the GFDL (or have at least heard of it) nobody says boo.
GlassBeadGame
QUOTE(JoseClutch @ Fri 12th December 2008, 5:23pm) *

Dogbiscuit,

The standard Wikipedian's answer to this is already known: "{{sofixit}}".

Anyone can reproduce the database and add whatever content filters they like. There are plenty of mirrors already, and as long as they halfway comply with the GFDL (or have at least heard of it) nobody says boo.


That may be "the standard Wikipedian answer" but to people working to hold your irresponsible top ten website accountable it is a pretty inadequate response.
Silly Fake Name
QUOTE(LessHorrid vanU @ Fri 12th December 2008, 8:33pm) *

The trouble with such a filtering system is, yet again, where would you draw the line? Filters for sexual content, for religious reasons, and a couple more fairly easily agreed subjects/situations would be fine - but you may have scientifically orientated editors demanding filters for fringe/pseudoscience articles (and vice versa), or political/national zealots pushing for filters of opposing interests. To be honest, it is unlikely that the community would ever be able to agree on the basics of more than the two specific areas I note above, and there would likely be much teddy bear throwing and dramatic exits over where the line is drawn even then.


Nazism and holocaust denial are illegal in Germany and France.

Many Germans might wish to avoid seeing the swastika.
One
To newer Wikipedians: this has been thought of before.

There was a lot of commotion about this in early 2005. While I thought the proposals were sensible, they were roundly opposed and the carcasses of several would-be policies can be seen at Wikipedia:Graphic and potentially disturbing images.

Basically, there is no consensus for making editorial judgments about which images might offend people. According to one train of thought, it's simply cultural imperialism and stands against everything that Wikipedia stands for. Me? I think dogbiscuit is right. The encyclopedia writing should be thought of as a separate task from distribution.
tarantino
QUOTE(LessHorrid vanU @ Fri 12th December 2008, 8:33pm) *


The trouble with such a filtering system is, yet again, where would you draw the line? Filters for sexual content, for religious reasons, and a couple more fairly easily agreed subjects/situations would be fine - but you may have scientifically orientated editors demanding filters for fringe/pseudoscience articles (and vice versa), or political/national zealots pushing for filters of opposing interests. To be honest, it is unlikely that the community would ever be able to agree on the basics of more than the two specific areas I note above, and there would likely be much teddy bear throwing and dramatic exits over where the line is drawn even then.


I think an easy first step would be putting hidden templates in pages with mature subject matter such as autofellatio or dirty sanchez. Then those pages could easily be blocked with a proxy. I know I used one because I didn't think it was desirable for people in my home to see some guy blowing himself or coprophagia just because they misspelled an address and ended up on Wikipedia or Gerard's server.
Lifebaka
QUOTE(One @ Fri 12th December 2008, 5:36pm) *

To newer Wikipedians: this has been thought of before.

There was a lot of commotion about this in early 2005. While I thought the proposals were sensible, they were roundly opposed and the carcasses of several would-be policies can be seen at Wikipedia:Graphic and potentially disturbing images.

Basically, there is no consensus for making editorial judgments about which images might offend people. According to one train of thought, it's simply cultural imperialism and stands against everything that Wikipedia stands for. Me? I think dogbiscuit is right. The encyclopedia writing should be thought of as a separate task from distribution.

Hmm, food for thought. I haven't seen it brought up recently, so I still think it's worth trying again. I haven't read over the whole of the proposals yet, though.

So far I've only been hoping to get this approved for images showing genitals or mutilated bodies, which is hopefully narrow enough for the community to agree on. The proposal would not include anything else, as broadening it would bog it down and probably stop the entire thing in its tracks.

Anyways, I'm workin' on a little something at User:Lifebaka/Sandbox/Mature content in case the idea does get accepted (for which I don't hold my breath, but...). Anyone who wants to help me out with it is welcome, 'cuz I don't know much CSS myself. So far it's mostly doing what I'd like it to do, but I can't figure out how to align the entire thing over to the right.
Silly Fake Name
QUOTE(One @ Fri 12th December 2008, 10:36pm) *

To newer Wikipedians: this has been thought of before.


This website is not Wikipedia. This website is Wikipedia Review.
GlassBeadGame
QUOTE(Silly Fake Name @ Fri 12th December 2008, 6:16pm) *

QUOTE(One @ Fri 12th December 2008, 10:36pm) *

To newer Wikipedians: this has been thought of before.


This website is not Wikipedia. This website is Wikipedia Review.


He is just speaking to his constituents.
JoseClutch
QUOTE(GlassBeadGame @ Fri 12th December 2008, 5:27pm) *

QUOTE(JoseClutch @ Fri 12th December 2008, 5:23pm) *

Dogbiscuit,

The standard Wikipedian's answer to this is already known: "{{sofixit}}".

Anyone can reproduce the database and add whatever content filters they like. There are plenty of mirrors already, and as long as they halfway comply with the GFDL (or have at least heard of it) nobody says boo.


That may be "the standard Wikipedian answer" but to people working to hold your irresponsible top ten website accountable it is a pretty inadequate response.

Wikipedia is designed (explicitly from the outset, less prominently today) to be used in the way I am suggesting, though. It is not supposed to be a finished product (Nupedia is), and the ability to remake it for a variety of purposes is part of the design. Losing sight of the purpose of the whole deal may cloud one's vision.

And it does come out of the free culture movement, which is explicitly a "here are some tools, build what you like" school of thought.

An "It's not enough that you hand out a free encyclopedia, you also need to provide a comprehensive support package, including ..." is not a very compelling argument either. Entitlement is not an emotion that invites much empathy.
GlassBeadGame
QUOTE(JoseClutch @ Fri 12th December 2008, 6:29pm) *


Wikipedia is designed (explicitly from the outset, less prominently today) to be used in the way I am suggesting, though. It is not supposed to be a finished product (Nupedia is), and the ability to remake it for a variety of purposes is part of the design. Losing sight of the purpose of the whole deal may cloud one's vision.

And it does come out of the free culture movement, which is explicitly a "here are some tools, build what you like" school of thought.

An "It's not enough that you hand out a free encyclopedia, you also need to provide a comprehensive support package, including ..." is not a very compelling argument either. Entitlement is not an emotion that invites much empathy.


I mostly agree with your sense of the origins of the Wikipedian mindset. I would add the toxic influence of Usenet and, of course, MMORPGs. I don't have any sympathy at all for the expected response of "don't burden us further, we are doing such a valuable service to mankind already" that seems to always follow any attempt to impose any cost or burden needed to conduct business in a responsible manner. Wikipedia is simply not such a wondrous boon to humanity.
Silly Fake Name
QUOTE(JoseClutch @ Fri 12th December 2008, 10:23pm) *

Dogbiscuit,

The standard Wikipedian's answer to this is already known: "{{sofixit}}".

Anyone can reproduce the database and add whatever content filters they like. There are plenty of mirrors already, and as long as they halfway comply with the GFDL (or have at least heard of it) nobody says boo.


A wise man named Antoine de Saint-Exupery once wrote "Il semble que la perfection soit atteinte non quand il n'y a plus rien à ajouter, mais quand il n'y a plus rien à retrancher." This means "It seems that perfection is attained not when there is nothing more to add, but when there is nothing left to take away."

See "Terre des Hommes".
dogbiscuit
My response to the {{so fix it}} issue is that if Wikipedia determined that it was not a publisher, then it would be appropriate for this to be someone else's problem. It clearly does publish, and very effectively too, so it is nonsensical to pretend that it is someone else's problem to "unpublish" elements of their work.
Kato
The "{{sofixit}}" argument has no credibility at all. That's what gets people into so much trouble in the first place.

Passers-by take a look at Wikipedia, see something that obviously needs correcting and apply "{{sofixit}}", and the next thing they know they're entangled in some crazed Orwellian nightmare - swarmed by cult-warped cretins bombarding them with jargonistic acronyms - and ultimately finding their attempts at defending themselves appearing at the top of a google search like a flag of shame.

Or worse.
cyofee
The problem here is that editors of Wikipedia don't see any responsibility for making a policy about offensive images and/or content. They're unpaid volunteers, they're indie and rebellious, and people should be grateful to them for what they're doing already, so why should they bother to make a very tough choice that goes against their beliefs anyway?

The Wikimedia Foundation, on the other hand, won't touch the problem, but bounces the question right back to the community. They don't want to make policy; the community should do that.

The way out of this would be either a serious lawsuit problem for the WMF, which would force them to react, or the community growing up a bit and realizing that some kind of content filtering is indeed necessary. While this will come naturally to most editors as they leave their adolescence, there is a danger of it being negated by the exodus of newly mature editors and the influx of new teens who will concentrate on giving each other barnstars and making their userpages look as good as possible, while doing potentially terrible things. (example: Shapiros10)
privatemusings
(I reckon this thread is pretty much a good fit?)

I've been using my interest in sexual content on wikipedia as an excuse to browse dirty pics for the last couple of months, and have now written up an essay about it.

I started a couple of proposals a couple of months back, both on Wikipedia and on Commons - the responses have been universally negative - some people seem to sincerely believe that not only is there nothing to talk about in this area, but that anyone who disagrees, and would like to raise a problem or two, is just plain nuts.
GlassBeadGame
QUOTE(privatemusings @ Sun 18th January 2009, 10:09pm) *

(I reckon this thread is pretty much a good fit?)

I've been using my interest in sexual content on wikipedia as an excuse to browse dirty pics for the last couple of months, and have now written up an essay about it.

I started a couple of proposals a couple of months back, both on Wikipedia and on Commons - the responses have been universally negative - some people seem to sincerely believe that not only is there nothing to talk about in this area, but that anyone who disagrees, and would like to raise a problem or two, is just plain nuts.


In my opinion you are way too good for "hey I wrote an essay on Wikipedia." Think through the issue and then approach Kato about authoring a blog piece here. You will be able to develop your ideas without the nonsense of wiki-process. Create opinion here. Import to Wikipedia whole and intact.
Krimpet
A MediaWiki extension that lets users attach content-rating metadata to given images and pages, which would cascade to pages that include them, might be nice. (Something like those RSACi ratings that were in vogue a decade ago.)
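The cascading part of Krimpet's suggestion might look something like this: a page's effective rating is the strictest rating among its own tag and the media it includes. The rating scale and names below are invented for the sketch, not anything MediaWiki actually implements.

```python
# Sketch of cascading content ratings (illustrative scale and names).
RATING_ORDER = {"general": 0, "mature": 1, "explicit": 2}

def effective_rating(page_rating: str, included_ratings: list[str]) -> str:
    """Return the strictest rating found on the page or its inclusions."""
    return max([page_rating, *included_ratings], key=RATING_ORDER.__getitem__)

# A "general" page that transcludes one "mature" image is rated mature:
print(effective_rating("general", ["mature", "general"]))  # -> mature
```

Filtering software, or the reader's own preferences, could then act on the single effective rating rather than inspecting every included file.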
Giggy
QUOTE(privatemusings @ Mon 19th January 2009, 1:09pm) *

... some people seem to sincerely believe that not only is there nothing to talk about in this area, but that anyone who disagrees, and would like to raise a problem or two, is just plain nuts.

The problem is that many people sincerely think you're nuts anyway. I'm not saying you are (I think you're perfectly sane, myself), but that any proposal from you will be a non-starter because you aren't in the good books of any of the significant factions that judge if a proposal is (or is not) accepted.
Lifebaka
QUOTE(privatemusings @ Sun 18th January 2009, 10:09pm) *

(I reckon this thread is pretty much a good fit?)

I've been using my interest in sexual content on wikipedia as an excuse to browse dirty pics for the last couple of months, and have now written up an essay about it.

I started a couple of proposals a couple of months back, both on Wikipedia and on Commons - the responses have been universally negative - some people seem to sincerely believe that not only is there nothing to talk about in this area, but that anyone who disagrees, and would like to raise a problem or two, is just plain nuts.


The ideas you lay out in the essay are good. I hadn't thought of putting a tag on the images to allow network admins to stop them from displaying. I don't know how it would be implemented technically, but I trust the developers. Ideally we could then have an option in the user preferences to turn on or off the display of images with the tag, with the images not displaying as the default. For registered users, the preference could also be made so that it can be manually overridden on a per-pageview basis. Just sorta' thinking "out loud" here on what I'd like to see on the technical end.
Milton Roe
QUOTE(Lifebaka @ Sun 18th January 2009, 10:42pm) *

QUOTE(privatemusings @ Sun 18th January 2009, 10:09pm) *

(I reckon this thread is pretty much a good fit?)

I've been using my interest in sexual content on wikipedia as an excuse to browse dirty pics for the last couple of months, and have now written up an essay about it.

I started a couple of proposals a couple of months back, both on Wikipedia and on Commons - the responses have been universally negative - some people seem to sincerely believe that not only is there nothing to talk about in this area, but that anyone who disagrees, and would like to raise a problem or two, is just plain nuts.


The ideas you lay out in the essay are good. I hadn't thought of putting a tag on the images to allow network admins to stop them from displaying. I don't know how it would be implemented technically, but I trust the developers. Ideally we could then have an option in the user preferences to turn on or off the display of images with the tag, with the images not displaying as the default. For registered users, the preference could also be made so that it can be manually overridden on a per-pageview basis. Just sorta' thinking "out loud" here on what I'd like to see on the technical end.

Interesting thoughts. If there's a way to semi-protect articles so that IPs cannot edit them, I wonder if there's not some way you can tweak the software to flag articles in a different way (I don't have a name for it) such that IP accounts cannot VIEW them? (These would be two completely different flags, so that a given article could carry one, or the other, or neither, or both).

This isn't going to affect most of WP for IP viewers, but it will block IP viewers from the "adult content" part. So what? Will the Earth end?

Now, kids can still go home and register some account and look at this stuff over their home computers. But now they're dealing with mommy and daddy's draconian software browser filters, and whatever gets through THAT, is mommy and daddy's problem.

There's another function that might be written in: a way of blocking an IP from creating nameusers, as is done now if the IP is rangeblocked, but this might be invoked even if the IP is NOT blocked. At a school's request, for instance (a simple note on the IP talk page) a flag could be set so that THAT IP cannot be used to create nameusers. And thus cannot be used to view any mature content articles which are protected from viewing by IPs.

A start.

Perhaps we could call mature-content articles which are flagged so as to be only name-user viewable, as being "semi-viewable"? Sviewable vs. sprotected? It's sort of a super-symmetry there.

Swonderful, smarvelous.... wink.gif
privatemusings
thanks for your (unexpected) kind words, glass - I'd be tickled pink to write something up worthy of a blog post here, I'll think about it a bit more, though I do often sort of leave things unfinished - plus I'm trying quite hard to actually engage people on this one over there at the mo.... it's not proving easy, and perhaps a more forthright 'this is irresponsible, and here's why' type angle might be a good one.

I should also point out to anyone feeling that there may be a light at the end of the tunnel, that it's unfortunately quite likely to be an oncoming train - particularly if you're into the idea of 'image tagging' at all - it's a huge wiki failing (and I know I'm preaching to the choir here), but once an idea has been considered and rejected, a sadly large number of wiki types will be unable to consider the merits of any discussion / argument / proposal for... well.. basically ever.

I just have a feeling that this is something that's going to blow up in the mainstream media at some point. I felt it at the time Eric's rather (to borrow a phrase from another thread) 'lower 6th' writing got some attention, I felt it when the IWF thing happened, and despite being wrong on each occasion, I still feel that it's a matter of time.....

Milton Roe
QUOTE(privatemusings @ Mon 19th January 2009, 2:55am) *

thanks for your (unexpected) kind words, glass - I'd be tickled pink to write something up worthy of a blog post here, I'll think about it a bit more, though I do often sort of leave things unfinished - plus I'm trying quite hard to actually engage people on this one over there at the mo.... it's not proving easy, and perhaps a more forthright 'this is irresponsible, and here's why' type angle might be a good one.

I should also point out to anyone feeling that there may be a light at the end of the tunnel, that it's unfortunately quite likely to be an oncoming train - particularly if you're into the idea of 'image tagging' at all - it's a huge wiki failing (and I know I'm preaching to the choir here), but once an idea has been considered and rejected, a sadly large number of wiki types will be unable to consider the merits of any discussion / argument / proposal for... well.. basically ever.

I just have a feeling that this is something that's going to blow up in the mainstream media at some point - I felt that at the time Eric's rather (to borrow a phrase from another thread) 'lower 6th' writing got some attention, I felt it when the IWF thing happened, and despite being wrong on each occasion, still feel that it's a matter of time.....

Images are difficult, so I understand, but it's quite difficult to access them without going through an article which contains them (possible, but you have to know what you're doing; it requires nearly adult sophistication, and that's sort of a filter by itself, no?). So if we simply tag mature-content articles (those that contain any mature-content images) as "semi-viewable" (only name users can bring them up), that takes care of most of the image problem right there. It's not a perfect solution, but it's a semi-solution, and one that I think should be implementable without a lot of difficulty.

Of course, you know what will happen: somebody will want to stick an xxx image into a G or PG-13 article, and that will cause the whole thing to be re-rated "name-viewers only." So somebody will have to either do that, or else the image will have to be taken out, so it can be seen by IPs again. But again, not the end of the world.
Lifebaka
QUOTE(Milton Roe @ Mon 19th January 2009, 4:54am) *

Interesting thoughts. If there's a way to semi-protect articles so that IPs cannot edit them, I wonder if there's not some way you can tweak the software to flag articles in a different way (I don't have a name for it) such that IP accounts cannot VIEW them? (These would be two completely different flags, so that a given article could carry one, or the other, or neither, or both).

This isn't going to affect most of WP for IP viewers, but it will block IP viewers from the "adult content" part. So what? Will the Earth end?

Now, kids can still go home and register some account and look at this stuff over their home computers. But now they're dealing with mommy and daddy's draconian software browser filters, and whatever gets through THAT, is mommy and daddy's problem.

There's another function that might be written in: a way of blocking an IP from creating nameusers, as is done now if the IP is rangeblocked, but this might be invoked even if the IP is NOT blocked. At a school's request, for instance (a simple note on the IP talk page) a flag could be set so that THAT IP cannot be used to create nameusers. And thus cannot be used to view any mature content articles which are protected from viewing by IPs.

A start.

Perhaps we could call mature-content articles which are flagged so as to be only name-user viewable, as being "semi-viewable"? Sviewable vs. sprotected? It's sort of a super-symmetry there.

Swonderful, smarvelous.... wink.gif

I don't know if the software supports disallowing unregistered users from viewing specific pages, but I know there's a setting which would prevent unregistered users from viewing any pages besides the main page. It might also be able to be set on a per-namespace basis, though I doubt it can be set on a per-page basis. I don't think it would be too difficult to change the software to support this sort of thing. Actually, I believe FlaggedRevisions could, in theory, be implemented to cause this effect.
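Milton Roe's "semi-viewable" flag would amount to a read-time check alongside the existing edit-time semi-protection. So far as I know, MediaWiki does not support per-page read restrictions like this, so the sketch below is purely hypothetical and uses invented names.

```python
# Hypothetical sketch of a read-time "semi-viewable" flag, independent of
# the existing "semi-protected" (edit-time) flag.
def can_view(page_flags: set[str], user_is_registered: bool) -> bool:
    """Unregistered readers are turned away from semi-viewable pages."""
    if "semi-viewable" in page_flags and not user_is_registered:
        return False
    return True

# An IP reader hitting a page carrying both flags is refused the view:
print(can_view({"semi-viewable", "semi-protected"}, user_is_registered=False))  # False
```

Because the two flags are independent, a page can carry one, the other, neither, or both, exactly as Milton Roe describes.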

If a school really wants to prevent people from editing, signing up, or logging in, they could just use a proxy or similar tool that disables submitting forms on wikipedia.org or its subdomains (or just en.wikipedia.org, depending on how wide they want the net). Most schools already use something like that (judging by my own experience) to disable viewing and submissions on other websites, so it would be pointless to add another method by which they could.
tarantino
<sarcasm> I don't know about this proposal. Even pre-teens need to learn from a scholarly source that many women are happy to have semen sprayed on their faces. Fight the good fight Seedfeeder! </sarcasm>
EricBarbour
QUOTE(tarantino @ Wed 21st January 2009, 4:56pm) *

<sarcasm> I don't know about this proposal. Even pre-teens need to learn from a scholarly source that many women are happy to have semen sprayed on their faces. Fight the good fight Seedfeeder! </sarcasm>

I blogged that.

And interestingly, 5 days later Elonka deleted it, supposedly at the request of Seedfeeder.
Looks like he complained about it on his userpage - after deleting a long talk-page discussion about the images, plus something about creating an "Anti-Censorship Barnstar".

What the hell is going on here?
Casliber
QUOTE(Giggy @ Mon 19th January 2009, 3:18pm) *

QUOTE(privatemusings @ Mon 19th January 2009, 1:09pm) *

... some people seem to sincerely believe that not only is there nothing to talk about in this area, but that anyone who disagrees, and would like to raise a problem or two, is just plain nuts.

The problem is that many people sincerely think you're nuts anyway. I'm not saying you are (I think you're perfectly sane, myself), but that any proposal from you will be a non-starter because you aren't in the good books of any of the significant factions that judge if a proposal is (or is not) accepted.


I thought about this.... PM has a habit of being gratuitously cheeky biggrin.gif

I was also musing on some sort of intrawiki adult filter/netnanny thingy; that way it is still uncensored and yet can be made (relatively) safer.
tarantino
Part 5, Section 63 of the Criminal Justice and Immigration Act 2008 goes into effect 26 January 2009 in the UK and makes it a criminal offense to possess images that portray
a) an act which threatens a person’s life,
b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals,
c) an act which involves or appears to involve sexual interference with a human corpse,
d) a person performing or appearing to perform an act of intercourse or oral sex with an animal (whether dead or alive),

and a reasonable person looking at the image would think that any such person or animal was real.

I wonder if Commons or WP hosts any such material?
GlassBeadGame
QUOTE(tarantino @ Sat 24th January 2009, 12:06pm) *

Part 5, Section 63 of the Criminal Justice and Immigration Act 2008 goes into effect 26 January 2009 in the UK and makes it a criminal offense to possess images that portray
a) an act which threatens a person’s life,
b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals,
c) an act which involves or appears to involve sexual interference with a human corpse,
d) a person performing or appearing to perform an act of intercourse or oral sex with an animal (whether dead or alive),

and a reasonable person looking at the image would think that any such person or animal was real.

I wonder if Commons or WP hosts any such material?


When I first read this I thought (a) was overbroad, as it would prohibit much wartime, crime, and civil-conflict photojournalism. A look at the link, however, indicates that a prerequisite to application is that the image must be "pornographic", defined as intended to cause sexual arousal in the viewer and obscene (serving no other purpose). With that qualification, this seems like a reasonable limitation.
Sylar
QUOTE(tarantino @ Sat 24th January 2009, 5:06pm) *

Part 5, Section 63 of the Criminal Justice and Immigration Act 2008 goes into effect 26 January 2009 in the UK and makes it a criminal offense to possess images that portray
a) an act which threatens a person’s life,
b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals,
c) an act which involves or appears to involve sexual interference with a human corpse,
d) a person performing or appearing to perform an act of intercourse or oral sex with an animal (whether dead or alive),

and a reasonable person looking at the image would think that any such person or animal was real.

I wonder if Commons or WP hosts any such material?


Just because a nanny state is clamping down on freedom of speech doesn't mean that Wikipedia has to.
GlassBeadGame
QUOTE(Sylar @ Sat 24th January 2009, 4:19pm) *

QUOTE(tarantino @ Sat 24th January 2009, 5:06pm) *

Part 5, Section 63 of the Criminal Justice and Immigration Act 2008 goes into effect 26 January 2009 in the UK and makes it a criminal offense to possess images that portray
a) an act which threatens a person’s life,
b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals,
c) an act which involves or appears to involve sexual interference with a human corpse,
d) a person performing or appearing to perform an act of intercourse or oral sex with an animal (whether dead or alive),

and a reasonable person looking at the image would think that any such person or animal was real.

I wonder if Commons or WP hosts any such material?


Just because a nanny state is clamping down on freedom of speech doesn't mean that Wikipedia has to.


More likely it will emulate a spoiled-brat state, tolerating cruelty and exploitation for as long as possible.
Tarc
QUOTE
b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals


Well thank god that images of skull-fucking will still be allowed. blink.gif

Never seen the point of getting that particular with lists of naughty bits not allowed, as there'll always be a one-upper to come along.
tarantino
QUOTE(Tarc @ Tue 27th January 2009, 7:12pm) *

Well thank god that images of skull-fucking will still be allowed. blink.gif

Never seen the point of getting that particular with lists of naughty bits not allowed, as there'll always be a one-upper to come along.

I think that would probably fall under "an act which threatens a person’s life,"
QUOTE(Jeremy Grawp @ Sat 24th January 2009, 9:19pm) *

Just because a nanny state is clamping down on freedom of speech doesn't mean that Wikipedia has to.


So, you think images of people having sex with corpses are protected speech? (I doubt they would last long on Wikipedia anyway.) How about images of child molestation?

Another question: do you have any boundaries?
JoseClutch
I actually find it rather surprising that such labels are not more popular among Wikipedians. Once implemented, you could write a bot to flag everything as potentially offensive content, with every warning label in the set, and thereby get rid of everyone who wants their content censored.

Seems like "win-win" to me.
tarantino
QUOTE(EricBarbour @ Thu 22nd January 2009, 5:56am) *

QUOTE(tarantino @ Wed 21st January 2009, 4:56pm) *

<sarcasm> I don't know about this proposal. Even pre-teens need to learn from a scholarly source that many women are happy to have semen sprayed on their faces. Fight the good fight Seedfeeder! </sarcasm>

I blogged that.

And interestingly, 5 days later Elonka deleted it, supposedly at the request of Seedfeeder.
Looks like he complained about it on his user page, after deleting a long talk-page discussion about the images, plus something about creating an "Anti-Censorship Barnstar".

What the hell is going on here?


Seedfeeder googled his user name, and came across the discussions here.

His response seems fairly reasonable.
QUOTE
Don't you know kids might see your drawings?

By kids, I will assume you mean small children (because if you're talking about teenagers, then they've already seen worse). Yes, I am aware, and it is troubling. But I would rather a child see one of my images than an actual photograph of the same act. Illustrations provide parents with a plausible "out". Imagine your child stumbles across a drawing of a sexual act and questions you about it. You can simply explain it away as "Oh, that's just a naughty picture somebody drew trying to be silly, don't worry about it...". That explanation doesn't hold as much water when it's a real photograph.

In any case, the subject should be directed to the Wikimedia Foundation. They could provide a very simple technical solution to the problem of offensive images and articles on Wikipedia, a solution that would make everyone happy. It would take no work at all for Wikipedia to add a ".sex" or ".adult" sub-domain to the project to house all adult content. This would allow IT managers at schools and workplaces to easily block those sections of Wikipedia while leaving the rest of the site accessible. A sub-domain would also afford browser-based parental controls the same luxury. It would also drastically cut down on page redirects being used as vandalism.
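(To illustrate how trivially such a sub-domain could be blocked: the name adult.wikipedia.org below is purely hypothetical, following the post's own ".adult" suggestion, and no such sub-domain exists. A school or home administrator could null-route it with a single hosts-file entry while the rest of the site stays reachable.)

```
# Hypothetical /etc/hosts entry: black-hole an imagined
# adult.wikipedia.org sub-domain; en.wikipedia.org and the
# rest of the project remain accessible as normal.
0.0.0.0   adult.wikipedia.org
```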