Image filter referendum
carbuncle
the WMF is holding a referendum on implementing the image filter that everyone with a clue knew was going to be implemented when it was "recommended" in that report to the WMF. In order for it to be as ineffective as possible, it will be opt-in. So unless you are a logged-in user who has set a preference, you will see all images unless you take action (through a dialog box that sets a browser cookie). Were you surprised by that photo of a man being anally fisted on Fisting? You were? Then click here to not see it again.

If you enjoy seeing people talk about "freedom of speech" and "censorship" while strenuously arguing that people should not be able to choose not to see images that may offend them, don't miss the fun on the talk page!
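
The mechanics are trivial, which is part of the joke. A minimal TypeScript sketch of the dismiss-dialog-sets-a-cookie behaviour described above (the cookie name, data attribute, and category scheme are all invented for illustration):

CODE
// Hypothetical sketch of the opt-out flow: dismissing the dialog writes a
// browser cookie, and later page loads hide images whose category the
// reader has opted out of. Nothing is hidden until the reader acts.
const COOKIE_KEY = "imgFilterHidden"; // assumed cookie name

function getHiddenCategories(): Set<string> {
  const match = document.cookie.match(new RegExp(COOKIE_KEY + "=([^;]*)"));
  return new Set(match ? decodeURIComponent(match[1]).split(",").filter(Boolean) : []);
}

// Would be wired to the dialog's "don't show this again" button.
function hideCategory(category: string): void {
  const hidden = getHiddenCategories();
  hidden.add(category);
  // Persist for a year; this binds to the browser, not to any account.
  document.cookie = COOKIE_KEY + "=" + encodeURIComponent([...hidden].join(",")) +
    ";max-age=" + 60 * 60 * 24 * 365 + ";path=/";
  applyFilter();
}

function applyFilter(): void {
  const hidden = getHiddenCategories();
  // Each filterable image is assumed to declare its category in a data attribute.
  document.querySelectorAll<HTMLImageElement>("img[data-filter-category]")
    .forEach((img) => {
      img.style.visibility = hidden.has(img.dataset.filterCategory!) ? "hidden" : "visible";
    });
}

applyFilter(); // on first load, everything is visible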
Michaeldsuarez
QUOTE(carbuncle @ Tue 16th August 2011, 1:11pm) *
[...] that everyone with a clue knew was going to be implemented when it was "recommended" in that report to the WMF.


Isn't that what everyone assumed about PendingChanges? The anti-peer review crowd (believing that they were fighting political censorship instead of editorial responsibility) were able to beat it. Perhaps this feature won't get any further than the field testing stage either.
carbuncle
QUOTE(Michaeldsuarez @ Tue 16th August 2011, 5:35pm) *

QUOTE(carbuncle @ Tue 16th August 2011, 1:11pm) *
[...] that everyone with a clue knew was going to be implemented when it was "recommended" in that report to the WMF.


Isn't that what everyone assumed about PendingChanges? The anti-peer review crowd (believing that they were fighting political censorship instead of editorial responsibility) were able to beat it. Perhaps this feature won't get any further than the field testing stage either.

Pending changes on en.wiki never really had the support of the WMF. This was something a WMF-chosen and WMF-paid consultant put into a WMF-commissioned report. I may be cynical, but I don't think it would have been in that report if the WMF had not wanted that particular conclusion. (I say this having read the published drafts and seen the method used for preparing the final report.)
Michaeldsuarez
http://meta.wikimedia.org/w/index.php?titl...5&oldid=2801133

They're now resorting to extreme examples. Someone else even tried comparing the proposal to the world of 1984. A user known as "Eingangskontrolle" somehow believes that this will allow Creationists to get rid of Darwin (as if they could actually get away with shit like that without someone noticing). The anti-censorship mob seems to think that this proposal will lead Wikimedia to apocalyptic doom for some reason. They're overreacting.
lilburne
Seems to me that the anti-censorship mob really ought to read what censorship actually is before they trumpet shite.
Michaeldsuarez
QUOTE(Michaeldsuarez @ Tue 16th August 2011, 3:14pm) *

http://meta.wikimedia.org/w/index.php?titl...5&oldid=2801133

They're now resorting to extreme examples. Someone else even tried comparing the proposal to the world of 1984. A user known as "Eingangskontrolle" somehow believes that this will allow Creationists to get rid of Darwin (as if they could actually get away with shit like that without someone noticing). The anti-censorship mob seems to think that this proposal will lead Wikimedia to apocalyptic doom for some reason. They're overreacting.


http://meta.wikimedia.org/w/index.php?titl...2&oldid=2802143

This is quickly becoming absurd. Can't they come up with sensible hypothetical scenarios? The answer to all these nonsensical scenarios is common sense.
EricBarbour
QUOTE(Michaeldsuarez @ Tue 16th August 2011, 5:09pm) *
This is quickly becoming absurd. Can't they come up with sensible hypothetical scenarios? The answer to all these nonsensical scenarios is common sense.

For the nth time, Michael, we've been over this dirt road before. Every time this stuff comes up, a bunch of NOTCENSORED nerds jump up and scream. They are not "sensible" and there is no "sense".

The only thing they can't seem to control is the debate over pedophilic images. That stuff causes too much legal concern at the WMF. And even so, the Von Gloeden and Von Pluschow photos are still on Commons.

Not disgusting enough? How about Category: Genital modification?
The Joy
I did vote, but the last question about trying to create a "culturally neutral" solution threw me off. How can anything be "culturally neutral?" I don't think that's possible.

I never understood why those drop-down boxes Wikipedians use for "archiving" or stopping a conversation couldn't be used for images. Make them all closed by default and the reader is only one click away from seeing the content. Simple, right? And yet one tiny obstacle that can be undone by one click is somehow "censorship" and the equivalent of Kim Jong-il's North Korea? Censorship would be preventing anyone from seeing it at all. No more people yelling at AN/I about disturbing photos! No more people yelling about "censorship!" No worrying about the remote possibility of the WMF being sued over them! Shankbone can publish penises all he wants and no one would complain because they would be hidden from young eyes! It's a good solution for everyone. An opt-out drop-down cover for all images would be the best solution.

Am I preaching to the choir, again? I think I am. letsgetdrunk.gif frustrated.gif

Edit: Well, maybe "all" images would be too much, but I can't see how Wikipedians would choose "culturally neutral" and "inoffensive" photos from the rest. One man's porn is another man's art. There would have to be some solid criteria for which images and categories are chosen. I just don't think Wikipedians can do it with the way things are.
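
For what it's worth, the drop-down idea is a few lines of script. A TypeScript sketch of a closed-by-default thumb box with a one-click reveal (the class names and markup shape are assumptions, not MediaWiki's real ones):

CODE
// Sketch of the collapsed-by-default thumb box: the caption stays
// readable, the image itself is one click away.
function collapseThumbs(): void {
  document.querySelectorAll<HTMLElement>("div.thumb").forEach((thumb) => {
    const img = thumb.querySelector("img");
    if (!img) return;
    img.style.display = "none"; // closed by default

    const toggle = document.createElement("a");
    toggle.textContent = "[show image]";
    toggle.href = "#";
    toggle.addEventListener("click", (e) => {
      e.preventDefault();
      const wasHidden = img.style.display === "none";
      img.style.display = wasHidden ? "" : "none"; // one click either way
      toggle.textContent = wasHidden ? "[hide image]" : "[show image]";
    });
    thumb.prepend(toggle); // caption below remains visible while collapsed
  });
}

collapseThumbs();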
carbuncle
This seems to be a typical WMF clusterfuck. They are holding what they call a "referendum" to gather input. The wording associated with the referendum is vague, perhaps purposefully so, and those commenting on the talk page do not appear to grasp that the WMF has decreed that AN IMAGE FILTER WILL BE IMPLEMENTED. WMF employee Philippe has been active on the talk page, but not to clear up people's misconceptions. He is attempting to shut down discussions about having the filters on by default. The referendum is ostensibly about gathering information that will assist in design and implementation. I say ostensibly because Erik Moeller has posted some information that suggests they have already decided how to code this.

All of this is putting the cart before the horse, because the real issue here is deciding what images get filtered. They seem set on using a small number of categories, but no hint is given as to whether an image can appear in more than one category. That simple decision may have a huge effect on design and coding. What will the categories be? How does an image get categorized? Who can categorize an image? Is there one system for all wikis, or will each wiki have its own list of potentially offensive images?

It goes without saying that there are likely to be pitched battles over categorizing images for filtering. I assume that however things get categorized, it will be done on Commons, which seems to be the home of the most hardcore "anti-censorship" zealots. I am sure there will be several folks who flounce off the project with long screeds left on their talk pages.
lilburne
The smaller the number of categories to filter the better. After all, one can always 'reveal' if one wants to. One shouldn't allow an explosion of categories: one should be able to turn off all porn images, not just gay ones while leaving lesbian ones, and to turn off all images of mutilated bodies, not just images of mutilated Caucasians. The images categorized should be determined by those that don't want to see them, by a simple majority. After all, those that want to see them can always click to reveal.

All the rest is whiny fuckwittery.
carbuncle
QUOTE(lilburne @ Wed 17th August 2011, 12:38pm) *

The smaller the number of categories to filter the better. After all, one can always 'reveal' if one wants to. One shouldn't allow an explosion of categories: one should be able to turn off all porn images, not just gay ones while leaving lesbian ones, and to turn off all images of mutilated bodies, not just images of mutilated Caucasians. The images categorized should be determined by those that don't want to see them, by a simple majority. After all, those that want to see them can always click to reveal.

All the rest is whiny fuckwittery.

I don't disagree, but if I don't want to see "medical" images, but I'm quite fine with sexually explicit material, I still don't want to end up seeing images of penis surgery because someone has classified it as "sexual" instead of "medical" and it can't be in both.
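
The fix is simple if an image can carry a set of labels rather than exactly one. A TypeScript sketch of the difference (the data and field names here are hypothetical):

CODE
// Hypothetical sketch: an image carries a SET of labels, so "medical" and
// "sexual" are not mutually exclusive, and the filter checks for any overlap.
type LabeledImage = { name: string; labels: Set<string> };

const surgeryPhoto: LabeledImage = {
  name: "Penis_surgery.jpg",
  labels: new Set(["sexual", "medical"]), // both, not either/or
};

function isFiltered(img: LabeledImage, blocked: Set<string>): boolean {
  // Hide the image if ANY of its labels is one the reader has blocked.
  return [...img.labels].some((label) => blocked.has(label));
}

// A reader who blocks "medical" but not "sexual" still gets it hidden:
console.log(isFiltered(surgeryPhoto, new Set(["medical"]))); // true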
SB_Johnny
QUOTE(carbuncle @ Wed 17th August 2011, 8:18am) *

It goes without saying that there are likely to be pitched battles over categorizing images for filtering. I assume that however things get categorized, it will be done on Commons, which seems to be the home of the most hardcore "anti-censorship" zealots. I am sure there will be several folks who flounce off the project with long screeds left on their talk pages.

It will be interesting to see what happens when some parent turns on the filter, only to have it circumvented by some freikulture zealot. Will the courts see the existence of a filter as intended to provide an assurance? popcorn.gif
lilburne
QUOTE(carbuncle @ Wed 17th August 2011, 1:55pm) *

QUOTE(lilburne @ Wed 17th August 2011, 12:38pm) *

The smaller the number of categories to filter the better. After all, one can always 'reveal' if one wants to. One shouldn't allow an explosion of categories: one should be able to turn off all porn images, not just gay ones while leaving lesbian ones, and to turn off all images of mutilated bodies, not just images of mutilated Caucasians. The images categorized should be determined by those that don't want to see them, by a simple majority. After all, those that want to see them can always click to reveal.

All the rest is whiny fuckwittery.

I don't disagree, but if I don't want to see "medical" images, but I'm quite fine with sexually explicit material, I still don't want to end up seeing images of penis surgery because someone has classified it as "sexual" instead of "medical" and it can't be in both.


Allow override per page. Just one click to reveal images on a page. It's not hard.

Anyway, I've no real sympathy for faddishness in being too specific over which categories one finds acceptable, because basically one ends up catering to the Starbucks crowd.
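
Something like this TypeScript sketch, say (the CSS class and element id are invented for illustration):

CODE
// One-click, per-page override: clear the hidden state on every filtered
// image on the current page, leaving the global preference untouched.
function revealAllOnPage(): void {
  document.querySelectorAll<HTMLElement>(".filter-hidden")
    .forEach((el) => el.classList.remove("filter-hidden"));
}

// Wired to a single "show images on this page" link:
document.getElementById("reveal-page-images")
  ?.addEventListener("click", revealAllOnPage);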
Michaeldsuarez
http://meta.wikimedia.org/wiki/Talk:Image_...e_be_deleted.3F

I love how Niabot avoided my assertion that those particular images are porn. What educational value do these entertaining/pornographic images add to articles? I can see how they could argue that cropped images of vulvas are educational or encyclopedic, but how are the images discussed in that section encyclopedic? Even with the real problem staring them in the face, they'll avoid facing it. I remember these people treating Larry Sanger as if he were some sort of villain when he exposed the truth: Commons was being used as a fapping material storage facility. It appears that this continues to be true. Look at the images they're protecting as "information" and "knowledge".
thekohser
I received the following automated e-mail today:

QUOTE
from Wikimedia Referendum, 2011 improve@wikimedia.org
to Thekohser <thekohser@gmail.com>
date Fri, Aug 19, 2011 at 7:15 AM
subject Image filter referendum
mailed-by wikimedia.org

Dear Thekohser,

You are eligible to vote in the image filter referendum, a referendum to gather more input into the development and usage of an opt-in personal image hiding feature. This feature will allow readers to voluntarily screen particular types of images strictly for their own accounts.

Its purpose is to enable readers to easily hide images on the Wikimedia projects that they do not wish to view, either when first viewing the image or ahead of time through individual preference settings. The feature is intended to benefit readers by offering them more choice, and to that end it will be made as user-friendly and simple as possible. We will also make it as easy as possible for editors to support. For its development, we have created a number of guiding principles, but trade-offs will need to be made throughout the development process. In order to aid the developers in making those trade-offs, we need your help us assess (sic) the importance of each by taking part in this referendum.

For more information, please see http://meta.wikimedia.org/wiki/Image_filter_referendum/en. To remove yourself from future notifications, please add your user name at http://meta.wikimedia.org/wiki/Wikimedia_nomail_list.


However, when I go to the Meta site to check eligibility requirements, it says:
QUOTE
You may vote from any one registered account you own on a Wikimedia wiki (you may only vote once, regardless of how many accounts you own). To qualify, this one account must:

* not be blocked on more than one project


You're doing a real bang-up job with that $20 million budget, Wikimedia Foundation!
Jon Awbrey
I have gotten 2 of those notices ...

so far ...

Jon tongue.gif
SpiderAndWeb
Largely agree with carbuncle. Whatever the merits of the idea in the abstract, such an image filter is unworkable in practice, as people will never agree on which categories should be filtered and which images qualify as belonging to these categories (never mind doing so centrally and in a "culturally neutral" way).

A better idea for providing more fine-grained control over when images are displayed (and I'm not completely convinced such a feature is even a good idea to begin with) would be for shift-clicking on a blue link to load up the page without images (and for the search bar to also be somehow modified so that you can select to search for a page without bringing up images.) That way if you have delicate sensibilities and are navigating to a page whose images you suspect you will not want to see (My Lai or Fisting, say), you can easily suppress the images there.
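
A rough TypeScript sketch of that shift-click handler (the "images=off" query parameter is an assumption; the server side would have to honour it):

CODE
// Sketch: shift-clicking a link loads the target with an assumed
// "images=off" query parameter; a plain click behaves as normal.
document.addEventListener("click", (e) => {
  const link = (e.target as HTMLElement).closest<HTMLAnchorElement>("a[href]");
  if (!link || !e.shiftKey) return;
  e.preventDefault(); // suppress the browser's default shift-click behaviour
  const url = new URL(link.href, location.href);
  url.searchParams.set("images", "off"); // server would suppress <img> tags
  location.assign(url.toString());
});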
SB_Johnny
QUOTE(SpiderAndWeb @ Fri 19th August 2011, 12:21pm) *

A better idea for providing more fine-grained control over when images are displayed (and I'm not completely convinced such a feature is even a good idea to begin with) would be for shift-clicking on a blue link...

I think you would probably have confused the little kids and elderly already with those instructions...
SpiderAndWeb
QUOTE(SB_Johnny @ Fri 19th August 2011, 5:31pm) *

QUOTE(SpiderAndWeb @ Fri 19th August 2011, 12:21pm) *

A better idea for providing more fine-grained control over when images are displayed (and I'm not completely convinced such a feature is even a good idea to begin with) would be for shift-clicking on a blue link...

I think you would probably have confused the little kids and elderly already with those instructions...


Oh come on. If you can't find the shift key, surely you also won't be able to find or configure the "Display Settings" link.
Milton Roe
QUOTE(SpiderAndWeb @ Fri 19th August 2011, 12:07pm) *

QUOTE(SB_Johnny @ Fri 19th August 2011, 5:31pm) *

QUOTE(SpiderAndWeb @ Fri 19th August 2011, 12:21pm) *

A better idea for providing more fine-grained control over when images are displayed (and I'm not completely convinced such a feature is even a good idea to begin with) would be for shift-clicking on a blue link...

I think you would probably have confused the little kids and elderly already with those instructions...


Oh come on. If you can't find the shift key, surely you also won't be able to find or configure the "Display Settings" link.

Much easier to have any illustration that you wouldn't find in the daily newspaper, hidden behind a tab in its thumb box that does "click to see (warning) graphic illustration." But being a thumbbox, you could still read the caption, to get some idea of what's behind it. The same would go for the parent image file on COMMONS.

That won't stop people who want to see violence and porn, but it fixes the zOMG shock of people who don't want to be startled by photos of headless or raped and mutilated bodies, when they want to read Nanking massacre (see-- you were warned).

As a further step, some IPs are readily identified as school/educational IPs (you see this header on their TALK pages, as a "warning" to blockers of vandalism from these sites). MediaWiki could easily be programmed with a patch that kept such IPs, or any named account created from them, from being able to get anything when they attempt to use the "graphic image" mouseclick to see behind the "e-brown paper wrapper." The "unwrap click" would simply fail to work for some users.
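
In sketch form (TypeScript for illustration; the flagged ranges are placeholders, and real code would match proper CIDR blocks rather than string prefixes):

CODE
// Sketch: the reveal endpoint refuses the "unwrap" click for requests
// from IP ranges flagged as school networks.
const SCHOOL_PREFIXES = ["10.32.", "192.0.2."]; // placeholder flagged ranges

function canUnwrap(clientIp: string): boolean {
  // The server returns the full image only when this is true.
  return !SCHOOL_PREFIXES.some((prefix) => clientIp.startsWith(prefix));
}

console.log(canUnwrap("192.0.2.15"));  // false: the unwrap click silently fails
console.log(canUnwrap("203.0.113.7")); // true: a normal reader sees the image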
KD Tries Again
I loved this:

"these images – of genital areas and sexual practices on the one hand, or mass graves and mutilated corpses on the other – will inevitably still have the power to disturb some viewers, especially if they are children"

YA THINK???
EricBarbour
QUOTE(Milton Roe @ Fri 19th August 2011, 12:23pm) *
That won't stop people who want to see violence and porn, but it fixes the zOMG shock of people who don't want to be startled by photos of headless or raped and mutilated bodies, when they want to read Nanking massacre (see-- you were warned).

My favorite bit in that article: At the top of the "In The Media" section, you don't find a book or movie about the massacre. You find this.
QUOTE
Music
American thrash metal band Exodus wrote a song about the incident titled "Nanking". The song was featured on their 2010 album Exhibit B: The Human Condition.

Just another nice little example of Wikipedia being edited by smug young males.
The Nanking Massacre is ancient history to them, "but dude, we have to mention that awesome Exodus song"!
RMHED
QUOTE(EricBarbour @ Fri 19th August 2011, 10:52pm) *

QUOTE(Milton Roe @ Fri 19th August 2011, 12:23pm) *
That won't stop people who want to see violence and porn, but it fixes the zOMG shock of people who don't want to be startled by photos of headless or raped and mutilated bodies, when they want to read Nanking massacre (see-- you were warned).

My favorite bit in that article: At the top of the "In The Media" section, you don't find a book or movie about the massacre. You find this.
QUOTE
Music
American thrash metal band Exodus wrote a song about the incident titled "Nanking". The song was featured on their 2010 album Exhibit B: The Human Condition.

Just another nice little example of Wikipedia being edited by smug young males.
The Nanking Massacre is ancient history to them, "but dude, we have to mention that awesome Exodus song"!

Wow, Exodus are such an awesome band they even named some part of the bible after them.
SB_Johnny
QUOTE(SpiderAndWeb @ Fri 19th August 2011, 12:07pm) *

QUOTE(SB_Johnny @ Fri 19th August 2011, 5:31pm) *

QUOTE(SpiderAndWeb @ Fri 19th August 2011, 12:21pm) *

A better idea for providing more fine-grained control over when images are displayed (and I'm not completely convinced such a feature is even a good idea to begin with) would be for shift-clicking on a blue link...

I think you would probably have confused the little kids and elderly already with those instructions...

Oh come on. If you can't find the shift key, surely you also won't be able to find or configure the "Display Settings" link.

Shush, child. And please try not to leave mom's car with an empty tank tomorrow morning.
QUOTE(Milton Roe @ Fri 19th August 2011, 3:23pm) *

Much easier to have any illustration that you wouldn't find in the daily newspaper, hidden behind a tab in its thumb box that does "click to see (warning) graphic illustration." But being a thumbbox, you could still read the caption, to get some idea of what's behind it. The same would go for the parent image file on COMMONS.

Yup, that would be the sensible interface. That could probably even be done by using some of those templates that teh Jimbo apparently thinks are the real problem with Wikipedia.

Actually, doing it with templates on WP might be a more realistic approach than relying on the (even more radically rabid) crowd on commons.
SpiderAndWeb
QUOTE

Much easier to have any illustration that you wouldn't find in the daily newspaper, hidden behind a tab in its thumb box that does "click to see (warning) graphic illustration." But being a thumbbox, you could still read the caption, to get some idea of what's behind it. The same would go for the parent image file on COMMONS.

This is a good idea, though I'd prefer a solution that doesn't involve editorial judgment squabbles over whether or not the image is "graphic." Maybe a global option to collapse all images into thumbboxes, which you can then choose to view (or not) on a case by case basis after reading the captions?

QUOTE
As a further step, some IPs are readily identified as school/educational IPs (you see this header on their TALK pages, as a "warning" to blockers of vandalism from these sites). MediaWiki could easily be programmed with a patch that kept such IPs, or any named account created from them, from being able to get anything when they attempt to use the "graphic image" mouseclick to see behind the "e-brown paper wrapper." The "unwrap click" would simply fail to work for some users.


I disagree with this approach, though. Firstly, there had better be no false positives: if Wikipedia suddenly refused to show me any images because it incorrectly guessed my IP came from a school, I would be... displeased. Secondly, the argument on the talk page that certain governments would force their citizens to turn on irreversible image hiding if that option is available has some ring of truth to it. Lastly, it's not our responsibility to replace incompetent parents and teachers. If a kid is browsing the Internet unsupervised and is determined to see pictures of decapitated Chinese, all the better he do so on Wikipedia and maybe learn a little about the atrocities of war in the process, than on some seedier website.
lilburne
QUOTE(SpiderAndWeb @ Sat 20th August 2011, 10:31am) *


I disagree with this approach, though. Firstly, there had better be no false positives: if Wikipedia suddenly refused to show me any images because it incorrectly guessed my IP came from a school, I would be... displeased. Secondly, the argument on the talk page that certain governments would force their citizens to turn on irreversible image hiding if that option is available has some ring of truth to it. Lastly, it's not our responsibility to replace incompetent parents and teachers. If a kid is browsing the Internet unsupervised and is determined to see pictures of decapitated Chinese, all the better he do so on Wikipedia and maybe learn a little about the atrocities of war in the process, than on some seedier website.



What makes you think that WP isn't some seedy website?
Milton Roe
QUOTE(SpiderAndWeb @ Sat 20th August 2011, 2:31am) *

QUOTE

Much easier to have any illustration that you wouldn't find in the daily newspaper, hidden behind a tab in its thumb box that does "click to see (warning) graphic illustration." But being a thumbbox, you could still read the caption, to get some idea of what's behind it. The same would go for the parent image file on COMMONS.

This is a good idea, though I'd prefer a solution that doesn't involve editorial judgment squabbles over whether or not the image is "graphic." Maybe a global option to collapse all images into thumbboxes, which you can then choose to view (or not) on a case by case basis after reading the captions?

That goes too far, as most images are clearly uncontroversial to anybody.

I appreciate that there could be squabbling over what is an image too "graphic" for the average family audience, but we all seem to have a sense of what a general audience can take from reading newspapers and watching broadcast TV before 11 pm. Anything THEY run, I figure WP should be safe with. The same goes for G vs. PG-13 vs. R film ratings: WP could simply be G and PG-13.

I'm reminded of Reader's Digest and its standard for maybe-off-color jokes. Many jokes have sexual content; in some it is graphic, in others it is implied and you probably need to BE an adult to appreciate it. Pretty young girl: "If I let you come up to my room, do you promise to be good?" Young man: "I promise to be fantastic." Reader's Digest had a panel of middle-aged ladies who had to approve all material. If a joke didn't get past the matron panel, it didn't get in.
QUOTE(SpiderAndWeb @ Sat 20th August 2011, 2:31am) *

QUOTE
As a further step, some IPs are readily identified as school/educational IPs (you see this header on their TALK pages, as a "warning" to blockers of vandalism from these sites). MediaWiki could easily be programmed with a patch that kept such IPs, or any named account created from them, from being able to get anything when they attempt to use the "graphic image" mouseclick to see behind the "e-brown paper wrapper." The "unwrap click" would simply fail to work for some users.


I disagree with this approach, though. Firstly, there had better be no false positives: if Wikipedia suddenly refused to show me any images because it incorrectly guessed my IP came from a school, I would be... displeased. Secondly, the argument on the talk page that certain governments would force their citizens to turn on irreversible image hiding if that option is available has some ring of truth to it. Lastly, it's not our responsibility to replace incompetent parents and teachers. If a kid is browsing the Internet unsupervised and is determined to see pictures of decapitated Chinese, all the better he do so on Wikipedia and maybe learn a little about the atrocities of war in the process, than on some seedier website.

Well, you could make the argument (and it has been done) that it's better for kids to learn all about sex (and sexual perversion of all types) at school or from encyclopedias, for the same reason.
Somey
Something interesting happened on the Slashdot discussion of this issue. The original poster (someone called "KiloByte," apparently his first thread-start there) framed the story in a rather openly anti-collapse-box fashion, giving it a title of "Wikipedia May Censor Images" and starting it with the words "To appease 'morality' watchdogs..."

A surprising number of Slashdotters are approaching the subject quite rationally, but pretty much every other post seems to be of the "life is ugly and you should be forced to see it that way, so shut up and stop threatening my porn" variety. (OK, they didn't actually use the word "porn.") You have to page down about 2/5ths of the way before you find someone who really understands the problem and notes that the collapse-boxes will only make little kids want to look more closely at the "forbidden" images than they otherwise might, even while it helps people who might, for some reason, click on a WP page with a porn image on it while they're at work.

All of this is to be expected, given that society now apparently consists of (among many other badly polarized social groupings) ordinary, reasonable, and in some cases, naive folks on the one hand, and a smaller but waaay more vocal and angry group of "never give an inch" anti-censorship militants on the other, many of whom presumably have ulterior motives and agendas.

I mention this because for me at least, seeing the argument on Slashdot is somewhat more instructive than seeing it on Wikipedia, and maybe even (as much as I hate to say it) here on WR, because the anti-censorship people are much less inhibited when it comes to name-calling and other forms of attack against those who are saying, basically, "it seems like this might be a nice feature for some people." You know these take-no-prisoners types are out there, but if you only look at the WP-based discussion of it, you'd almost think they were fairly nice people.
Michaeldsuarez
http://meta.wikimedia.org/w/index.php?titl...9&oldid=2826349

Niabot's tactics never cease to amuse me.
SpiderAndWeb
QUOTE(Somey @ Sun 21st August 2011, 7:20pm) *

...ordinary, reasonable... folks on the one hand, and... vocal and angry group of "never give an inch" anti-censorship militants on the other, many of whom presumably have ulterior motives and agendas.

...you'd almost think [the anti-censorship folks] were fairly nice people.



QUOTE

...the anti-censorship people are much less inhibited when it comes to name-calling and other forms of attack...



:facepalm:

I'm not saying I fully disagree with you, but your points would come off a lot stronger if they weren't served with a side of hypocrisy...
Milton Roe
QUOTE(SpiderAndWeb @ Mon 22nd August 2011, 2:04am) *


:facepalm:

I'm not saying I fully disagree with you, but your points would come off a lot stronger if they weren't served with a side of hypocrisy...

huh.gif And your points also, if they didn't struggle for meaning after burial in litotes and negative subjunctives. wink.gif
Detective
QUOTE(Michaeldsuarez @ Mon 22nd August 2011, 1:17am) *

http://meta.wikimedia.org/w/index.php?titl...9&oldid=2826349

Niabot's tactics never cease to amuse me.

Yes, but he's dealing with our good friend Ottava here. Desperate diseases require desperate measures. tongue.gif
dtobias
QUOTE(Somey @ Sun 21st August 2011, 3:20pm) *

All of this is to be expected, given that society now apparently consists of (among many other badly polarized social groupings) ordinary, reasonable, and in some cases, naive folks on the one hand, and a smaller but waaay more vocal and angry group of "never give an inch" anti-censorship militants on the other, many of whom presumably have ulterior motives and agendas.


That's rather one-sided of you; there are vocal militants on both sides of the culture war over whether there ought to be some sort of "moral policing" in areas like this, as well as more "reasonable" people with nuanced concerns. There are really more than two sides, anyway; there are enormous ideological differences between religious fundamentalists, radical feminists, and the chunk of the WR crowd who's automatically ideologically opposed to whatever Wikipedia happens to have done, but they might end up uneasily on the same side in the issue of censoring/filtering "bad" images.

Additionally, anybody who's not a dogmatic ideologue of some sort is likely to be concerned about any filtering scheme on the grounds of: just whose definition of "offensive" will be used?
carbuncle
QUOTE(dtobias @ Mon 22nd August 2011, 6:30pm) *

Additionally, anybody who's not a dogmatic ideologue of some sort is likely to be concerned about any filtering scheme on the grounds of: just whose definition of "offensive" will be used?

I have come to the conclusion that the goal of this project is to be able to say "we have a tool that lets users block images that they don't want to see". The actual utility of that tool is unimportant to them, so any categorization scheme is as good as any other. I suspect they will probably do something ridiculous like asking Harris and family to sort the existing categories into "block" and "don't block". That way they can keep themselves at arm's length and not have to deal with the uproar.
timbo
The individual item "CLICK TO SEE" wouldn't work since what is apparently worrisome is the prospect of WP being blocked at schools. I suppose they could implement some master switch that makes the CLICK TO SEE buttons not work -- but then trying to ban dicks and titties would have the unintended consequence of banning things such as imagery of the Holocaust and so on.

To my mind, this is an effort to make WP more fully "school friendly" with a secondary mission of allowing certain governments to "culturally sensitively" block imagery and content relating to certain holy prophets, etc.

This is less about the home and more about the school, in my estimation.

tim
Milton Roe
QUOTE(timbo @ Mon 22nd August 2011, 2:00pm) *

The individual item "CLICK TO SEE" wouldn't work since what is apparently worrisome is the prospect of WP being blocked at schools. I suppose they could implement some master switch that makes the CLICK TO SEE buttons not work -- but then trying to ban dicks and titties would have the unintended consequence of banning things such as imagery of the Holocaust and so on.

Well, not really, as teachers presumably have access to the internets at schools under different IPs than the school library computers, or whatever net access class computers have. Students can still see such stuff under adult supervision, then.

However, we're really dealing with two different problems, and the fact that a "click to see" will fix one of them (being surprised with something you wish you hadn't seen, or perhaps wished you'd seen at home and not in front of your boss at work) is not THE SAME as the problem of keeping stuff from being seen by curious minors. The first problem is easy to solve. The second one is basically impossible, since without verifying ID, you can never verify user age. So that said, the unsolvable problem should simply be abandoned as having no solution given present constraints, and thus unworthy of thought and effort. Duh.
carbuncle
QUOTE(Milton Roe @ Mon 22nd August 2011, 9:17pm) *

QUOTE(timbo @ Mon 22nd August 2011, 2:00pm) *

The individual item "CLICK TO SEE" wouldn't work since what is apparently worrisome is the prospect of WP being blocked at schools. I suppose they could implement some master switch that makes the CLICK TO SEE buttons not work -- but then trying to ban dicks and titties would have the unintended consequence of banning things such as imagery of the Holocaust and so on.

Well, not really, as teachers presumably have access to the internets at schools under different IPs than the school library computers, or whatever net access class computers have. Students can still see such stuff under adult supervision, then.

However, we're really dealing with two different problems, and the fact that a "click to see" will fix one of them (being surprised with something you wish you hadn't seen, or perhaps wished you'd seen at home and not in front of your boss at work) is not THE SAME as the problem of keeping stuff from being seen by curious minors. The first problem is easy to solve. The second one is basically impossible, since without verifying ID, you can never verify user age. So that said, the unsolvable problem should simply be abandoned as having no solution given present constraints, and thus unworthy of thought and effort. Duh.

This isn't really a click-to-see situation, since it will be disabled by default. It is a click-to-not-see-again situation. It isn't clear if they will use cookies to remember the setting on a given computer, but I expect not, because then schools and libraries might find a way to always have it enabled, which is CENSORSHIP!!!!
thekohser
QUOTE(Milton Roe @ Mon 22nd August 2011, 5:17pm) *

Well, not really, as teachers presumably have access to the internets at schools under different IPs than the school library computers, or whatever net access class computers have.


I would have assumed the exact opposite, Milty. And I work for America's most prolific broadband provider.
dogbiscuit
QUOTE(thekohser @ Tue 23rd August 2011, 5:45am) *

QUOTE(Milton Roe @ Mon 22nd August 2011, 5:17pm) *

Well, not really, as teachers presumably have access to the internets at schools under different IPs than the school library computers, or whatever net access class computers have.


I would have assumed the exact opposite, Milty. And I work for America's most prolific broadband provider.

I'd also assume that most of the kids have guessed that the supervisor password is "password" or "admin" by now, or have snuck into the staff room and read the noticeboard.
EricBarbour
No one remembers this poll from 2008?

Also: Carrite, you still suck.
culeaker
QUOTE(Milton Roe @ Mon 22nd August 2011, 10:17pm) *

Well, not really, as teachers presumably have access to the internets at schools under different IPs than the school library computers, or whatever net access class computers have.

Not usually, they don't.
RDH(Ghost In The Machine)
QUOTE(thekohser @ Fri 19th August 2011, 1:35pm) *

I received the following automated e-mail today:

from Wikimedia Referendum, 2011 improve@wikimedia.org
to Thekohser <thekohser@gmail.com>
date Fri, Aug 19, 2011 at 7:15 AM
subject Image filter referendum
mailed-by wikimedia.org

Dear Thekohser,

You are eligible to vote in the image filter referendum, a referendum to gather more input into the development and usage of an opt-in personal image hiding feature. This feature will allow readers to voluntarily screen particular types of images strictly for their own accounts.
etc etc blah blah


I got the same damn piece o Spam.
"Just when I thought I was out...they pull me back in."

It is a very Wikilandish attempt to put a voluntary, virtual fig leaf on a big problem that Wikiland cannot cope with, as last year's great pron war on the commons clearly showed.

Obviously the powahs that be want to avoid an embarrassing replay, so they offer a seemingly simple and reasonable, technology-based option, while avoiding the more thorny and horny question of what constitutes encyclopedic porn.

The first, and seemingly easy, part of the answer is developing criteria.
For each image or other piece of material consider:
* Does it serve an educational or illustrative function?
* Does it have historical value or interest?
* Does it have artistic value or merit?

This would be a slam dunk for Michelangelo's David, Botticelli's Venus, and other classical works, since they easily meet all three. Shankbone's wankings, on the other hand, would have a much tougher time meeting even one of the above.

But between Michelangelo and Shankers, there is a vast, fun grey area where some deliberation and discernment are required. Which leads to the really tricky part of the answer, namely: who applies the criteria and decides?

We know most of the commons admins lack the common sense and judgment needed to make such calls. And the WP admins and 'crats are arguably even more lacking. So either yet another flawed, bloated wiki bureaucracy would have to be created, or it would be left up to the amorphous, mysterious mob known as the communitah. In which case whoever screams loudest and longest and has the most highly placed friends will get to decide what is kept or deleted, just like most everything else in Wikiland.
wtf.gif

Now that I think about it, maybe this voluntary, virtual fig leaf isn't such a bad idea after all.
unsure.gif
KD Tries Again
QUOTE(lilburne @ Sat 20th August 2011, 7:26pm) *

QUOTE(SpiderAndWeb @ Sat 20th August 2011, 10:31am) *


I disagree with this approach, though. Firstly, there had better be no false positives: if Wikipedia suddenly refused to show me any images because it incorrectly guessed my IP came from a school, I would be... displeased. Secondly, the argument on the talk page that certain governments would force their citizens to turn on irreversible image hiding if that option is available has some ring of truth to it. Lastly, it's not our responsibility to replace incompetent parents and teachers. If a kid is browsing the Internet unsupervised and is determined to see pictures of decapitated Chinese, all the better he do so on Wikipedia and maybe learn a little about the atrocities of war in the process, than on some seedier website.



What makes you think that WP isn't some seedy website?


Right. SpiderAndWeb misses the problem. It is one thing to expect adults to steer children away from websites specializing in decapitation. But all roads lead to Wikipedia, and people still don't understand just how seedy and inappropriate Wikipedia is.
KD Tries Again
Wikipedia has been running the risk for some time of being deemed a website that is not safe for work, let alone for school. I agree this is a patch to reduce that risk.
lilburne
QUOTE(RDH(Ghost In The Machine) @ Tue 23rd August 2011, 4:53pm) *



Obviously the powahs that be want to avoid an embarrassing replay, so they offer a seemingly simple and reasonable, technology-based option, while avoiding the more thorny and horny question of what constitutes encyclopedic porn.

The first, and seemingly easy, part of the answer is developing criteria.
For each image or other piece of material consider:
* Does it serve an educational or illustrative function?
* Does it have historical value or interest?
* Does it have artistic value or merit?

This would be a slam dunk for Michelangelo's David, Botticelli's Venus, and other classical works, since they easily meet all three. Shankbone's wankings, on the other hand, would have a much tougher time meeting even one of the above.

But between Michelangelo and Shankers, there is a vast, fun grey area where some deliberation and discernment are required. Which leads to the really tricky part of the answer, namely: who applies the criteria and decides?





One shouldn't begin to determine what is or isn't porn. Once you start down that road there are endless arguments. One could simply mark bare boobs and arses as moderately restricted, and genitalia, pubic areas, and the overtly sexualized as restricted. Whether it is porn or not doesn't factor in; you simply go on what is visible.

A second way of filtering would be to crowd-source it. You set your comfort level on porn, violence, religious mockery to one of a small number of settings. So say I've set my porn tolerance to the midway mark and see an image that I feel is outside my tolerance level: I vote it down. If I reveal an image that I think shouldn't have been hidden, I vote it up. Over time the consumers will have flagged the images appropriately.

If I keep seeing images hidden that I think shouldn't be, given my comfort level, then I can always increase my allowance; alternatively, if I keep seeing images I don't want to see, I decrease it.
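
In sketch form, the bookkeeping is just a running severity score per image. A TypeScript sketch (the 0-to-1 scale, field names, and averaging rule are all assumptions):

CODE
// Crowd-sourced flagging: down-votes ("too strong for my setting") push
// an image's severity score up, up-votes ("shouldn't have been hidden")
// pull it down. Readers see an image only if its score is at or below
// their chosen tolerance.
interface RatedImage {
  name: string;
  votes: number; // how many readers have rated it
  score: number; // running mean severity, 0 (tame) .. 1 (extreme)
}

function recordVote(img: RatedImage, tooStrong: boolean): void {
  const rating = tooStrong ? 1 : 0;
  img.score = (img.score * img.votes + rating) / (img.votes + 1);
  img.votes += 1;
}

function isHiddenFor(img: RatedImage, tolerance: number): boolean {
  return img.score > tolerance; // tolerance 0.5 = the "midway mark"
}

const pic: RatedImage = { name: "Example.jpg", votes: 10, score: 0.6 };
recordVote(pic, false);             // one reader says it was wrongly hidden
console.log(isHiddenFor(pic, 0.5)); // true: still hidden at mid tolerance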

Michaeldsuarez
QUOTE(lilburne @ Tue 23rd August 2011, 5:03pm) *
One shouldn't begin to determine what is or isn't porn. Once you start down that road there are endless arguments. One could simply mark bare boobs and arses as moderately restricted, and genitalia, pubic areas, and the overtly sexualized as restricted. Whether it is porn or not doesn't factor in; you simply go on what is visible.


http://meta.wikimedia.org/wiki/Talk:Image_...ward_categories

I suggested categories based on concrete concepts and substances, but the thread was hijacked by a flood of comments.
lilburne
QUOTE(Michaeldsuarez @ Tue 23rd August 2011, 11:27pm) *

QUOTE(lilburne @ Tue 23rd August 2011, 5:03pm) *
One shouldn't begin to determine what is or isn't porn. Once you start down that road there are endless arguments. One could simply mark bare boobs and arses as moderately restricted, and genitalia, pubic areas, and the overtly sexualized as restricted. Whether it is porn or not doesn't factor in; you simply go on what is visible.


http://meta.wikimedia.org/wiki/Talk:Image_...ward_categories

I suggested categories based on concrete concepts and substances, but the thread was hijacked by a flood of comments.


Yep, well, that is all just bullshit designed to derail the discussion. Have nothing to do with it. No one can define in the abstract what is or is not acceptable to everyone. The vast majority of images are uncontroversial to all but the most phobic, and one doesn't tailor a system to the most repressed; one tailors it so that it addresses the concerns of a significant proportion of the target audience. For most that means no cum shots, no close-ups of cunts, no expositions of the beast with two backs, and no fists shoved up arseholes. No severed heads, no mutilated corpses, no pictures of people having been beaten up. It ain't fucking hard unless one is a dipshit like Niabot or Levy who doesn't give a fuck for anyone else.

RDH(Ghost In The Machine)
QUOTE(lilburne @ Tue 23rd August 2011, 9:03pm) *

A second way of filtering would be to crowd-source it. You set your comfort level on porn, violence, religious mockery to one of a small number of settings. So say I've set my porn tolerance to the midway mark and see an image that I feel is outside my tolerance level: I vote it down. If I reveal an image that I think shouldn't have been hidden, I vote it up. Over time the consumers will have flagged the images appropriately.

If I keep seeing images hidden that I think shouldn't be, given my comfort level, then I can always increase my allowance; alternatively, if I keep seeing images I don't want to see, I decrease it.


Isn't that how the Pron Filter™ is basically supposed to work?
Solve a problem created by crowd sourcing with more crowd sourcing.

Eventually, somewhere down the line, someone will end up deciding which images are appropriate for an encyclopedia-like general reference work. And whoever that may be, I hope they have the wisdom, common sense and discretion to... oh, who am I kidding, tis Wikiland! It's either the Wild West or a Witch Hunt.
yecch.gif
MZMcBride
QUOTE(Milton Roe @ Fri 19th August 2011, 3:23pm) *
Much easier to have any illustration that you wouldn't find in the daily newspaper, hidden behind a tab in its thumb box that does "click to see (warning) graphic illustration." But being a thumbbox, you could still read the caption, to get some idea of what's behind it. The same would go for the parent image file on COMMONS.
I assume this is referring to newspapers in whatever part of the world you happen to be in? Plenty of newspapers around the world have no issue printing images of nudity or curse words. Images of the dead (read: killed) also aren't uncommon in certain publications. Meanwhile it's rather uncommon to see nudity or curse words in major publications in the United States (almost to the point of absurdity). For a global endeavor like Wikipedia or Wikimedia Commons, I don't see how "have any illustration that you wouldn't find in the daily newspaper" is anything more than a pseudo-solution.
Wikicrusher2
The process of evaluating images as "safe" or "porn" would meet considerable difficulty precisely because of the grey area that fig-leafing does not account for. A system in this mold is defective by design: a metric for "offensiveness" is a subjective matter, so it is impractical to try implementing one. Nothing is flawless, but the flaws arising from a system like the one being proposed cannot be resolved.

An image filter that allows readers to choose whether or not they would like to see a "potentially offensive" image is not, in itself, repressive. However, it is a wrong-headed approach because it assumes standards for offensiveness or disgust which are either universal or approaching universality. Additionally, it has an Orwellian, restrictive agenda behind it and is the product of a compromise between repression and free culture sharing (which shouldn't be compromised). The sole reason for applying this moralistic prudery to Wikipedia is that someone wants censorship: whether that is a Wikipedian, schools, or one of the censorware programs that schools use to block access to websites they deem offensive (or, in the case of one [I forgot which], crowdsourced censorship, wherein websites are placed in certain categories by multiple people and banned based on the categories they fit). Timbo makes a good point that the purpose of this filter proposal is probably to please the schools. However, bending the rules on Wikipedia simply to comply with school administrators' petulant desire for control over web access is a compromise that isn't worth making. Schools want authoritarian control over the internet to protect "OMG THE CHILDREN!!!", and the internet's culture is just not compatible with that traditionalist, church-like approach: "We need to teach you some knowledge, but we also need to shield you from the rest." For schools to deem WP NSFW and therefore verboten (gesperrt?) unless WP imposes censorship (or compromises with this unworkable mechanism) is obnoxious.

For one thing, there should be nothing wrong with letting the images stay, visible to all. Not only is it unfair for anyone to determine what will be offensive to "THE CHILDREN!!!", but putting "NAUGHTY, NAUGHTY!!! ohmy.gif " images behind the filter will tempt kids to peek at them even more. Neither WP admins nor arbitrators, not even The Community, can be the arbiter of offense for all viewers, and it wouldn't be sensible for them to even try.

Secondly, Orwellian systems aren't necessary for people to determine whether they are repulsed by something. If you believe that you may be shocked by an image of gangrene, and only want a dry, textual, medical description of what gangrene is, then it may be best simply to turn images off while browsing Wikipedia. What next, giant disclaimers and airbrushing?

And third, what about the bad image list (or whatever it's called these days), which bans every irrelevant usage of certain images? If you go to a page about the penis, expect dick pics; there is nothing sick or disgusting about that. When an image is used out of context, purely for shock value on some unrelated page, it is already banned. The fact that something of this sort already exists to prevent "bad" images from being used for shock value proves that this "image filter" is more about a censorship crusade than anything else. It may sound innocuous in theory, but it has a censorious agenda behind it.