I’m either going to get flagged for porn or disappoint a lot of new readers


Gaze on this erotic image.

The current state of computer detection of pornography is a bit primitive: it keeps mistaking desert photos for images of naked people. If I stare hard at it for a while, I guess I can sort of see it — it’s all those reclining curves, I think.

From this we learn that AI is not only unable to distinguish people from bags of sand, but also that it’s more than a little racist.

Comments

  1. militantagnostic says

    I have emails from my (Pressure Transient Analysis) business flagged as porn at the receiving end because you can’t spell analysis without anal.

  2. hunter says

    I was at a new (for me) coffee shop the other day doing some surfing and discovered that their filter was banning Hullabaloo, a respected liberal political blog, as “porn”.

    I guess to some people, it is.

  3. nomadiq says

    Not to say that AI technology is never racist but this is not an example of AI technology being racist, necessarily.

    Detecting sand color and concluding it’s porn is not an example of the AI thinking only sand (white?) colored people do pornography. That doesn’t logically follow at all. If the AI only failed on “desert sand” colored skin, you might conclude the AI was taught on a particular racial subset of porn – and this would be racist. But failing to see the difference between sand and porn is not racist if the pornographic images the AI was trained on included sand-colored people.

    To conclude otherwise is letting your biases get the better of you or really not understanding how AI technology works. Neither will improve AI technology or establish accurately where AI fails and where we need to be cautious with it.

  4. Snarki, child of Loki says

    See, if you had a GIF of the Grand Teton Mountains, it would have been totally okay, right?

  5. methuseus says

    I remember when the censoring software on the Internet at my high school wouldn’t let us get to Webcrawler. Never did figure out what it was classified as, since I don’t remember if it told us.

  6. KG says

    militantagnostic@1,

    The inhabitants and businesses of Scunthorpe, Lincolnshire, have similar problems.

  7. KG says

    Hah! I tried to post a response to #1, pointing out that the inhabitants and businesses of S****thorpe in Lincolnshire had similar problems, and FTB’s filter rejected it! Why beholdest thou the mote that is in thy brother’s eye, but considerest not the beam that is in thine own eye, PZ?
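
The Scunthorpe problem KG ran into is the classic failure mode of substring filtering. A minimal sketch (the banned-word list here is hypothetical, purely for illustration) shows how innocent words and place names get swallowed whole by the match:

```python
# Naive substring filter: blocks any text containing a banned string,
# with no notion of word boundaries. The word list is made up.
BANNED = ["anal", "sex", "cunt"]

def naive_filter(text: str) -> bool:
    """Return True if the text would be blocked."""
    lowered = text.lower()
    return any(bad in lowered for bad in BANNED)

# False positives: "analysis" contains "anal", "Scunthorpe" contains
# "cunt", "Essex" contains "sex" -- all blocked despite being innocent.
for phrase in ["Pressure Transient Analysis", "Scunthorpe", "Essex", "Lincolnshire"]:
    print(phrase, "->", "blocked" if naive_filter(phrase) else "ok")
```

Matching on word boundaries (or a real tokenizer) avoids these particular false positives, at the cost of missing deliberate obfuscations.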

  8. says

    Yeah. If it was just mistaking deserts… But it flags everything from potted plants to artistic works, whether the latter “actually” violate the rules or not. And the latest BS censorship AI was only put in place to sanitize Tumblr, which is an artist haven (one of a tiny few), so that it could be sold off to a big corporation – all of whom are, apparently, beholden to the moral minority and scared to death someone might write letters.

    But the unbelievably stupid thing about it – other than the fact that there previously seemed to be no interest on their part in actively pursuing detection of, say, racists, or worse – is that half the freaking people using it are artists, who don’t really have any other similar place to post at the moment, and who make up a significant part of their user base.

    The minimum being suggested at this point is: on the day of stupidity, log out, and stay logged out for at least 24 hours. The likely more effective option – but also damaging to all those artists who might still be holding on by the skin of their teeth, because the bloody AI hasn’t yet accidentally flagged them as posting adult pictures – is, “Leave them without the user base they are trying to sell out, and claim represents the value of the service,” i.e., delete your account with them before they find some AI-driven excuse to delete it themselves.

    I mean, seriously, one of the posts “recently” flagged by this absurd system was, if I remember, a photo of starving kids, as part of one of those “feed the hungry” things, where it “decided” there was too much skin showing (not white, thankfully, just too much skin), and thus it “had to be” porn. How many years of this nonsense have we gone through, with places like search engines accidentally flagging everything from women’s health sites, to foreign aid initiatives, to – heck, another recent one was a bloody “box” with a superhero on it (must have been flagged as body painting or something???) – and they still haven’t grasped the concept: “This shit doesn’t work!”

  9. whywhywhy says

    Do you think the AI’s kink is related to silicon being a key component of both computer chips and sand? Kind of getting back to the womb type of thing or not wanting to see your parents naked?

  10. kudayta says

    It took me about 5 seconds to see that image as a naked person. There’s a dead pixel on my monitor, and it was right in the top-middle part of the darker sand dune. Made it appear as a bellybutton.

    I should probably see a therapist.

  11. weylguy says

    That image is disgusting and should be taken down immediately! Oh, I know that some SCOTUS justice once said that pornography is in the eyes of the beholder, but this blatant sickness has just got to stop!

    Shame on you, Myers! I thought your photos of buck-naked spiders were over the top, but this is really too much!

  12. brucej says

    Our new antivirus/web blocking software has, for some reason, blocked my dog’s vet practice as an ‘Adult/Sexual’ site. I have no idea why…

  13. madtom1999 says

    @8 this shit does work. In the sense that it can pick out stuff with close to human performance, with significant speed increases and massive cost reductions. Humans make mistakes too – and they can also get seriously fucked up from looking at some images, so I quite like the idea of machines doing the first parsing of the job. Over time, if there is a reasonable error-handling process and people report back problems with images like this, the AI model can be reassessed and re-trained to further reduce things being rejected that we find silly.
    They are still shit – but putting humans in charge of censorship would mean the potential to damage them. And humans are already bigoted and mean and nasty, and can cause as much damage as an AI with ease.
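
The “machines do the first parsing” workflow madtom1999 describes is usually implemented as score-based triage: auto-approve the confident negatives, auto-reject the confident positives, and route only the uncertain middle band to human reviewers. A minimal sketch, with made-up thresholds and scores:

```python
# Hypothetical triage: a model scores each image for "adult content"
# in [0, 1]; only the ambiguous middle band reaches a human reviewer.
# Thresholds are invented for illustration.
APPROVE_BELOW = 0.2   # model is confident the image is fine
REJECT_ABOVE = 0.9    # model is confident the image violates the rules

def triage(score: float) -> str:
    """Route an image by its model score."""
    if score < APPROVE_BELOW:
        return "auto-approve"
    if score > REJECT_ABOVE:
        return "auto-reject"
    return "human review"  # the ambiguous middle band, e.g. sand dunes

for score in [0.05, 0.55, 0.97]:
    print(score, "->", triage(score))
```

Reports of silly rejections would then feed back as labeled examples for retraining, which is the error-handling loop the comment calls for.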

  14. davem says

    Someone’s been feeding the A.I. program with The English Patient. In the film, there’s lots of fades from sleeping females to desert landscapes with the same contours.
