Of Course Flickr Has a Porn Problem

Porn has always been a problem for user-generated websites


It should not come as a surprise that some people are uploading porn to Flickr, a photo-sharing site, and other people are looking at that porn. According to Business Insider, there's some straight-up smut. "There's old men with little girls, hardcore stuff, bondage, and feet everywhere," writes BI's Nicholas Carlson. "Some of it is merely gross. Some of it is plain wrong. We're not linking to it, but trust us, it is there."

Carlson goes on to explain that the dirty pics on Flickr are creating some problems for the photo site's parent company, Yahoo: advertisers are getting upset about their ads showing up next to pictures of naked girls or feet or whatever. Cathryn Weems, the policy and abuse manager for Flickr, explained to Carlson that they "only serve ads on safe content" but it's the users who mark it as unsafe. If it's not marked, an ad could appear. As Weems says, "Every [user-generated content] site … runs up against this problem."

So is Flickr any worse than YouTube or Facebook? Carlson would have you think so. He ends up sort of scolding the company for not moderating its content better, saying the illicit photos "seem like a sign of rot at Yahoo, and Flickr," which has allowed "a once-modest porn problem to grow wildly beyond its control."

Keeping abusive content off of user-generated sites is not a simple task. History has shown that once people are allowed to upload files to a site, they will stretch the limits of taste quite far. YouTube was overrun with users uploading porn videos until the site imposed restrictive guidelines and expanded its moderation team. Even then, users still managed to upload porn that remained on the site, with ads next to it, until other users flagged it and moderators took it down. (Remember 4chan's YouTube Porn Day?)

Facebook has seen worse problems. Earlier this month, the New York Police Department launched an investigation into a child porn video that started showing up on people's profiles. The link was disguised and had slipped through Facebook's abusive content filters, eventually making it onto 70,000 servers. The incident happened two months after Facebook announced a new partnership with Microsoft that would use sophisticated software called PhotoDNA to identify and block potential pornography. At that time, the amount of child porn on the site was bad enough that some people started the Stop Child Porn on Facebook Campaign to spread awareness and Change.org fired up a petition. We'd bet a silver dollar that Facebook lost an advertiser or two over this issue.

It sounds like Flickr's problems could be much, much worse. The internet can be a dark, dirty place sometimes, and though technological solutions like PhotoDNA can help, it's an immense challenge to keep user-generated content sites clean of abuse. Online advertisers should know this by now.

This article is from the archive of our partner The Wire.
Adam Clark Estes is a former writer for The Wire. He has also written for The Huffington Post and Vice.