Reddit shuts down celebrity nude photo thread, but not for the reasons you think
There are Reddit communities devoted to racism, violence against women, sex with animals, and many other ethically problematic topics. But Reddit says it didn't ban TheFappening because of its content, and it has no intention of changing its absolutist free-speech stand. And under US law, there may not be much critics can do about that. The law gives websites broad immunity from liability for content posted by their users. Almost everything that happens on Reddit is controlled by users, not Reddit employees. The bulk of the action on the site occurs in "subreddits" devoted to particular topics.
These subreddits — there are tens of thousands of them — are created and managed by Reddit users.
Each has one or more moderators who are given the power to customize the look of the subreddit, establish rules for what can be posted, delete content that runs afoul of the rules, and ban users who flout community norms. These norms can vary widely. While most subreddits focus on mainstream, G-rated topics, there's also a fair amount of pornography. And a small minority of subreddits feature extreme content. No, I'm not going to link to them and yes, they're just as appalling as they sound.
Reddit's management is committed to a broad interpretation of free speech. They believe that it isn't their job to police the morals of Reddit users.
Instead, Reddit believes that each subreddit community has the right and responsibility to establish its own norms. People who feel uncomfortable in a particular subreddit are free to lobby its moderators to change the rules.
If they're still unsatisfied, they can switch to other subreddits or start their own. Accordingly, Reddit believes that decisions about whether to permit or even encourage racist, misogynistic, or sexually perverted content in a subreddit should be made by the users and moderators of that particular subreddit, not by Reddit employees.
Reddit's management draws an explicit analogy to the First Amendment, which bars the US government from restricting freedom of speech, even speech that's highly offensive.
Reddit touts a similarly absolute commitment to free speech within its own community. TheFappening was a subreddit created for discussion of the celebrity photo leaks, and it quickly became a hub for promoting wider distribution of the photos. Reddit avoids censorship as much as possible, but the site does comply with US law. In particular, it will remove content that infringes copyright or constitutes child pornography.
Given that some of the subjects were underage when the photos were taken, and that many of the photos were selfies whose subjects therefore owned the copyright, lawyers for the victims had legal grounds to demand their removal. To Reddit's administrators, TheFappening became like a crack house that required constant monitoring by law enforcement.
According to a Reddit administrator, things "quickly devolved into a game of whack-a-mole." So according to Reddit management, the problem with TheFappening wasn't that the subreddit had objectionable content, or even that some of that content was potentially illegal. The problem was that users were submitting so much legally questionable content that it was draining Reddit's administrative resources. Critics aren't satisfied with that explanation. They point out that there are subreddits devoted to sharing nude images of non-celebrity women, images that may have been published without the women's knowledge or consent.
But because these women don't have the resources of Jennifer Lawrence and Kate Upton, they aren't able to generate a blizzard of takedown requests. Reddit critics also accuse the site of being unduly influenced by media attention. The Jailbait subreddit, for example, was popular enough to win a "subreddit of the year" vote; it was shut down only after it became the subject of unflattering coverage on CNN.
But other subreddits with equally disturbing content and less media attention remain open for business. While Reddit has taken a relatively passive approach to stolen nude photos, it's more active in removing posts that "dox" another Reddit user by exposing their real name, address, phone number, and other personal information.
So why doesn't Reddit's absolutist free speech policy extend to this kind of information? Reddit says it has had a recurring problem where a Reddit user would be "doxed" and then face anonymous phone calls, unordered pizza deliveries, and other forms of harassment. To prevent this kind of misbehavior, Reddit bans users from posting anyone's personal information. You might think stolen photos are in a similar category; women have obviously faced harassment after nude photographs were posted online. But Reddit says it's too difficult to know whether a photo has been posted with or without the consent of its subject.
So rather than trying to make a lot of tricky judgment calls, it just allows photos across the board. Here again, critics say, Reddit seems to offer different levels of free speech to different people. The men who are helping to distribute stolen photos enjoy the benefits of Reddit's pro-anonymity guarantees.
But Reddit has refused to do much to help the women in the photos, who have suffered a more severe invasion of their privacy. The content in certain subreddits could raise a wide variety of legal issues, from defamation, to violation of civil rights laws, to copyright infringement. But the law largely immunizes Reddit itself.
Section 230 of the Communications Decency Act gives online companies broad immunity for content posted by their users. There are a few exceptions, notably for child pornography and intellectual property.
Another statute, Section 512 of the Digital Millennium Copyright Act, shields sites like Reddit from copyright liability if they respond promptly to takedown requests from copyright holders. Reddit has hewed closely to these rules: the company removes child pornography whenever it finds it, and it complies with DMCA requests to take down infringing content. This option isn't always available to the subjects of stolen photos, though, since a subject only holds the copyright if she took the picture herself.
But otherwise, the site is a free-for-all. And it's probably on safe legal ground. Think about Google, for example.
In operating its search engine, Google follows a policy much like Reddit's: it will take down material when it's legally required to do so. But otherwise it indexes the whole web and lets users decide what to search for and where to click.
You can use Google to find not only the celebrity photos but a wide variety of other offensive content too. Or consider Comcast. Presumably many of the men scouring the internet for the stolen pictures were Comcast customers who used Comcast's network to locate and download the images.
Yet Comcast doesn't try to filter out these pictures; indeed, many people believe that broadband providers should be legally prohibited from blocking legal content flowing over their networks. But many feel that the kind of neutrality that characterizes a network owner or search engine isn't appropriate for a company that supports the creation of online communities. Reddit wants to be seen as a passive provider of infrastructure, but to many people it looks like an absentee landlord who's allowing criminals to operate out of his property.
The Verge's T.C. Sottek compares Reddit to a failed state that's unwilling or unable to maintain law and order within its borders. Could Reddit's administrators do more? They could certainly try. Right now, Reddit does the bare minimum the law requires. It could be more proactive about shuttering communities that promote illegal activities, banning users who post problematic content, and filtering posts for racist, misogynistic, or otherwise objectionable material. But establishing a consistent and effective system for taking down questionable content would be challenging.
And it's not clear how much good it would do. The seemingly endless campaign against pirated material provides a cautionary tale. For more than a decade, Hollywood and the recording industry have been using their considerable legal and lobbying clout to force intermediaries to take greater responsibility for the infringing content their users distribute.
Content companies have shut down numerous websites that made it too easy to share copyrighted material; some site operators have even faced criminal penalties. Recording companies have sued thousands of individual users who distributed material on peer-to-peer file-sharing networks.
They've pressured credit card companies and ad networks to stop working with sites that promote infringing content. They've persuaded broadband providers to penalize users who repeatedly engage in piracy. Despite all those efforts, which have consumed millions of dollars over more than a decade, copyright-infringing material remains widely available online. Every time one intermediary is shut down, new ones pop up in its place.