Tuesday, November 12, 2019

Why xHamster Is So Much Better at Content Moderation Than Facebook

Laws hold the porn industry accountable for dangerous content — and it’s thriving nonetheless

Credit: SOPA Images/Getty

In October, Facebook founder Mark Zuckerberg appeared before Congress to testify about his company’s planned cryptocurrency, Libra. Early on in the proceedings, Congressman Patrick McHenry (R-NC) objected to proposals that would rein in the tech giant, comparing them to “red flag laws” that aimed to reduce fear around early automobiles through aggressive measures (including the requirement that a pedestrian waving a red flag of warning precede any car on the road).

To McHenry, and to most of Zuckerberg’s peers in Silicon Valley, the idea of slowing technological innovation through regulation is patently ludicrous. In an industry built on an ethos of “move fast and break things,” the idea that some problems might need to be carefully navigated is utterly foreign — especially for companies whose value is tied to their massive number of users. With hundreds of millions of users (and, at Facebook, billions), it’s often argued that companies could not be reasonably expected to monitor everything that winds up on their platforms.

But there’s another sector of the digital world, whose headquarters are just a few hundred miles from Silicon Valley, where this attitude of unchecked technological innovation and growth at all costs would never fly.

In Los Angeles’ Porn Valley, the adult entertainment capital of the world, companies move significantly slower than their Silicon Valley peers because they are burdened by a combination of legal regulation and social stigma that makes the cost of reckless action potentially devastating. And despite Silicon Valley’s assertions that regulation would utterly derail their ability to make real progress, pornographers still manage to innovate — all while avoiding some of the messy pitfalls that have plagued Big Tech.

Take the difference in how Silicon Valley and Porn Valley handle user-generated content, for instance. On mainstream social media sites, instant posting is the norm — whether you’re posting a link to a New York Times piece, a personal update, or racist invective, your thoughts will appear on the site as soon as you share them. Although some links and words do trigger a basic moderation algorithm that prevents the update from being posted, most moderation is done post hoc, often after problematic content is reported by users.

Before anything can be posted to an adult site, it must be rigorously screened.

When companies like Facebook and Twitter have been called out for hosting illegal or abusive content, their response has generally been to acknowledge the problem and promise a fix some time in the future. A recent CNN investigation chronicled the many times that Zuckerberg has acknowledged his platform’s problems and promised to do better, including statements made in January 2018, November 2018, and May 2019. And yet, as the report notes, content that directly violates the site’s terms of service remains on the platform.

With porn sites, it’s hard to imagine a similar situation, largely because many adult companies take the complete opposite approach to monitoring their content. Before anything can be posted to an adult site, it must be rigorously screened to make sure it’s not opening the site up to legal liability. “We’re a bit different than other social media sites,” says Alex Hawkins, vice president of the porn clip sharing site xHamster. “We don’t allow just anything to be uploaded and immediately appear on the site. Our A.I. reviews the content for some violations or refers it to our team for additional review. We have had a legion of volunteers... who review uploads in exchange for in-site rewards, as well as the health of the community.” That process of review means that it can take several hours — if not longer — for uploads to appear on xHamster, a significantly different model from the instant gratification found on a site like Facebook or YouTube.
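To make the contrast concrete, here is a minimal sketch, in Python, of the review-before-publish model Hawkins describes versus the publish-first-review-later model common on mainstream platforms. Every name, threshold, and the risk_score field below is a hypothetical illustration, not xHamster’s actual system or code.

    # Pre-moderation sketch: every upload is scored, clear violations are
    # blocked outright, borderline cases wait in a human review queue, and
    # only approved items ever become publicly visible. All names and
    # thresholds are hypothetical stand-ins.

    from dataclasses import dataclass
    from enum import Enum, auto


    class Status(Enum):
        PENDING = auto()
        IN_HUMAN_REVIEW = auto()
        PUBLISHED = auto()
        REJECTED = auto()


    @dataclass
    class Upload:
        uploader: str
        title: str
        risk_score: float              # stand-in for an automated classifier's output, 0.0-1.0
        status: Status = Status.PENDING


    def pre_moderate(upload, review_queue, public_site):
        """Review-before-publish: nothing appears until it clears screening."""
        if upload.risk_score >= 0.9:
            upload.status = Status.REJECTED           # blocked before it is ever visible
        elif upload.risk_score >= 0.3:
            upload.status = Status.IN_HUMAN_REVIEW    # sits for hours awaiting a reviewer
            review_queue.append(upload)
        else:
            upload.status = Status.PUBLISHED          # visible only after screening
            public_site.append(upload)


    def post_hoc_moderate(upload, public_site, report_queue, reported_by_user=False):
        """Publish-first: the upload is instantly visible; review happens
        only if someone reports it later."""
        upload.status = Status.PUBLISHED
        public_site.append(upload)                    # appears immediately
        if reported_by_user:
            report_queue.append(upload)               # reviewed after the fact

The only structural difference is where the review gate sits: before publication in the first function, after it in the second. That ordering is exactly what the legal and financial pressures described below force adult sites to get right.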

Aggressive content moderation isn’t the only way that xHamster controls what lands on the site. As the terms of service make clear, chats between users are also periodically monitored to ensure that they’re in compliance with the site’s policies. That may seem extreme, but there’s a good reason: sites like xHamster literally cannot afford to have content that violates their policies appear on their platforms, even momentarily. The penalties imposed by the government, billing agents, and banks — which can include punishments ranging from being banned from processing user payments to being thrown in prison for years — mean that even the slightest slip-up could put a porn company permanently out of business, or worse.

There are four major legal regulations that govern how adult companies do business. In addition to laws banning the use of underage performers, there are also obscenity laws, 2257 regulations (a suite of recordkeeping regulations that require companies to maintain extensive paperwork proving anyone appearing on the site is at least 18 years old), and the anti-trafficking law FOSTA-SESTA, which harshly penalizes companies that allow sex work solicitation, or anything that might be read as such, to be conducted on their platform.

“Because we’re very aggressive in our patrol of content, the criminals know not to use us.”

In addition to these laws, adult entertainment companies face significant pressure from banks and billing agents, which may boot adult clients from their services at a moment’s notice — even if that client’s work is completely within the bounds of the law. When a single slip-up can mean a catastrophic loss of income, you can’t afford to move fast and break things.

The result is that sites like xHamster are very good at preventing illegal content from appearing on their platforms. Users of xHamster upload about 7,000 videos a day, and Hawkins estimates that about one in 20,000 of those videos is flagged and blocked before it can appear on the site. “Because we’re very aggressive in our patrol of content, the criminals know not to use us,” he says.

Few members of the porn industry would argue that laws like FOSTA-SESTA, 2257, or obscenity laws — all regulations that are confusingly written, difficult to adhere to, and extremely punitive — are a good thing for the industry, at least in their current formats.

But as Big Tech argues against even the most minimal acts of oversight, it is worth noting that the extreme scrutiny placed on the adult industry has not eradicated the industry or made it incapable of innovating. What it has done, however, is make members of the adult industry cautious and considered about enforcing safety measures that Big Tech has long been cavalier about.

If a porn site were the subject of an investigation that revealed it had been home to millions of images of child abuse content, its parent company probably wouldn’t remain in business for much longer. For Facebook, however, a Times piece that alleged an epidemic of child abuse on its platform was just a bad news day. That’s a stark difference in reaction to the same offense, and one that should give us all pause.

It is true that a site like Facebook wouldn’t be able to reach a billion users, or provide those users with the same experience, if it adopted the same content moderation model as a site like xHamster. But given some of the horrors that the existing version of Facebook has unleashed, it’s worth considering whether a version of the site that had focused more on moderation and less on rapid growth might have been better for us all.

After all, if porn can survive — even thrive — in the midst of such a difficult landscape, it seems likely that Silicon Valley would be able to withstand a few “red flag laws” of its own, and that all of us might be better off in a world where the giants of tech were encouraged to slow down, take a beat, and take active measures to ensure that the content they publicize and promote isn’t causing anyone harm.



