Meet the anonymous app police fighting bullies and porn on Whisper
Source: Carmel DeAmicis
The biggest threat facing new anonymous social networks is their own users. Cyberbullying and sex could fell apps like Whisper, Yik Yak, and Secret, which is why they’re turning to a police force abroad to keep their communities safe. Here’s an inside look.
In an open-floor office in the Philippines, an army of people stare at computers full of Whispers. There are 130 people to be exact, each skimming twenty posts at a time on their PC screens. They click and click and click, flagging Whispers here and there to be deleted or elevated to the company’s San Francisco team.
This is how content moderation works for the world of anonymous apps, and it all happens under the roof of one outsourcing firm called TaskUs. It’s a labor-intensive and expensive endeavor on a massive scale. Whisper has used a TaskUs team in the Philippines for two years, since the company’s earliest days, but it’s no longer the only anonymous app doing so.
Gigaom has learned that Yik Yak, the lesser-known anonymous app beloved by teens, began using TaskUs a few weeks ago, and that Secret, under fire to explain how it will deal with bullies and misanthropes, has negotiated an agreement with the firm to do the same. Secret hopes to use TaskUs, in addition to Metaverse Mod Squad, the San Francisco company it was already employing, to weed out inappropriate or demeaning posts.
Secret was in talks with TaskUs before the recent storm of media criticism ― the company knew it needed to scale its content moderation system and was working on the problem. But the dam broke a little too soon.
There’s a fine line between opinions and bullying
TaskUs manages outsourced staff for 65 startups and tech companies like Expensify, HotelTonight, and Getaround, performing a range of services from receipt transcription to customer service. Now, anonymous apps are relying on the firm to stem the tide of cyberbullying and porn.
Content moderation in the age of anonymous apps is a far trickier game than it was for the social networks of the past. The safety of their communities will make or break Secret, Whisper, and Yik Yak. We’ve seen other anonymous sites, like Juicy Campus and Formspring.me, felled by vicious cyberbullying. Chatroulette infamously went the same way, although its downfall was naked men rather than mean people.
Whisper has been testing, developing, and honing its TaskUs-enabled vetting practices for years. Using a full-time team devoted to the job, instead of freelancers sourced through CrowdFlower or Mechanical Turk, allows it to take a mass approach to content filtering.
Moderators look at Whispers surfaced by both machines and people: Users flag inappropriate posts and algorithms analyze text and images for anything that might have slipped through the cracks. That way, the company is less likely to miss cyberbullying, sex, and suicide messages. Moderators delete the bad stuff, shuffle cyberbullies into a “posts-must-be-approved-before-publishing” category, and stamp suicide Whispers with a “watermark” ― the number for the National Suicide Hotline.
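To make that flow concrete, here is a minimal sketch, in Python, of the review loop described above. It is only an illustration under assumed names and data structures, not Whisper’s or TaskUs’s actual system: user reports and machine analysis surface candidates, and a human moderator’s verdict maps onto the actions the article describes.

```python
# A minimal, hypothetical sketch of the review flow described above; every
# name here is an assumption for illustration, not Whisper's real code.

from dataclasses import dataclass
from enum import Enum, auto

class Verdict(Enum):
    OK = auto()           # nothing wrong, let the post stand
    ABUSIVE = auto()      # cyberbullying or otherwise inappropriate
    SELF_HARM = auto()    # suicide-related content

@dataclass
class Post:
    post_id: str
    author_id: str
    text: str
    user_flagged: bool = False      # reported by another user
    machine_flagged: bool = False   # caught by text/image analysis

def needs_review(post: Post) -> bool:
    """Posts reach human moderators via user reports or algorithmic flags."""
    return post.user_flagged or post.machine_flagged

def apply_verdict(post: Post, verdict: Verdict,
                  repeat_bullies: set[str], hotline_number: str) -> str:
    """Map a moderator's verdict to the outcomes the article describes."""
    if verdict is Verdict.SELF_HARM:
        # "Watermark" the post with the hotline number instead of removing it.
        post.text += f"\n{hotline_number}"
        return "published_with_watermark"
    if verdict is Verdict.ABUSIVE:
        # Repeat offenders get future posts held for approval before publishing.
        repeat_bullies.add(post.author_id)
        return "deleted"
    return "published"
```

The point of the structure is that users and algorithms only surface candidates; the consequential decisions stay with the human reviewers.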
As you might imagine, the manpower and operational systems required for that execution are huge. Whisper’s content moderation manual is nearly 30 pages long. The standards get into the nitty-gritty, specifying minutiae like why a picture of a shirtless man outdoors is acceptable but a shirtless indoor selfie is not.
When the TaskUs team comes across physical threats, it escalates the message to Whisper itself. “If someone posts, ‘I killed her and buried her in the backyard,’ then that’s a piece of content the company will report to the authorities,” TaskUs CEO Bryce Maddock says. “They’re going to pull the UID on your cell phone from Verizon or AT&T and the FBI and local police will show up at your door. It happens quite a bit.”
Cyberbullying is a little tougher to vet. There’s a fine line between opinions and bullying. Christine Nguyen, who handles content moderation at TaskUs, told me, “We take as a rule of thumb [that] if you were to receive that message from someone and it was hurtful, [then] we’ll remove it.”
Comments about public figures are acceptable, but cruel things about private people are not. TaskUs employees will Google names if they aren’t sure whether someone is famous, frequently relying on Wikipedia for the answer.
How do you scale a human content moderation army?
The cost of doing content moderation like this is staggering for a startup unaccustomed to overhead. “It is the single biggest expense of the company,” a Whisper spokesperson told me. “The content moderation team is three times bigger than any other department.”
Of course, this isn’t a particularly scalable approach. The company can’t keep adding TaskUs employees indefinitely as the network grows. Maddock says Whisper is working on developing technical and algorithmic ways to solve the problem. “I don’t know the specifics ― they don’t like to share with me their attempts to hire less human beings,” Maddock said. “But I think they’re making those attempts now.”
Secret is actively working on engineering solutions to use in conjunction with human moderators. The company told me it sees over half a million secrets created per day, with an average of four to five comments per secret, which works out to roughly two to three million new pieces of content every day. With numbers like that, people can only be one facet of the solution for vetting posts. The systems have to get more intelligent at auto-detecting bad content and bad actors.
A Secret spokesperson told me:
        Secret rapidly hit number one in the app store overall in both Brazil and Israel. Once a dense network comes on it explodes because of the social aspect of the product. That’s an example of one of the challenges of these networks ― you don’t just hire 70 more people a week. Your systems have to be smarter and faster. That’s the Silicon Valley ethos and anonymous apps will require that.
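To make that “smarter and faster” point concrete, here is a toy sketch of a machine prefilter sitting in front of a human review queue. The scoring rule, threshold, and term list below are assumptions for illustration, not Secret’s actual system; only the daily-volume figures come from the article.

```python
# Illustrative only: a toy prefilter showing why machine scoring has to sit
# in front of human review at this volume. Threshold, scoring function, and
# term list are assumptions for the sketch, not Secret's system.

DAILY_SECRETS = 500_000          # "over half a million secrets created per day"
COMMENTS_PER_SECRET = 4.5        # "an average of four to five comments per secret"
REVIEW_THRESHOLD = 0.8           # only the highest-risk items go to humans

SUSPECT_TERMS = {"slut", "kill", "hate"}  # crude stand-in for a real text/behavior model

def risk_score(text: str, author_flag_history: int) -> float:
    """Crude stand-in for a learned model: term hits plus the author's track record."""
    term_hits = sum(term in text.lower() for term in SUSPECT_TERMS)
    return min(1.0, 0.4 * term_hits + 0.1 * author_flag_history)

def route(text: str, author_flag_history: int) -> str:
    """Send only high-risk items to the human moderation queue."""
    if risk_score(text, author_flag_history) >= REVIEW_THRESHOLD:
        return "human_review"
    return "auto_publish"

# Rough volume arithmetic from the figures above: posts plus comments per day.
daily_items = DAILY_SECRETS * (1 + COMMENTS_PER_SECRET)   # about 2.75 million
print(f"~{daily_items:,.0f} items/day need some decision, far more than humans alone can read")
```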
Secret and Yik Yak are far younger than Whisper, but the content moderation question is just as relevant for them. Both are going through growing pains as their user bases expand, with the Today Show calling Yik Yak “the new home of cyberbullying” and Pando leveling a tirade of criticism at Secret for not having enough safeguards in place to stop the Regina Georges of the world.
Gigaom’s Mathew Ingram followed up to get Secret’s response. “Anonymity is a really powerful thing, and with that power comes great responsibility,” Secret’s CEO David Byttow told Ingram. “Figuring out these issues is the key to our long-term success, but it’s a hard, hard problem and we are doing the best we can.”
Unfortunately, Secret’s best hasn’t been quite good enough. Soon after the media frenzy, Fortune’s Dan Primack ran an experiment to test Secret’s content moderation system. Primack posted a fake “bullying” note: “Sophie R slept with Mr Jacobs after graduation. I’m sure Jared doesn’t know. Slut!” Another colleague flagged the secret as inappropriate, but the company didn’t remove it until days later. In the story, Primack said, “Secret needs to do better. And fast. Perhaps as fast as it’s growing.”
Byttow told me the company is committed to solving the problem, and that adding another moderation firm, along with improvements to its auto-detect algorithms, should help. He said, “This isn’t new for us, it’s just about continued investment and scaling.”