FBI’s Tor Hack Shows the Risk of Subpoenas to Security Researchers
Source: Andy Greenberg, WIRED


Computer security researchers who expose hackable vulnerabilities in digital products face plenty of occupational hazards: They can have their work censored by threats of lawsuits from the companies whose products they hack, or they can even be criminally indicted if their white-hat hacking runs afoul of the Computer Fraud and Abuse Act. But one still-mysterious encounter between security researchers and the law points to a newer, equally troubling possibility: They can have their work subpoenaed in a criminal investigation and used as a law enforcement tool.

A judicial ruling released yesterday in the case of Brian Farrell, an alleged staffer of the defunct Dark Web drug site Silk Road 2, confirmed what many who followed that black market’s downfall have suspected for months: that the FBI was able to bypass the anonymity software Tor—the central tool used by Silk Road 2 and its buyers and sellers to evade the cops—with information it obtained through a subpoena to Tor-focused security researchers at Carnegie Mellon University’s Software Engineering Institute. In his ruling, Judge Richard Jones of the Western District of Washington in Seattle wrote that Farrell’s IP address was obtained through a subpoena to Carnegie Mellon while the university’s researchers were running an experiment on the Tor network designed to show how its anonymous users and servers could be identified.

This chain of events should serve as a warning to the computer security research community. It shows that FBI agents somehow learned of research that was intended to be shared openly with a community that could fix the security flaws it exposed, and instead subpoenaed it for secret use in identifying and arresting criminal suspects. And they could do it again.

“When you do experiments on a live network and keep the data, that data is a record that can be subpoenaed,” says Matt Blaze, a computer scientist at the University of Pennsylvania. “As academics, we’re not used to thinking about that. But it can happen, and it did happen.”

That’s an unexpected risk that security researchers—academic, corporate, and independent alike—need to consider before gathering private data on witting or unwitting subjects, even if they plan to keep that data unpublished or to redact it in their public release. That specter of a subpoena, argues computer-security-focused defense lawyer Tor Ekeland, could create a “chilling effect,” limiting researchers’ behavior for fear that their test subjects could become the subjects of a criminal indictment. “If there’s a criminal investigation, yes, the FBI or the SEC or the DEA can issue an administrative subpoena for your data,” Ekeland says. “If you’re a researcher, you need to think: Am I going to get subpoenaed here? Should I be gathering this information and risking putting it into the wild?”

The FBI’s subpoena could feasibly have gone even beyond private data to include Carnegie Mellon’s actual Tor-cracking technique, Ekeland argues. “It seems like they’re trying to subpoena surveillance techniques,” he says. “They’re trying to acquire intel-gathering methods under the pretext of an individual criminal investigation.”
Purging the Dark Web

Exactly what the Carnegie Mellon researchers handed over to the FBI remains far from clear. But in an abstract on the website of the Black Hat hacker conference, where they planned to present their Tor-focused research in August of 2014, they described their work as exploiting a serious vulnerability that allowed them to identify both Tor users and web servers that use Tor to hide their location, known as Tor hidden services. “Looking for the IP address of a Tor user? Not a problem. Trying to uncover the location of a Hidden Service? Done. We know, because we tested it, in the wild…” the abstract reads. The researchers promised to “dive into dozens of successful real-world de-anonymization case studies,” including Tor-hidden drug markets and child pornography sites.

Just weeks after that abstract was posted, the talk was mysteriously pulled from the Black Hat conference schedule. And then in November of 2014, the FBI and Europol together launched Operation Onymous, a purge of the dark web that took down dozens of Tor hidden services, including Silk Road 2 and several other top drug markets.

At the time, the law enforcement officials who led that operation boasted that they possessed a new, secret technique for identifying Tor-hidden sites—not merely a list of IP addresses that might have been collected by Carnegie Mellon’s researchers. “This is something we want to keep for ourselves,” the head of Europol’s European Cybercrime Centre, Troels Oerting, told WIRED at the time. “The way we do this, we can’t share with the whole world, because we want to do it again and again and again.”
Similarities to the Apple/FBI Case

That suggests law enforcement could have obtained not only raw data from its subpoena to Carnegie Mellon, but possibly also the Tor-hacking technique it then used independently, says Runa Sandvik, a security researcher and former developer for Tor. “They could have subpoenaed all the information the researchers had,” she says, which would include enough about the Carnegie Mellon Tor attack for the FBI to use or replicate it. She compares this to fears about what the FBI plans to do with the court order compelling Apple to create new iPhone firmware that would let the government crack encrypted phones. “If the FBI can subpoena a technique, can they reuse it?” she asks. “It sounds similar to the Apple/FBI case. They claim they’re just asking for help with a single iPhone 5c, but as soon as the exploit is put together, it can be reused on other iPhones the FBI needs help with.”

It’s important to note, though, that unlike Apple, the Carnegie Mellon researchers may not have been the most resistant targets for a subpoena. After all, they worked for Carnegie Mellon’s Software Engineering Institute, a government-contracted research lab housed in a separate building off Carnegie Mellon’s primary campus, and their work was funded by the Department of Defense. Those federal ties may have helped the FBI learn about the researchers’ results, and may even have led the researchers to willingly give up those results to FBI investigators. On the other hand, the researchers’ aborted plans to present those results at Black Hat and another conference, the Association for Computing Machinery’s Conference on Computer and Communications Security, suggest that they didn’t originally intend their work to become a secret law enforcement tool.

Tor’s custodians at the non-profit Tor Project, for their part, had previously accused Carnegie Mellon of not only willingly giving up its research, but being paid for it. Tor’s co-founder Roger Dingledine told WIRED in November of last year that he believed Carnegie Mellon was paid $1 million for its Tor-breaking technique. Carnegie Mellon responded by denying that accusation and giving its first hint that the research had instead been subpoenaed. “In the course of its work, the university from time to time is served with subpoenas requesting information about research it has performed,” a Carnegie Mellon spokesperson wrote in a statement. “The university abides by the rule of law, complies with lawfully issued subpoenas and receives no funding for its compliance.”

When WIRED reached out to the Tor Project, it had no comment on the subpoena issue, and Carnegie Mellon would only refer back to its previous statement. The FBI didn’t immediately respond to WIRED’s request for comment.

The most solid lesson of this messy affair, argues Blaze, is that researchers need to consider the risk that their work could be subpoenaed, and protect their subjects accordingly. “You have to be aware of your ethical obligations to not expose your subjects to harm,” he says. “Keep data in a form that retains as little information about individuals as possible, anonymize it, destroy it as soon as it’s no longer relevant. And if it’s not absolutely necessary, don’t collect it in the first place.”
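Blaze’s advice translates directly into practice. Below is a minimal sketch, in Python, of what that kind of data minimization might look like; the key, the retention window, and the function names are illustrative assumptions, not anything the Carnegie Mellon researchers actually ran. Raw identifiers are replaced with keyed hashes before storage, and records are purged once they age out, so even full compliance with a subpoena would expose little about individual subjects.

    import hashlib
    import hmac
    import os
    import time

    # A minimal, hypothetical sketch of data minimization for a network
    # experiment. HASH_KEY, RETENTION_SECONDS, and all names here are
    # illustrative assumptions, not the researchers' actual methods.

    HASH_KEY = os.urandom(32)          # per-experiment secret; destroying it makes stored hashes unlinkable
    RETENTION_SECONDS = 7 * 24 * 3600  # arbitrary one-week retention window

    def pseudonymize(identifier: str) -> str:
        # Replace a raw identifier (e.g., an observed IP address) with a
        # keyed hash: records can still be correlated during analysis, but
        # the raw value is never written to disk and cannot be recovered
        # from the hash without the key.
        return hmac.new(HASH_KEY, identifier.encode(), hashlib.sha256).hexdigest()

    def purge_expired(records, now=None):
        # Drop records older than the retention window, per "destroy it as
        # soon as it's no longer relevant."
        now = time.time() if now is None else now
        return [r for r in records if now - r["timestamp"] < RETENTION_SECONDS]

    # Usage: log an observation without ever persisting the raw address.
    records = [{"subject": pseudonymize("203.0.113.7"), "timestamp": time.time()}]
    records = purge_expired(records)

Destroying the key at the end of the experiment would render any remaining hashes permanently unlinkable, approximating Blaze’s final suggestion in code.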

