Proposed Tech-Export Rules Bashed by Companies, Researchers
Highlights challenges of regulating intrusion, surveillance technologies
Amid the Arab Spring upheavals five years ago, human rights advocates became concerned about electronic surveillance technologies and hacking tools that governments across North Africa and the Middle East were using to monitor and silence dissidents.
News reports detailed how equipment and technology from Western companies such as U.S.-based Hewlett-Packard and NetApp Inc., Sweden’s Ericsson AB, the United Kingdom’s Creativity Software Ltd., Italy’s Hacking Team, Germany’s FinFisher and France’s Vupen had found their way to some of the region’s most repressive regimes.
As pressure from human rights advocates mounted, the European Union in 2011 curbed exports of intrusion software and surveillance technology. The following spring, President Barack Obama authorized sanctions against those providing technology to Iran and Syria that was being used to suppress dissidents.
But by 2013, officials in the U.K. and France went further, seeking to broadly restrict exports of such technologies through conventional arms control measures. Later that year, the 41 nations of the so-called Wassenaar Arrangement, including the U.S., agreed to add intrusion and surveillance technology to the pact’s export control lists.
While the agreement’s intentions were good, U.S. rules proposed last year to implement the restrictions have drawn howls of protest from cybersecurity companies, researchers and universities, who say the limits would hamstring critical global communication and collaboration among researchers and hinder the development of better defenses against cyberattacks.
In the U.S., both the State Department, which helped negotiate the international Wassenaar rules, and the Commerce Department, which wrote draft U.S. regulations to comply with them, have decided to renegotiate the restrictions at the urging of lawmakers. The revised U.S. proposal is due out this fall, and along with other countries’ proposed regulations, it will be presented for review later this year as nations reconsider the international agreement.
The debate highlights the challenges facing lawmakers, government officials and policy experts as they grapple with how to regulate complex new technologies such as intrusion and surveillance software and hardware.
Intrusion software includes a range of technologies designed to secretly intercept a person’s activities and communications on computers and smartphones and to gather information such as passwords, screenshots, audio recordings, photos and the contents of chats on apps like Skype. The technologies can also be used to execute commands remotely. Network surveillance tools track the internet usage patterns of large numbers of users, along with information about them.
Although such technologies can be used for malicious or offensive purposes, the efforts to curb their export suggest that regulators didn’t understand the nature of the computer security business, critics say.
Unlike embargoes and sanctions, which prohibit dealing with specific countries or individuals, the proposed restrictions would have forced even individual researchers working on computer security to obtain licenses, they say.
The technologies the Wassenaar agreement tried to restrict “certainly can be used for bad purposes, but cybersecurity tools used by malicious hackers are also used for good purposes by technology companies and developers,” says John Miller, vice president for global cybersecurity and privacy policy at the Information Technology Industry Council, a Washington-based group that represents technology companies. “Export control law usually doesn’t get into making distinctions on what the technology is going to be used for.”
And that’s “one of the reasons it’s difficult to regulate this technology,” Miller says.
While human rights concerns propelled countries to clamp down on technology exports, developers of such software have been able to flout the rules even in countries that already implemented the Wassenaar restrictions, says Eva Galperin, international policy analyst at the Electronic Frontier Foundation, a San Francisco-based nonprofit group that advocates for civil liberties online.
Italy implemented the Wassenaar rules with the goal of stopping Milan-based Hacking Team from selling its intrusion software to repressive regimes. But that “never slowed them down,” Galperin says, citing internal documents obtained by WikiLeaks that she says showed the company continuing to sell the technology freely.
Also, repressive regimes typically use monitoring software to track dissidents living outside their borders rather than those at home, because such governments often control the domestic telecom network and can easily observe activists inside the country without having to break into their computers or smartphones, Galperin says.
Human rights advocates would be better off pressuring repressive regimes through other means than export curbs, which place greater restrictions on organizations like the Electronic Frontier Foundation and on researchers like herself, Galperin says.
Cybersecurity is strengthened when governments, companies and researchers try to penetrate their own computer network defenses and fix holes before an attacker finds and exploits them. Large companies hire so-called “white hat” hackers to find bugs.
Researchers and these hackers routinely collaborate across the globe to share such technology, test networks and quickly minimize damage when a new vulnerability is exposed. But under the Wassenaar agreement, a researcher in a covered country would first have to obtain a license before sharing anything. According to the Commerce Department, an exporter could get a license valid for four years by specifying the countries, technologies and intended uses; without those details, a license might be needed for each transaction.
An overly broad interpretation of the restrictions could hurt legitimate development of tools that are designed to stop or ferret out intrusion software, many respondents told the Commerce Department in reacting to the proposal after it was made public a year ago.
“Unfettered access to mass market and publicly available cybersecurity tools is critical to ensuring that security researchers and practitioners can adequately test systems and harden them to defend against malicious intrusion,” a team of experts from the Center for Democracy and Technology, the Electronic Frontier Foundation, Human Rights Watch and others wrote.
The ability of researchers and professionals to freely share details of attacks and system weaknesses is critical for making fixes, they added.
Only by using what appear to be dangerous and subversive tools can security experts make sure that a computer network is safe and not vulnerable to attack by malicious actors, says Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology, the Washington-based nonprofit that advocates for an open internet.
Unlike the world of physical weapons — say a missile — where one’s own defenses are not shot down or blown up to gauge the effectiveness of the weapon, “in the realm of computer security, tools that we use to test our defenses also are offensive tools in many cases,” Hall says.
Obtaining export licenses can also take many months. Since many computer security researchers are just one- or two-person shops, “they are not going to go through the process to get the export license” in order to share information with each other, says Iain Mulholland, vice president of engineering trust and assurance at VMware Inc., the Palo Alto, Calif.-based software company. “Frankly they are just going to stop sharing this information,” and “it will dry up the flow” of important security vulnerability reports, he says.
The rules could even ensnare professors at U.S. universities teaching computer security, says Hall. If classes about malware include international students, he believes that having those students find weaknesses in a computer network could be considered an export and require a license.
Sergey Bratus, an associate professor of computer science at Dartmouth College, wrote to the Commerce Department to say that the proposed rules would have hurt his ability to work with graduate students from other countries on projects with U.S. industry involving research into weaknesses in their computer networks.
A better alternative to regulating technologies is to “control sales to specific individuals and nation states that have a history of human rights violations,” Hall says. “Rather than define tools, define the buyers.” Syria and North Korea, for example, already face U.S. trade embargoes and international sanctions.
Conventional arms control measures impose regulations on global trade before the weapon or technology in question comes into widespread use. The Wassenaar rules, by contrast, impose controls on something that is already in commerce and circulating in large volume, made by an industry that has not otherwise been subject to extensive export controls, says Barry Hurewitz, a partner at the WilmerHale law firm in Washington and a specialist in export control regulations.
“I’m hard-pressed to point to another modern example” that is similar, he says.
The proposed rules seem to be trying to put the “genie back into the bottle,” says Hurewitz.
When it comes to cybersecurity, export limits are also more problematic because there’s no clear line between defensive systems used for self-protection and offensive systems that can be misused.
The State Department has decided to go back to the Wassenaar group and try to renegotiate the terms of the agreement.
When they agreed to the Wassenaar rules, U.S. officials failed to anticipate the complexity of the regulations for thousands of small and big companies, researchers and universities, Hall says.
U.S. officials who negotiated the rules “didn’t understand the extent to which the tools they sought to control were used in ways that protect things, rather than attack things … and that the method they chose to implement the rules in regulation were particularly onerous for software makers,” he says.
State Department officials declined to comment on the record about the rules.
Implementing the restrictions also proved more cumbersome in the United States than in smaller European countries, which are home to far fewer computer security firms, because of the number, variety and size of American technology companies, others say.
As Mulholland, of VMware, puts it: “Let’s be really blunt, there is one of those 41 countries that has a truly global technology industry of massive economic scale.” That’s the United States, so “the consequences for U.S. companies are much more substantial than for non-U.S. companies.”
“Many of us, frankly, in the community hadn’t been paying attention to export controls,” he says. “We will be paying more attention to it now.”
Alisha Green contributed to this report.