Smartphone Apps, Cryptography and Export Controls
Posted by Craig H on 15 January 2012
You can’t work in software product security for as long as I have and not learn something about export controls, like it or not! Historically, many governments regarded encryption as military technology and defined and controlled it as such in their regulations. These days, pretty much anyone who uses the Internet or a mobile phone (and that’s more than 2/3 of the world’s entire population) uses encryption every day, for shopping on the web, logging in to social networks, or simply to call their friends. Nevertheless, export control regulations for encryption are still on the statute books of most countries around the world, and could still be enforced. The UK records of export control prosecutions and fines don’t include any relating to encryption technology in recent years; I would be interested to know if there have been any elsewhere.
Although I have sat in many export control meetings with lawyers over the last twenty-some years, I have to point out that I am not a lawyer, and this is not legal advice. I just thought it might interest others if I shared my thinking on the current export control regimes, as I’m now in the situation of needing to consider them (again): we want to publish an Android app that contains cryptographic technology (a simulation of a World War II Enigma machine, more on this soon…).
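To give a flavour of what such an app involves, here is a heavily simplified, illustrative sketch in Python (not our actual app code): a single rotor plus reflector, with no ring settings, no plugboard, and only one rotor instead of three. The wiring tables are the historical Rotor I and Reflector B.

```python
# A toy, single-rotor Enigma-style cipher. Illustration only, not the app's
# actual implementation: no ring settings, no plugboard, one rotor of three.
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"      # historical Rotor I wiring
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"  # historical Reflector B wiring

def encipher(text, start=0):
    """Encipher (or, identically, decipher) an A-Z string."""
    pos = start
    out = []
    for ch in text:
        pos = (pos + 1) % 26                        # rotor steps before each letter
        c = ord(ch) - 65
        c = (ord(ROTOR[(c + pos) % 26]) - 65 - pos) % 26        # forward through rotor
        c = ord(REFLECTOR[c]) - 65                              # bounce off reflector
        c = (ROTOR.index(chr((c + pos) % 26 + 65)) - pos) % 26  # back through rotor
        out.append(chr(c + 65))
    return "".join(out)
```

Because the reflector is its own inverse, the cipher is self-reciprocal: enciphering the ciphertext again with the same start position returns the plaintext, which is exactly why a real Enigma needed no separate decrypt mode.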
The main things I’ve learned about export controls on cryptography are that common sense often doesn’t apply and nothing is ever simple.
The first complication is: what counts as export? Especially in these days of cloud computing, national boundaries are effectively invisible; I’m sitting in the UK typing this, WordPress is automatically saving drafts of the article, and I have absolutely no clue which country it’s storing them in. (IP2Location says the current IP addresses for wordpress.com are in Texas and New York, but that’s just the front end; it could change at any time and their database could be anywhere…) According to the US regulations, the nationality of the recipient also counts, even if they are physically present in the exporting country. I’m going to make my first dangerous leap of logic here, and assume that, for the purposes of publishing a smartphone app, what matters is the nationality of the author (where the company is registered, if it’s a company) and the nationality of the distributor (e.g. Google, Apple or Nokia), and that every download is a possible export (as there’s no way of knowing the nationality of the downloader).
So, our company is registered in the UK and Google is registered in the US, so we have to worry about UK export regulations and (as Google helpfully points out) US export regulations. The UK regulations presumably govern the upload from my development machine to the Android Marketplace, and the US regulations presumably govern the download from the Android Marketplace to the user’s phone. So far so good.
The second complication is: how can we tell if our app falls under these regulations? 20 years ago it was a simple decision: does this product use any encryption or doesn’t it? Today there are many, many smartphone apps that contain or invoke encryption code; anything that uses the HTTPS protocol to talk to a server, or anything that sends an SMS would do. Clearly the vast majority of these apps aren’t considered export controlled, and the reason is that the regulations now have long and complicated lists of exceptions.
The UK and the US are participants in the Wassenaar Arrangement along with 38 other countries. This defines best practices and guidelines for national export control legislation, and includes a control list with a six-page definition of “Information Security” goods and technologies (Category 5, Part 2, starting on page 83 of the December 2011 version). Section 5.D.2 covers software, so that’s the part we’re interested in. “Note 3” on page 83 says 5.D.2 does not apply if the product is “generally available to the public…” (along with a few other conditions that are clearly satisfied by apps distributed in a public app store; I won’t quote it all). At this point you might think: great, we’re home and dry! But…
The third complication is that each country must enact its own regulations based on the Wassenaar recommendations; each puts its own spin on them, and they may be based on different revisions. A brief history of the Wassenaar control list Category 5, Part 2, follows:
1996: Initial version defining what is controlled, with relatively few exceptions (notable ones are use specifically for authentication or banking).
1998: Adds exceptions for personal export for your own use (“Note 2”) and for items generally available to the public (“Note 3”). However, the note 3 exception specifically doesn’t exempt items using symmetric keys longer than 64 bits.
2000: The note 3 exception no longer depends on the key length.
2009: Introduces a new exception (“Note 4”) relating to the primary purpose of the item. As long as the primary purpose isn’t information security, or providing a platform for other components to do information security, you’re not controlled.
With this in mind, we need to look at the specific regulations in place in the UK and US. The UK equivalent to the Wassenaar Arrangement control list is the Consolidated UK Strategic Export Control Lists and the part defining Information Security Software in the current (August 2010) version is Category 5, Part 2 on page 184. This is copied verbatim from the EU Dual Use List of May 2009 (page 167) and that uses the wording from the 2000 version of the Wassenaar control list. As this includes note 3 without the restriction on key length, our generally available smartphone app is not export controlled. Tick!
Now for the US regulations. The US Commerce Control List was last updated in June 2010, and seems to be based on the wording of the 2009 Wassenaar control list. However, there is a qualification to note 3 stating you “must submit a classification request or encryption registration to BIS” if your generally available item uses symmetric keys longer than 64 bits, etc. (a bit of a throwback to the 1998 version of the Wassenaar list). Of course this doesn’t say anything about Enigma machines, and who knows how long they think the keys for those are? (I was intrigued so I looked into that here). Luckily, the US regulations do include note 4 from 2009, and that means our Enigma simulator is not export controlled because its primary purpose is not Information Security (it’s education and entertainment). Phew!
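For the curious, an effective “key length” for a three-rotor Wehrmacht Enigma can be estimated with a few lines of arithmetic. Counting conventions vary (ring settings, rotor boxes and plugboard cable counts all differ between sources), so treat this as a back-of-the-envelope sketch for the common setup: three rotors chosen in order from a box of five, any starting positions, and ten plugboard cables.

```python
import math

rotor_orders = 5 * 4 * 3       # 3 rotors chosen, in order, from a box of 5
start_positions = 26 ** 3      # initial window position of each rotor
# 10 plugboard cables: ways to choose 10 disjoint letter pairs from 26 letters
plugboard = math.factorial(26) // (math.factorial(6) * math.factorial(10) * 2**10)

keyspace = rotor_orders * start_positions * plugboard
print(f"{keyspace} keys, about {math.log2(keyspace):.0f} bits")
```

By this naive count the keyspace is about 1.6 × 10²⁰, a little over 2⁶⁴, so it would fall the wrong side of a 64-bit symmetric key threshold; whether a regulator would apply that sort of arithmetic to a 1940s electro-mechanical machine is anyone’s guess.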
So we’re in the clear, but what lesson can be drawn from this? I’d say the chances are that, whatever your app does, if you publish it on an app store it won’t be export controlled; even so, you do need to think about it, and shouldn’t just click past the “I comply with US export controls” box. If your app’s purpose is primarily information security (file encryption, encrypted email, that sort of thing) then you may well be required to file with the US Bureau of Industry and Security in order to be compliant.