Many Eyes and Security Incentives
Posted by Craig H on 1 April 2009
[sorry this ended up being so long, I couldn’t see a good way to split it into smaller posts!]
I am often asked whether I think that publishing the complete source code to the Symbian Platform will result in more security vulnerabilities being exploited by the “bad guys” (Internet fraudsters, malware writers, software pirates, etc.).
The short answer to that is: No. I’m confident that the advantages of collaborative open source development will more than outweigh any disadvantages of potential attackers getting easier access to the implementation details of the Symbian OS security mechanisms. There is however a longer answer explaining why I think that, which is what I’d like to share in this blog post.
Professor Ross Anderson was probably the first to consider this question with academic rigour, back in June 2002. A later version of his paper appears in the 2005 book Perspectives on Free and Open Source Software (free to download) and the chapter title neatly sums up his essential conclusion: “Open and Closed Systems are Equivalent (That is, in an Ideal World)”.
It’s very important to note the “in an Ideal World” part of that – most of the chapter is devoted to considering ways in which aspects of real projects can vary from that ideal and thus tip the balance in favour of attackers or defenders. Many of these imbalances relate to the relative incentives of attackers and defenders to discover security bugs.
I’ll return to the question of incentives in a moment, but first I want to address the “many eyes” aspect. Eric Raymond, in his seminal book The Cathedral and the Bazaar, wrote “Given enough eyeballs, all bugs are shallow”. The context is his chapter entitled “Release Early, Release Often”, and his point is that bug fixing can be more effective when it benefits from the network effects of a broad open community, rather than relying on a smaller team of dedicated testers.
For security bugs there is a converse to this, however: the source code will also be available to a community of bad guys, who want to find bugs in order to exploit them rather than fix them. I think we shouldn’t worry too much about that, though, for two reasons. First, based on the sophistication of exploits we have already seen, we believe that some of the bad guys already have the source code (Symbian OS source code has already been shared with dozens of licensees, and it only takes one discontented employee to have leaked it). Second, I am sure that, if they have the right incentives, we can have many more good guys than bad guys looking at our source code.
So, what can the Symbian Foundation do to make sure the right incentives are in place for our community? I have my own ideas, which I will lay out here, but I want to emphasise that we are very much open to suggestions. You, reading this, are part of that community, so we want to hear from you what incentives you think would be effective. Incentives (for the bad guys or the good guys) could be financial, but given that this is an open source project, non-monetary or social incentives will probably be more feasible for us to implement!
Let’s consider the bad guys. Can the foundation do anything to reduce the incentive of the bad guys to exploit any security bugs that they find?
Thus far, I haven’t seen or had many good ideas on this front. I will note, though, that most of the exploits we have seen don’t seem to have been developed for financial gain. There has been some malware (using legitimate APIs, but with social engineering techniques duping users into installing it) attempting to use premium rate SMS messages to fraudulently charge people (I think RedBrowser was the first malware of this type that we saw), but these attempts have so far proved ineffective. Many network operators withhold revenue from premium rate numbers for a period of time (30 days in the UK), which should be long enough for the fraud to be discovered. The network operators will also be able to trace who the revenue is being paid to, which is a further deterrent.
The one thing I can see reducing the bad guys’ incentive is for us to make sure that, once a vulnerability is discovered, vulnerable phones are patched as soon as possible, so we minimise the window of time in which the vulnerability can be exploited. The “us” in that sentence has to be the community, by the way, not just the foundation. The foundation can provide the tools (bug tracking, source control and configuration management) to make sure a fix is created and committed to the source tree as soon as possible, but it doesn’t have the ability to deliver a binary patch to phones. There is a lively discussion of how software upgrades should work over on the main Symbian Foundation blog, so I won’t go into that further here.
Now, what about the good guys? How can we encourage the good guys to find, report and fix security bugs in the Symbian Platform?
The late, lamented Professor Roger Needham observed that looking for security bugs isn’t widely regarded as interesting work (except in high-profile security code such as crypto algorithms): “Many security bugs are in out-of-the-way and unglamorous pieces of code, and unless you’re being paid to look at it why should you.” (from a paper available online). Taken at face value, this is undoubtedly true, and yet there are open source projects that have successfully benefited from volunteers finding and fixing security bugs (I consider OpenBSD to be the canonical example).
So, given that finding and fixing security bugs isn’t in itself a rewarding activity, why do people do it, and what can we do to encourage them to do it more?
One reason, I suggest, is a simple social incentive: recognition and approval. Anyone contributing any bug fixes to the Symbian Platform will get recognition, but I want to make sure that we promptly, publicly and prominently acknowledge the contributors of security bug fixes, and, if I have anything to do with it, fast-track them to committer status.
Another reason might be the sense of achievement from making a difference to the daily lives of hundreds of millions of ordinary people (I know this one works for me :-)). Over 250 million Symbian-based devices have shipped so far, and that’s just for starters! Maybe one bug fix only makes a small difference, but that’s a small difference multiplied by a heck of a lot of people.
Lastly (for now, I have to stop typing this post at some point!) there is the possibility of financial rewards for finding security bugs. We are considering a possible security bug bounty, similar to that offered by the Mozilla Foundation, and I’d be interested to hear what people think about that. Somewhat controversially, it is also possible to sell “zero day vulnerability” information to companies that specialise in providing security advisories to paying customers, prior to public disclosure of the vulnerability. One of those companies, TippingPoint, recently sponsored a competition, Pwn2Own, at the CanSecWest security conference; they offered a prize of $10,000 for the first person to break into any of five smartphones (all with different OSes). Interestingly, no one succeeded. I would like to speculate on why, but that can be a topic for another blog post 🙂
These are just my ideas, and I’m quite sure I haven’t thought of everything. Do please let me know what you think in comments to this post – and thanks for reading this far!