Franklin Heath Ltd

Master Your Information Assets

Many Eyes and Security Incentives

Posted by Craig H on 1 April 2009

[sorry this ended up being so long, I couldn’t see a good way to split it into smaller posts!]

I am often asked whether I think that publishing the complete source code to the Symbian Platform will result in more security vulnerabilities being exploited by the “bad guys” (Internet fraudsters, malware writers, software pirates, etc.).

The short answer to that is: No. I’m confident that the advantages of collaborative open source development will more than outweigh any disadvantages of potential attackers getting easier access to the implementation details of the Symbian OS security mechanisms. There is however a longer answer explaining why I think that, which is what I’d like to share in this blog post.

Professor Ross Anderson was probably the first to consider this question with academic rigour, back in June 2002. A later version of his paper appears in the 2005 book Perspectives on Free and Open Source Software (free to download), and the chapter title neatly sums up his essential conclusion: “Open and Closed Systems are Equivalent (That is, in an Ideal World)”.

It’s very important to note the “in an Ideal World” part of that – most of the chapter is devoted to considering ways in which aspects of real projects can vary from that ideal and thus tip the balance in favour of attackers or defenders. Many of these imbalances relate to the relative incentives of attackers and defenders to discover security bugs.

I’ll return to the question of incentives in a moment, but first I want to address the “many eyes” aspect. Eric Raymond, in his seminal book The Cathedral and the Bazaar, wrote “Given enough eyeballs, all bugs are shallow”.  The context is his chapter entitled “Release Early, Release Often”, and his point is that bug fixing can be more effective when it benefits from the network effects of a broad open community, rather than relying on a smaller team of dedicated testers.

For security bugs there is a converse of this, however, which is that the source code will also be available to a community of bad guys, who want to find bugs in order to exploit them rather than to fix them.  I think we shouldn’t worry too much about that, though, for two reasons.  First, based on the sophistication of exploits we have already seen, we believe that some of the bad guys already have the source code (Symbian OS source code has already been shared with dozens of licensees, and it only takes one discontented employee to have leaked it).  Second, I am sure that, if they have the right incentives, we can have many more good guys than bad guys looking at our source code.

So, what can the Symbian Foundation do to make sure the right incentives are in place for our community?  I have my own ideas, which I will lay out here, but I want to emphasise that we are very much open to suggestions.  You, reading this, are part of that community, so we want to hear from you what incentives you think would be effective.  Incentives (for the bad guys or the good guys) could be financial, but given that this is an open source project, non-monetary or social incentives will probably be more feasible for us to implement!

Let’s consider the bad guys.  Can the foundation do anything to reduce the incentive of the bad guys to exploit any security bugs that they find?

Thus far, I haven’t seen or had many good ideas on this front.  I will note, though, that most of the exploits we have seen don’t seem to have been developed for financial gain.  There has been some malware (using legitimate APIs, but with social engineering techniques duping users into installing it) attempting to use premium rate SMS messages to fraudulently charge people (I think RedBrowser was the first malware of this type that we saw), but these attempts have so far proved ineffective.  Many network operators withhold revenue from premium rate numbers for a period of time (30 days in the UK), which should be long enough for the fraud to be discovered.  The network operators can also trace who the revenue is being paid to, which is a deterrent.

The one thing I can see reducing the bad guys’ incentive is for us to make sure that, once a vulnerability is discovered, vulnerable phones are patched as soon as possible, so we minimise the window of time in which the vulnerability can be exploited.  The “us” in that sentence has to be the community, by the way, not just the foundation.  The foundation can provide the tools (bug tracking, source control and configuration management) to make sure a fix is created and committed to the source tree as soon as possible, but it doesn’t have the ability to deliver a binary patch to phones.  There is a lively discussion of how software upgrades should work over on the main Symbian Foundation blog, so I won’t go into that further here.

Now, what about the good guys?  How can we encourage the good guys to find, report and fix security bugs in the Symbian Platform?

The late, lamented Professor Roger Needham observed that looking for security bugs isn’t widely regarded as interesting work (except in high-profile security code such as crypto algorithms): “Many security bugs are in out-of-the-way and unglamorous pieces of code, and unless you’re being paid to look at it why should you.” (from a paper available online).  Taken at face value, this is undoubtedly true, and yet there are open source projects that have successfully benefited from volunteers finding and fixing security bugs (I consider OpenBSD to be the canonical example).

So, given that finding and fixing security bugs isn’t in itself a rewarding activity, why do people do it, and what can we do to encourage them to do it more?

One reason, I suggest, is a simple social incentive: recognition and approval.  Anyone contributing any bug fixes to the Symbian Platform will get recognition, but I want to make sure that we promptly, publicly and prominently acknowledge the contributors of security bug fixes, and, if I have anything to do with it, fast-track them to committer status.

Another reason might be the sense of achievement from making a difference to the daily lives of hundreds of millions of ordinary people (I know this one works for me :-)).  Over 250 million Symbian-based devices have shipped so far, and that’s just for starters!  Maybe one bug fix only makes a small difference, but that’s a small difference multiplied by a heck of a lot of people.

Lastly (for now, I have to stop typing this post at some point!) there is the possibility of financial rewards for finding security bugs.  We are considering a security bug bounty, similar to the one offered by the Mozilla Foundation, and I’d be interested to hear what people think about that.  Somewhat controversially, it is also possible to sell “zero day vulnerability” information to companies that specialise in providing security advisories to paying customers, prior to public disclosure of the vulnerability.  One of those companies, Tipping Point, recently sponsored a competition, Pwn2Own, at the CanSecWest security conference; they offered a prize of $10,000 for the first person to break into any of five smartphones (all running different OSes).  Interestingly, no one succeeded.  I would like to speculate on why, but that can be a topic for another blog post 🙂

These are just my ideas, and I’m quite sure I haven’t thought of everything.  Do please let me know what you think in comments to this post – and thanks for reading this far!

2 Responses to “Many Eyes and Security Incentives”

  1. avs said

    From the relying party and reactive viewpoint:

    If you’re using some lesser-known component with only a small community behind it, perhaps just a single developer, you really have no reasonable expectation of an SLA. If, at the same time, you are shipping the code out in a gazillion consumer products, you really need to manage this risk and be prepared to fix things yourself if needed. Yes, the same issue comes up with closed source provided by a bankrupt company (perhaps even more problematic due to code escrow costs, if you have no access to source). But the risk of an “as is” clause really needs to be mitigated.

    Another thing you have to recognise when using OSS is that you have less say on what sort of vulnerability disclosure process there is. The relying party needs to adjust to the OSS project’s embargo periods and disclosure policies. Fair game, yes, but this just needs to be understood when you decide to base your product on such code.

    Anyway, if you look at this from the OSS projects’ point of view, it’s effectively a question of how much free lunch you want to (and are able to) offer to your relying parties.

  2. Craig H said

    @avs, yes, you raise some good points. As an OSS project, there won’t be any SLA on fixing Symbian Platform bugs (security or otherwise) from the foundation. There will however be an effective bug tracking system, and someone (me!) who can encourage the package owners to prioritise security bugs.

    I agree with you that, worst case, there could be a package with only a single developer (although I’m sure that will be rare, if it happens at all) but there is still at least one identifiable, responsible person for each package, so it’s not as if no one will care.

    For relying parties that really need a more concrete service level than this “best effort”, I think they have two options. One is to set up a contract with a third-party service provider (I expect some of the Symbian Competence Centres may be in this business in future) to deliver bug fixes with an SLA, and the other is to commit some of their own engineers to working on the Symbian Platform. Either way I hope these bug fixes will be tracked in the foundation’s bug tracker, and that fixes make it back into the open source repository.

    On the vulnerability disclosure process, I hold my hands up and admit we don’t have one yet. My thoughts at this point are that security bugs will be tracked in the normal bug tracking tool, but only visible to the owners and committers of the affected package(s). I would be interested to hear others’ thoughts on this.

    Lastly, on “how much free lunch”. That’s an ever-present risk of OSS projects, but, on the foundation’s part, we know that we’re not a charity, and people and companies will only contribute if it’s in their own interests (hence my focusing on incentives in the first place!). On the community’s part, we need to recognise that the foundation doesn’t have any product development engineers. If you sit there and wait for stuff you want to happen, it’s probably not going to, so get stuck in!
