A couple of items I'd like to add: (I'm Founder/CEO of a busy infosec biz)
While infosec is currently a smaller sector in startups than, say, social, casual gaming, or apps, it's growing furiously, and Wassenaar and individual country regs will be top of mind for much of the YC community in the years to follow, because many of you will be in this space.
I'd like to emphasize one of Google's points: "Global companies should be able to share information globally." With existing laws we are already running into limits on where we can hire and what our own internal staff can send us. So it really is critical for any small or large global org that internal comms are not squashed by this.
BIS's comment period ends today, so if you want to take action, now is the time. As in, before COB today.
One last thing: The Hacking Team compromise and stolen data (including the zero-days they were hoarding) couldn't come at a worse time. It is fuel for the argument that zero-days, vulnerabilities, vectors etc should be tightly regulated and it's really thrown some weight behind the argument to close borders (WRT info exchange) rather than open them. So your help is really needed on this if you think you should be able to have open conversations about technical infosec issues with your colleagues in other countries.
> "Global companies should be able to share information globally."
This is an interesting point, and to me it seems to be pitting global companies against individual countries. This makes me feel uneasy because I feel there's a better chance my government is looking out for my best interests than many global companies are. Am I just being paranoid and naive?
That's a really big debate: small vs big government, Keynesian vs Hayekian economics, public vs private sector, and so on. I'd say, from my side, that both government and the private sector have a role in protecting you.
Government tends to excel at things like air traffic control and lighthouses, where there isn't much scope for competition and where competition could do harm. The private sector tends to do things more efficiently when competition is feasible.
So both have a role, and the private sector's role is an important one. If we pass laws that restrict innovation, the risk is that things that would normally sit in the private sector will become government roles and won't be done as efficiently or as effectively as they could be.
I'd say at the rate we're seeing attacks escalate, providing the most effective protection we can in infospace is very important. I'm completely ignoring issues like investment opportunities, job creation and so on because I think the core problem we're all trying to solve is to protect individuals and businesses. We need to get that right first and move from there.
It doesn't have to be about protection. Just about serving your interests. When my country privatized the phone system, the time to get a phone line installed dropped from 3 months to 5 days (because suddenly they were motivated to get it installed as soon as possible so they could start making money).
Welcome to the real world, Neo. If you dig deeper, you will probably find shady Bahamas funds backing companies like that, and you may come to the realization that the government of one unspecified country feeds the company in question with zero-days as a careful provocation, so that it can win a public mandate to build tighter regulations for some future markets. Uhh...
Also, the public image of software security research needs to be decriminalised, and that will take some serious PR effort backed by a solid industry association or stakeholder. Otherwise the big fish will PR all the small fish down into a regulation net woven by the big fish themselves.
I'm not sure the Hacking Team incident is the worst thing possible. Hacking Team was following the rules; they had export licenses. It just shows that the rules don't hurt bad players.
If you choose to comment, it's helpful to understand the current law and how it works. That way it's easier to know what to ask for.
2) If you use software that may be controlled (for example, pen testing software), this issue affects you. Here are a few suggestions of things you could put in your comment:
3) As of last night, there were 101 public comments posted. Most were far below the quality you would hope for in a good discussion on HN, let alone in a note to a policy maker requesting a change. If you have an interest in this area and have something constructive to suggest, the public comment process is your opportunity to do it.
>You should never need a license when you report a bug to get it fixed
That hampers people who wish to publicly disclose or sell vulnerability information. This is massively biased in favour of software companies (like, to some extent, Google). You should never need a license to disclose vulnerability information, full stop.
>Global companies should be able to share information globally
Why should this be limited to the employees of a company?
(The FAQ also states that information about vulnerabilities, as opposed to how to exploit them, would not be controlled, but I believe Google if they say the legalese is insufficient to establish this.)
I agree that defining boundaries rigidly in terms of companies would be limiting in today's world and especially in infosec.
In general, though, I personally really despise the practice of selling vulnerabilities for the purpose of enabling people to attack others with them, which in practice means selling them to anyone but the vendor or intermediary organizations like ZDI. True, there are so many ways for this to go wrong... but I cannot join those infosec people who blast any regulation of the industry as inherently harmful, an infringement of freedom of speech, useless against the real bad guys, and so on. I hesitate to even think in terms of things like 'increased threats' or 'acceptable infringement', or to oppose 'absolutist thinking', considering how harmful such ideology has been in other, quite different but analogous realms (surveillance, airport security), and I have little faith in the ability of a government so hyped up about "cyber" threats to avoid serious blunders. Even so, I simply cannot bring myself to find acceptable the current almost total lack of regulation in infosec, which you hint at in saying a license should never be required to share information.
There is a very simple reason why "regulation" in this space will never do anything useful. The cost of entry is very low (an individual can find vulnerabilities without any specialized infrastructure or organizational backing), and the value of vulnerabilities is high. It's basically the war on drugs if drugs could be transferred over the internet. Worse, the majority of the offenders are not in your jurisdiction to begin with, and they never have been or will be.
It doesn't matter what law you pass, you will not stop this from happening. But just because passing laws can't do any good doesn't mean it can't do any bad. Bad laws can still do plenty of harm to the good guys.
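To make the "low cost of entry" point concrete: here is a minimal sketch of the kind of dumb mutation fuzzer an individual can run on a laptop with no specialized infrastructure or organizational backing. The target binary and seed file names are hypothetical placeholders, not real tools.

    import random
    import subprocess

    # Hypothetical target: any local program that parses a file, invoked as
    # "./parse_image <file>". Both names below are illustrative placeholders.
    TARGET = "./parse_image"
    SEED_FILE = "seed.png"

    def mutate(data: bytes) -> bytes:
        """Flip a handful of random bytes in the seed input."""
        out = bytearray(data)
        for _ in range(random.randint(1, 8)):
            out[random.randrange(len(out))] = random.randrange(256)
        return bytes(out)

    def main() -> None:
        seed = open(SEED_FILE, "rb").read()
        for i in range(100_000):
            candidate = mutate(seed)
            with open("testcase.bin", "wb") as f:
                f.write(candidate)
            result = subprocess.run([TARGET, "testcase.bin"],
                                    stdout=subprocess.DEVNULL,
                                    stderr=subprocess.DEVNULL)
            # A negative return code means the parser was killed by a signal
            # (e.g. SIGSEGV), i.e. a potential memory-safety bug worth triaging.
            if result.returncode < 0:
                with open("crash_%d.bin" % i, "wb") as f:
                    f.write(candidate)
                print("crash found on iteration", i)

    if __name__ == "__main__":
        main()

Nothing in that loop depends on jurisdiction, capital, or organizational backing, which is exactly the economics any export rule is up against.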
The best thing the government could do in this context is to be the highest bidder and then immediately disclose the vulnerabilities to the vendors.
As much as I would like the government to make such bids, I don't agree regulation is useless. Sure, no policy can completely prevent zero days from being sold - in fact, this particular policy doesn't even try; it just limits who you can sell them to. But if that means that organizations and individuals who wish to remain respectable and avoid any trouble with the law, however unlikely it is to be enforceable in practice, limit their trade to quite nefarious actors rather than extremely nefarious ones... it's better than nothing.
edit: that is, it's better than nothing if it avoids harming the good guys too much. As I said, I am skeptical of many of the critical comments that have been made, though I do buy Google's, so I hope the rule will be amended. Argh, I'm too tired to express myself properly.
> that is, it's better than nothing if it avoids harming the good guys too much
In theory there is an ideal rule with ideal enforcement that will cause less trouble than it prevents. But as Yogi Berra once said, in theory there is no difference between theory and practice; in practice there is.
Here's an example of a serious problem this actually causes. Suppose Nefaristan is on the list of places nobody can sell to. The evil government of Nefaristan will just send an operative to Jordan or Saudi Arabia or whatever nominally less nefarious place didn't make the list, and buy their exploits there. So either way the evil government of Nefaristan will have embargoed exploits to use against their domestic dissidents. The dissidents need the embargoed patch right away or they'll be found out and executed. But now the stupid law prohibits anyone from giving it to them because they're in Nefaristan.
It's difficult to imagine how a law could fail harder than "helps bad guys send good guys to death camps" -- but here we are.
Causing serious harm is not better than doing nothing.
Having read the definitions of what is controlled in the proposed rule, I'm pretty confident a patch wouldn't come close. And in any case, since the rules don't apply to public software, that only matters in the case of private patches, which aren't really a thing, and would be a pretty big moral hazard if they were.
Most patches inherently reveal the vulnerability they fix. Patches not being controlled would be a loophole big enough to fit a whole planet through.
And private patches are a thing. Vendors often distribute an early version of the patch to major customers for validation testing.
Or if you like, replace "patch" with vulnerability information that enables a workaround. You can defeat Heartbleed by turning off TLS heartbeat support, but that information is enough to quickly reverse-engineer the vulnerability.
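As a toy illustration of why a fix tends to give the bug away, here is a hypothetical Heartbleed-style handler sketched in Python. The real bug was in OpenSSL's C code; this is only a loose model for the sake of the argument, and all names in it are invented.

    # Hypothetical echo service whose buffer also holds other sessions' data.
    PROCESS_MEMORY = bytearray(b"...other sessions' secrets live here...") * 100

    def handle_heartbeat(payload: bytes, claimed_len: int) -> bytes:
        buf = bytearray(PROCESS_MEMORY)
        buf[:len(payload)] = payload
        # VULNERABLE: trusts claimed_len, so a short payload with a large
        # claimed_len echoes back whatever else happens to sit in the buffer.
        return bytes(buf[:claimed_len])

    def handle_heartbeat_patched(payload: bytes, claimed_len: int) -> bytes:
        # The one-line fix is also a precise description of the vulnerability:
        # "we used to echo claimed_len bytes without checking it against the
        # actual payload length".
        if claimed_len > len(payload):
            return b""
        return payload[:claimed_len]

Anyone diffing the two versions learns exactly where the overread was and how to trigger it, which is the sense in which the patch and the vulnerability are the same information.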
The actual proposal is dozens of pages of legalese that would take a team of lawyers a week to decipher. I have no idea what it says because it is totally incomprehensible.
That's half the problem. If you're AT&T or Google you can hire said team of lawyers to tell you what it says, but what is an individual graduate student or security consultant supposed to do?
The other half of the problem is that what it says doesn't change the outcome, because the insolubility of the issue comes from economics rather than policy. There is no policy that will keep vulnerability information out of the hands of the bad guys only, because there is no practical way for most people to even identify who the bad guys are.
> There is a very simple reason why "regulation" in this space will never do anything useful. The cost of entry is very low (an individual can find vulnerabilities without any specialized infrastructure or organizational backing), and the value of vulnerabilities is high.
The cost of entry is not low though. To find a bug is one thing; to build a functioning exploit and associated payloads to weaponise it takes a team of engineers.
At the very least, we can prevent the likes of Hacking Team and Gamma Group from operating legally in Western countries.
> The best thing the government could do in this context is to be the highest bidder and then immediately disclose the vulnerabilities to the vendors.
> The cost of entry is not low though. To find a bug is one thing; to build a functioning exploit and associated payloads to weaponise it takes a team of engineers.
"Team of engineers" is a bit of an overstatement. It's well within the capacity of an individual engineer. And the payload doesn't really change based on the exploit anyway; different RCE vulnerabilities are essentially fungible.
> At the very least, we can prevent the likes of Hacking Team and Gamma Group from operating legally in Western countries.
How is that even useful? If it's going to happen anyway then you want it to happen in the open so you at least know what is happening. Push it underground or into places like Russia where you have limited visibility and it only makes it harder to catch the real bad guys.
> We can only dream.
I don't understand why they aren't already doing that. It's essentially a publicly-funded bug bounty program. The only disadvantage at all is that it costs money, and that's a pretty dumb excuse if this is half the problem they're making it out to be.
It could, though. Right now, we in the US pro-gun camp are fighting some newly proposed ITAR regulations which, while probably aimed at the 3D/80% AR-15 receiver camps, would criminalize unlicensed disclosure of the sorts of basic data about weapons that's been freely published since forever.
See e.g. this: http://weaponsman.com/?p=23824 Note the relatively innocuous data discussed (inner and outer machine gun barrel heating), then skip to the bottom, "A Note To Our Readers"; these guys have consulted their lawyer on the matter. And see the NRA's (the lobby for gun owners) take on it, which seems to be accurate: https://www.nraila.org/articles/20150605/stop-obamas-planned...
I find the idea that I can write a weapon on my computer laughable, and I can't see how the hacker community can justify attacks on freedom of speech. This is pre-emptive censorship and, as such, self-evidently and entirely morally wrong.
In fact, this is a point I am curious about. IANAL, but the case of Bernstein v. United States, while mooted before it could come to a true conclusion, seems to demonstrate that there is legal plausibility to the argument that publishing source code of "dangerous" systems is protected by the First Amendment. While cryptography is less immediately harmful than some of the things prohibited in this case, the circumstances still seem pretty similar to me; and if these rules were overturned on such grounds, I cannot say I would be all that upset, as such a decision would likely also imply protection for a lot of stuff I like more than this.
That said, when you consider things like Stuxnet, which physically destroyed industrial facilities, I'd say the idea that malware can be a weapon is harder to dismiss these days than in the past. Admittedly, most zero-day exploits do not have so close an analogy, but in the wrong hands they can certainly help put people in physical danger.
In any case, surveillance is hardly something the 'hacker community' is thrilled about - as per the zeitgeist in this forum, at least...
But Stuxnet harmed nobody and nothing until it was hooked into physical hardware. This is my main point: I can't actually change anything in the real world unless I am controlling some hardware, but then the situation is the same as if I were operating it directly.
I agree with you that surveillance is unpopular among hackers, but I am old enough to remember when censorship was considered at least as bad, and I would like it to stay that way.
Yes, we need a war on warez to go with our wars on terror and drugs, which have been rousing successes, having driven both of those social ills to extinction.
The war is on and we are losing. Yesterday a 56-year-old bricky from Watford called me in a panic. His old XP laptop had literally been taken hostage, demanding that he call some paid phone number to unlock the computer again.
My point, since you missed it, is that instituting a "war on warez" in the style of the "war on drugs" or the "war on terror" is a joke, and should be disregarded as just the talking point of someone with a different idea trying to sell you something.
Are these governments completely insane? Not allowing security research or even reporting bugs without getting a license is the stupidest thing I've heard in my lifetime. The internet cannot be regulated in this fashion without destroying it completely. The world is not a collection of islands; we are all in this together, and letting any government stand in the way of safety and security is insane.
On top of all that it seems very likely that such licenses will be used to excuse what should be crimes against humanity, in the form of creating exploits and related software to enable hunting down dissidents.
I don't believe this was intentional. Wassenaar is an arms control agreement - the intention was probably to regulate the sales of weaponised exploits, not basic research. It's an overreach.
The problem I have with this sort of arrangement is that the more rules they put in place on lawful disclosure, the more they hamper the good guys who follow the laws and the more of a comparative advantage that gives to the bad guys who can simply ignore the laws.
We really don't want to hand the bad guys any more advantages than they already have, no matter how good our intentions are.
If anyone wants some more background on all the negative side-effects of the current regulation, I wrote a lengthy blog post on the problems with the current phrasing of the Wassenaar amendments here:
(Background: I am a security researcher who designed industry-standard patch analysis algorithms / software, built algorithms & a startup for malware reverse engineering that was acquired by Google, pioneered several exploitation techniques, and worked heavily on the recent Rowhammer vulnerability)
> Global companies should be able to share information globally. If we have information about intrusion software, we should be able to share that with our engineers, no matter where they physically sit.
This statement goes through just as well when applied to missiles, nuclear engineering knowledge, bio-weapons knowledge, etc.
Governments have decided that they wish to use commercial entities as a proxy method for protecting the status quo. If Google wish to challenge that policy then, well, OK. But there is no reasonable argument for making a special exception for "cyber security" over other forms of security-related engineering.
The key difference between those technologies and this one is that industrial production hardware (centrifuges, specialized biological equipment, etc.) is needed in addition to specialized knowledge of the topic at hand. These additional requirements make export controls a lot more enforceable, and are why you can, say, send regulators to the site of a nuclear engineering facility and get a reasonable answer about whether they have sent nuclear materials to another country.
With infosec this would be basically impossible, since the specialized knowledge and the equipment are both non-physical and intimately entwined.
Well, if people all over the world were under a constant daily threat of getting seriously ill from bio-weapons used by criminals and governments worldwide (even against their own people), as is the current global situation in software security, then yes, I would very much make that same statement.
I would even argue that, especially in the case of bio-weapons, it would be a moral imperative to spread knowledge that could cure people.
Remember that (going back to the cybersecurity analogy) the criminals already have the weapons, as well as the knowledge and capability to develop completely novel weapons from scratch (they might even be better at it than the US).
For missiles the argument doesn't really hold, because knowledge of how to detect and protect oneself from missile attacks neither requires, nor strictly includes knowledge of how to actually build and use said missile.
So the two big reasonable arguments are:
- cyber attacks are already widespread, including global criminal organisations targeting civilians
- both cyber offence and cyber defence are enabled by the same knowledge of cyber security
No, exploits have the unique property that the "attack" and "defence" information are mirror images of one another.
Also, deployment of nuclear and bio weapons against civilians is against international law, whereas western governments seem to have chosen to deploy offensive hacking themselves rather than attempt to get it banned internationally.
Global companies can go screw themselves, Google in particular. "People should be able to share information globally." is the motto everyone, including global companies, should rally behind. The crypto export ban is an apt analogy: it did hinder American companies and benefit everyone else. A good thing, too, when the NSA started monkeying around with crypto primitives and national security letters. Anything that slows down American tech behemoths and gives room for European alternatives is a good thing.
Previous export restrictions on crypto and certificates gave non-US participants (Mark Shuttleworth) the opportunity to take up the slack and build a business where there were no such restrictions. He simply had to fulfill the demand that existed outside of the US in large part due to export restrictions.
The Wassenaar Arrangement is likely to result in similar unintended effects combined with a similar lack of intended effects.
The UK and Australia have already implemented part or all of these agreements. Does Google already hold licenses in these jurisdictions? If we report a security bug to a Googler in these (or other similarly restricted jurisdictions) will they be able to share security bug details with their overseas Googler colleagues?
If I understand correctly, the UK and Australian implementations are much narrower, and much less problematic than the proposed US implementation. So presumably licenses will not be necessary in those other countries.
It's probably worth remembering that there is an open source exemption. If a piece of 'intrusion software' has been published the export controls no longer apply.
Publish your exploits to github before you send them overseas or travel to the conference to announce them.
> Isn't the right to bear arms an 'inalienable right'?
Not in any American sense of the term. The "unalienable" rights were to life, liberty, and the pursuit of happiness as outlined in the Declaration of Independence. Selling guns to redcoats isn't on that list.
For better or worse, right now the right to bear arms only holds in your house, outside of the 7th Circuit (Illinois, Indiana and Wisconsin), with litigation in the 9th Circuit (the west, specifically California and Hawaii) and D.C. Circuit in progress, and if there are adverse results in those Circuits we're almost positive the Supremes will continue to deny cert.
We in the US pro-gun camp consider self-defense to be an unalienable right (see e.g. the U.K. for a notorious counterexample), but how that applies to exploits is not to my eye simple.
Inalienable rights are human rights - which extend (at least in theory) to foreigners.
I read the story.
But anyway, if the Supreme Court ruling from Zimmerman holds, it would apply equally well to everything in the article. Of course, the Zimmerman case was about foreign exports as well.