I wanted to link to this post on CFR where I wrote about the right way to look at creating new export controls in a complex technical space.
Also, I'll be talking about cyber export controls at the 10th forum on Global Encryption, Cloud, and Cyber Controls, March 24th and 25th in San Francisco, if you want to come heckle.
But mostly I wanted to write a few words here about the recent Bezos hack, which is a story still developing:
A good perspective on the "Civil Society" (ugh, what a phrase) take on this sort of thing is this Lawfare article. Like many articles, it leans heavily on export control of spyware as the solution to human rights ills. The first thing you'll notice about this group, and other policy groups, is that they call for "Transparency," a term which is worth dissecting.
In particular, it is ironic that the FTI report on Bezos's phone was generated with the exact technology they want to control! It is the very definition of dual use! And it is incomplete, because the one thing you do not have on your own iPhone is Transparency, so we do not even know for sure what the exploit was that got the KSA (allegedly) onto Bezos's phone. In fact, Apple is currently suing Corellium, a company that does help with transparency, under weird parts of the DMCA, after trying to buy them (presumably to stop them from selling their virtualization platform for iOS).
When you hear Transparency from Citizen Lab, what they mean is that they want long spreadsheets on basically everyone who buys any dual use software, based on confusing and inexact export control regulations which would strangle the small companies who work in this space. This would in theory feed into stricter export control rules, or even domestic legislation. It would probably be easier and better to fix the DMCA and our vision of copyright so everyone can do forensics on their own phones and find out when they get hacked.
It's also worth noting that Israel is not a member of the Wassenaar Arrangement of export control nations (nor is China, obviously, although Russia IS a member), and that the Kingdom has extensive offensive resources that go far beyond buying off-the-shelf exploit toolkits. I did a quick open-source Twitter survey a while back, after the UAE Project Raven articles came out, and all I found were good penetration testing and offensive research teams in the KSA.
Thursday, January 23, 2020
Wednesday, January 15, 2020
Local PrivEscs that are Remote Code Execution
One thing you will notice if you read yesterday's NSA advisory and the Microsoft advisory is that the NSA advisory had MORE information in it. Despite both organizations being "defenders," software vendors have a view of the world colored by a completely different sense of systemic risk. Sometimes this means advisories get issued for vulnerabilities that are not really exploitable, but more typically it means the impact of a vulnerability is vastly underrated. This is presumably why Project Zero releases full details at 90 days instead of letting the vendor do all the public communication, and it's also why most bug bounties include non-disclosure clauses.
In other words, if vendors had their way, an advisory would have less information in it than a fortune cookie.
If you've been in the security research business, then you also know that vendors, and often other researchers, will under-analyze a vulnerability. It's an interesting metric to track: which bugs got patched but were called LPEs when you know they are really RCEs? Some companies are known to label every remote heap overflow a "crash/DoS," which has become a funny meme, but it also has strategic implications for critical infrastructure.
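To make the "crash/DoS" point concrete, here's a contrived C sketch (the struct names and heap layout are invented for illustration, not taken from any real advisory): a parser overflows a heap buffer, and on many allocators the attacker-controlled bytes can land in an adjacent object that holds a function pointer. Triage sees a crash; an operator sees control flow.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Contrived example: a network-facing parser copies attacker data into
 * `buf`, and a `session` object with a callback pointer may sit nearby
 * on the heap (allocator-dependent). */
struct packet_buf {
    char data[32];
};

struct session {
    void (*on_close)(void);   /* the function pointer an attacker wants */
};

static void legit_close(void) { puts("session closed normally"); }

int main(void) {
    struct packet_buf *buf = malloc(sizeof *buf);
    struct session *sess = malloc(sizeof *sess);
    if (!buf || !sess) return 1;
    sess->on_close = legit_close;

    /* Simulated attacker input, longer than data[]. A vendor triaging
     * the resulting crash might file this as "DoS", but the overflow is
     * writing attacker bytes past the allocation -- if those bytes reach
     * an adjacent object like `sess`, a "crash" becomes control-flow
     * hijack, i.e. code execution. */
    const char *attacker_input =
        "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";

    strcpy(buf->data, attacker_input);   /* the bug: no bounds check */

    sess->on_close();   /* corrupted pointer == attacker-chosen target */

    free(buf);
    free(sess);
    return 0;
}
```

Whether the overflow actually reaches something useful depends on heap layout and mitigations, but "it only crashes in my test harness" is not the same claim as "it is only a DoS."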
I guess what I'm trying to say is that a disparity in information is a disparity of control, and nothing leverages this more than an operator in the cyber domain.