Tuesday, July 16, 2019

Hermaeus Mora

After the war, Robert Oppenheimer remarked that the physicists involved in the Manhattan Project had "known sin". Von Neumann's response was that "sometimes someone confesses a sin in order to take credit for it."
What is an exploit if not a daedric word of power?

There's no written history of the start of the US's cyber capabilities, although there are pretend histories out there, stitched together from random data leaked to reporters about various programs that mattered less than people think they did. Perhaps this will change in fifty or so years. I think the early nuclear community was best analyzed in Richard Rhodes's book The Making of the Atomic Bomb. And of course, the early cryptanalytic community has many good books, but we recently reviewed this one on the blog, and it came to the obvious conclusion that the success of the Allies in cryptography depended primarily on the kind of talent the Germans had exiled or repelled.

This blog is a strategy blog, which means occasional drill-downs into the technical details of policy or technology, but ideally we want to look at the geopolitical trends that give us the ability to predict the future. That means knowing the history.

But just because there's no written history does not mean there's no history. And one thing I know, without needing to send my blogpost through pre-pub review, is that the early history of Allied cyber efforts mirrors that of the nuclear and crypto efforts in that the majority of it was based on the pivotal work of first-generation immigrants.

Wednesday, July 10, 2019

International Humanitarian Law and Weird Machines



When you read the International Humanitarian Law work (or export control law, for that matter) in the area of cyber war and cyber-almost-war, you get the feeling they are stuck in the 1940s, but they are being very precise about it. Part of the difficulty of computers is that even from the very beginning everything was shrouded in the blackest of classified mist, to the point where the Brits didn't announce they had cracked Enigma with the earliest computers for thirty years, and then when they did, a lot of Germans did not believe them.

This means that after the war, Turing and others (cf. the Manhattan Project, which was computationally expensive just like codebreaking) were left writing about computation engines they KNEW WORKED and KNEW WERE IMPORTANT but couldn't say why. And "computation engine" really is the right term for electromechanical devices programmed by moving switches and cables around, until von Neumann and others designed architectures (and machines) with what we now know as RAM.

One Memory to Hold Them All

The key thing in this architecture is that your code is also data in a very practical way. To take it one step further, both code and data map into a single state-space, and moving the machine into the weirder parts of that state-space, the parts that do what the attacker wants, is called "exploitation" (moving through a state-space does not necessarily have anything to do with executing native code).
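To make the code-is-data point concrete, here is a minimal Python sketch (my own toy example, not any real exploit): the same bytes sit around as inert data until the machine is pointed at them, at which point they become behavior, and flipping a single byte of the "data" changes what the program does.

```python
# "Code is data": the same string is inert data until the interpreter
# is pointed at it, at which point it becomes behavior.
source = "lambda x: x + 1"       # ordinary string data
fn = eval(source)                # now those bytes are executable code
print(fn(41))                    # -> 42

# Flipping one byte of the "data" silently changes the behavior:
patched = source.replace("+", "*")
print(eval(patched)(41))         # -> 41
```

Exploitation is the adversarial version of this: nudging the shared code/data state-space into a region the designers never intended, whether or not any native instructions ever get executed.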

In their recent paper, you can see Mike Schmitt and Jeff Biller pull legal theory toward this reality. By recasting cyber capabilities as "communication of code", and hence as indirect actions, they cut the cord to a lot of international law (some from 1907) that was obviously malformed when talking about iOS exploits.

This is a pretty major step for Mike Schmitt in particular, as the primary defender of the "We can make existing international law fit cyber if we just STRETCH IT LIKE SO" school of thought.  In that sense, the paper is well worth a read even if we told you so.


----

As a bonus, here is the 4d4 Wassenaar export control language for "Intrusion Software", binary-simplified and graphed. Notice how the two items that define it are "can extract or modify data" OR "modification of execution to supply external instructions"? All computer programs do that. Essentially the only technical specification that makes any sense is "avoids AV", aka "covertness". It's this kind of regulatory nonsense that is more pain than it could possibly ever be worth, but which is generated automatically when the law and regulatory communities are stuck in the 40s.
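To illustrate the over-breadth, here is a hypothetical, deliberately boring Python auto-updater (the names and logic are mine, not Wassenaar's): it modifies data on disk and executes externally provided instructions, so it matches both defining prongs of the control language despite being about as benign as software gets.

```python
import json

def apply_update(config_path, external_instructions):
    """A toy auto-updater that matches both prongs of the definition."""
    # Prong 1: "extraction or modification of data" -- it modifies a file.
    with open(config_path, "w") as f:
        json.dump({"updated": True}, f)
    # Prong 2: modification of the standard execution path to allow the
    # execution of externally provided instructions -- which is exactly
    # what any plugin loader, updater, or interpreter does.
    namespace = {}
    exec(external_instructions, namespace)
    return namespace.get("result")
```

Calling `apply_update("app.json", "result = 2 + 2")` would "modify data" and "execute external instructions", and so would every software updater on the planet, which is the point.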




Monday, July 8, 2019

Book Review: Delusions of Intelligence, by R. A. Ratcliff

So one of my friends told me this story about how he went to an introductory meeting once where a bunch of Americans were presenting to his team (non-Americans) as part of a joint project. And they went down the list, with various people talking about their respective responsibilities for helping with various parts of the project. And he turned to his coworker and he was like "They all seem very smart and very nice, but to be honest, I thought the Americans would be helping more. This is a super high priority project - we have almost fifty of our best people on it full time, and there's only a dozen or so they could spare?" And his friend looked at him for a second and said, "Yes, but this is only the liaison team. They aren't doing the work. Each of these people is responsible for coordinating an entire building in Herndon full of people with our efforts."

On page 76, Delusions of Intelligence says "Hut 6 alone had 1300 people working in it, and the total at Bletchley Park was around 10k, while the US Army Signal Security Agency went from 331 to 26k in the same period." But this is the only mention of force strength I can find in the book. And some quick Googling while losing every single comp game in Overwatch this weekend was not able to turn up anything more specific with regard to the ratio of cryptologic efforts between countries in WWII.

It's relevant to the book's conclusions as well.  To paraphrase:
1. The Germans were hopelessly fractured in their cryptologic efforts vs. a unified and centralized British and Allied approach.
2. Early success and high-level support (Churchill) allowed for investment in "big projects" on the Allied side to attack difficult problems which the Germans assumed were impossible (and so did not even try).
3. Assuming that cracking mechanical rotor crypto was impossible created a psychological barrier in the Germans that made even major OPSEC lapses on the part of the Allies something to be rationalized away.
4. The Germans were obsessed with short-term tactical results, and overwhelmed with processing even those. And they assumed mechanical (computational) efforts to aid them would not be fruitful since "cryptography is done with the human mind".
5. The German war effort was entirely military-minded, whereas the Brits had a fluid "civilian" and "civilians in whatever uniform made the most sense at the time" approach.

Some of this was said best in Neal Stephenson's Cryptonomicon where he has a character point out that the war was essentially stamped out in the Bletchley Park Huts, or that for the Japanese to tell their superiors that their codes were broken would be so dishonorable that it was impossible to believe, even if the results of it were obvious.

And the corollary to American cyber efforts (fractured, with maximum infighting) is hard to ignore. The historical picture of a German cipher network getting partial upgrades over time, which if done all at once would have knocked the Allied efforts out, but which done piecemeal were ineffective, can only remind you of similar efforts to modernize USG networks and systems.

To be fair, the book heavily undersells resource constraints and "killing all the smart people seems to be bad for our cryptography team" as causal.

In any case, ironically this book is only available in paper form, but I highly recommend picking it up for a flight.

Wednesday, July 3, 2019

cybernetics and american conceptual failure

The cybernetics diagram Chris is referring to.


<dave> arg
<dave> I failed at using screen
<dave> it's like hacker 101
<dave> I feel bad
<dave> i pretty much used to ONLY use computers through click-scripts
<dave> which is part of why I never customized anything
<bas> dave: emacs has opened my mind to the non-ascetic tooling lifestyle
<bas> now I'm like "understand? yes please, symbols? yes please, visualization? yes please, hover tooltips? yes please"
<chris> i'll use dave as an example of why americans don't understand cybernetics
<bas> i think you can use dave as an example of why americans don't understand lots of things
<bas> :)
<bas> like "why doesn't anyone care about ants!?"
<chris> So cybernetics spread all over the Soviet Union very rapidly, and in Czechoslovakia, whereas what spread here was systems theory instead of cybernetics.
<chris> SB: How did that happen? It seems like something went kind of awry.
<chris> M: Americans like mechanical machines.
<chris> B: They like tools.
<chris> SB: Material tools more than conceptual tools.
<chris> B: No, because conceptual tools aren’t conceptual tools in America, they’re not part of you.
<chris> that interview is full of treasures
<bas> heh
<chris> so what they're saying is that there is a tendency here for ppl (example is engineers) to focus on the first box
<chris> but not model how the tools they use shape the thoughts they think
<chris> because if you understand cybernetics, then you *do* want agency in that process
<chris> meaning you want to direct your own evolution
<chris> by shaping the tools that end up shaping you
<chris> thus emacs
<bas> feedback loops
<chris> bas: i need to find more ways to link emacs to *
<bas> iterative improvement of workflow and tooling
<bas> in a loop
<miguel> weren't both lisp and emacs created by americans?
<chris> so this is the link: http://www.oikos.org/forgod.htm, the diagram is the crux of the matter
<chris> miguel: there are exceptions to every rule
<dave> chris: Can I paste that whole conversation to my cybersec blog?
<dave> because it's funny
<dave> also: I don't see how I'm the example!
<chris> well example as in you choose not to enter the cybernetic process
<dave> henry, are you sure they NEVER send you any pointers used as unique identifiers?
<dave> I choose not to enter the cybernetic process?
<dave> in terms of, I do not shape my tools, but rather let them shape me?
<chris> by not buying into the emacs paradigm
<dave> ah
<chris> yeh