Feinstein-Burr, Ctd

Susan Hennessey, the Managing Editor of Lawfare, chimes in on the discussion of Feinstein-Burr, also known as the “Compliance with Court Orders Act” (CCOA).

Relying exclusively on defining obligations to give technical assistance as a solution to Going Dark solves a fixed—and ever-diminishing—sliver of the problem. As companies move towards stronger systems, they will inevitably reach a point where they cannot meaningfully help at all or cannot do so within a time frame that is responsive to law enforcement needs. While partial solutions may have virtues, technical assistance is not a comprehensive fix now or in the future.

Recognizing this, Burr and Feinstein have apparently decided that if they are going to solve the problem, they are going to solve as much of it as they reasonably can. Thus, CCOA is a form of what Ben calls “the Full Comey”—legislation which sets a performance standard of being able to produce and decrypt information when subject to a particular type of court order. The broader performance standard is then supplemented by an alternative obligation to provide technical assistance to facilitate access to data encrypted by some other party.

It’s actually a pretty straightforward legislative solution. Certainly some quantity of information subject to a court order will nonetheless remain inaccessible, but the bill covers as much of the terrain as is practicable. But this legislation is not technologically illiterate, as the echo chamber of criticism has convinced itself. Rather, it is rationally constructed to achieve the goals of its drafters. It may be fun to convince yourself that your opponents are illiterate and stupid, but the reason for the disconnect here is not brains; it’s values.

By values, I believe she means the divergent goals of the technorati and the government (the latter better characterized as responsibilities, to be honest). But, in the framework of my previous commentary, I continue to wonder, and even believe, that the CCOA (aka “Feinstein-Burr”) should be reconsidered: it is a tactic, where what is needed is a coherent strategy that considers how best to use a limited set of resources in a technological landscape shaped by commercial needs, mathematical necessities, and human psychology.

On a technical note, I cannot help noticing that multiple levels of encryption and encoding may be applied to a message, and if one of those layers comes from a home-made application, or from an application that cannot be identified from its output, this entire bill may become moot. A sketch of such layering follows.
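As a rough illustration, here is a minimal sketch of that layering in Python, using the third-party cryptography package’s Fernet recipe for both layers. The split into an “inner” home-made layer and an “outer” provider layer, and the key names, are my own assumptions for illustration; any home-grown scheme could stand in for the inner layer.

```python
# Layered encryption: an "inner" layer applied by a home-made application,
# then an "outer" layer applied by a compliant service provider.
# Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

homemade_key = Fernet.generate_key()   # known only to sender and recipient
inner = Fernet(homemade_key).encrypt(b"meet at the usual place")

provider_key = Fernet.generate_key()   # held by the service provider
outer = Fernet(provider_key).encrypt(inner)

# Under a court order, the provider can strip only its own layer...
recovered = Fernet(provider_key).decrypt(outer)
assert recovered == inner

# ...which yields the inner ciphertext, still opaque without homemade_key.
print(recovered)
```

The provider has rendered all the “technical assistance” it can, and the court still holds nothing but ciphertext.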

For another view on the bill, here’s Isaac Potoczny-Jones, CEO of the security startup Tozny, whom Ms Hennessey references:

Another amusing aspect of the bill is that it doesn’t just cover encryption. It also includes any data that’s been “encoded, modulated, or obfuscated”.

The process of turning human-readable source code into something that computers can understand often requires encoding it into a binary format. Furthermore, the definition of data includes “information stored on a device designed by a software manufacturer”, which would certainly seem to include the programs stored on that device. Does this require developers to provide source code?

During the FBI vs. Apple situation, the FBI had a specifically scoped warrant for a specific phone. Their request was for Apple to modify their OS’s source code to remove certain security features. The FBI could remove those features themselves, but they would more-or-less need Apple’s source code. (They would also need the signing key, but let’s leave aside the question of the signing key for now.)

My reading is that this law would give the FBI a new power to request the OS source code under the scope of a warrant to search a specific phone. They would not need a search warrant issued against Apple.
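To see just how broad “encoded, modulated, or obfuscated” is, note that even a keyless, trivially reversible transform such as Base64 fits that description while providing no secrecy at all. A minimal sketch (the example string is mine):

```python
# Base64 is "encoding", not encryption: anyone can reverse it without a key,
# yet it plausibly falls under the bill's "encoded ... or obfuscated" language.
import base64

encoded = base64.b64encode(b"nothing secret here")
print(encoded)                    # b'bm90aGluZyBzZWNyZXQgaGVyZQ=='
print(base64.b64decode(encoded))  # b'nothing secret here'
```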
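As for the signing key the quote sets aside: even with Apple’s source code in hand, the FBI could not load a modified OS onto a device that verifies vendor signatures. Here is a minimal sketch using Ed25519 signatures from the same cryptography package; the key and image names are mine, purely for illustration.

```python
# Firmware modified without the vendor's private signing key
# fails verification on the device.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()   # held only by the vendor
device_trusts = vendor_key.public_key()     # baked into the device

firmware = b"original OS image"
signature = vendor_key.sign(firmware)
device_trusts.verify(signature, firmware)   # passes: genuine image

tampered = b"OS image with security features removed"
try:
    device_trusts.verify(signature, tampered)
except InvalidSignature:
    print("device refuses the modified image")
```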
