The Limits of Software-Only Crypto, the Feasibility of Meaningful Privacy, and a Plan B

The latest article by Julian Assange in The New York Times contains much true and insightful analysis, such as:

“It is not, as we are asked to believe, that privacy is inherently valuable. It is not. The real reason lies in the calculus of power: the destruction of privacy widens the existing power imbalance between the ruling factions and everyone else,”

and

At their core, companies like Google and Facebook are in the same business as the U.S. government’s National Security Agency. They collect a vast amount of information about people, store it, integrate it and use it to predict individual and group behaviour, …

It contains, however, what I believe to be a very wrong and dangerous representation of the level of privacy assurance an individual can expect by downloading the right software and buying a cheap new laptop. He says:

If there is a modern analogue to Orwell’s “simple” and “democratic weapon,” which “gives claws to the weak” it is cryptography, the basis for the mathematics behind Bitcoin and the best secure communications programs. It is cheap to produce: cryptographic software can be written on a home computer. It is even cheaper to spread: software can be copied in a way that physical objects cannot. But it is also insuperable — the mathematics at the heart of modern cryptography are sound, and can withstand the might of a superpower. The same technologies that allowed the Allies to encrypt their radio communications against Axis intercepts can now be downloaded over a dial-up Internet connection and deployed with a cheap laptop.

In fact, even the best free software or proprietary (but verifiable) software crypto solutions have these shortfalls that prevent them from providing meaningful assurance:

  1. They are currently far too complex, and insufficiently compartmentalized, relative to the available auditing effort.
  2. They do not protect against vulnerabilities in critical parts of both the laptop and the USB keys used, introduced during design, fabrication or assembly. It is true that some low-cost, low-volume laptops running less common, low-volume and low-performance CPUs may be free of malicious backdoors, but that is very hard to verify, and the user experience is terrible.

Solving these two core problems requires extremely resilient, user-accountable organizational processes around certain fabrication and assembly phases, around critical server-side components (if any), and around the standardization, update and auditing processes themselves. In my recent post Cyber-libertarianism vs. Rousseau’s Social Contract in cyberspace, I argue further against the failed assumption of Assange’s approach, which I call cyber-libertarianism, and explain why solutions can only be non-territorial and group-based.

Such organizational processes, in turn, are highly geolocalized, and therefore cannot be managed in hiding; they could effectively be outlawed and/or compromised surreptitiously.

We have a plan to solve all of the above with the User Verified Social Telematics project.

What we propose may still not deliver meaningful privacy. We expect, however, that once it is realised, its assurance level will be estimable with sufficient precision.

If even UVST, or other similar attempts, fails, then one possibility we would be bound to test, experiment with and evaluate – before it is too late for freedom and democracy – would be to “flip privacy on power” through sousveillance, by designing a new form of democracy that sacrifices privacy in order to maintain freedom and democracy. We would promote constitutional and legal changes in which (almost all) privacy protections would instead be replaced by mandatory and enforceable transparency of all toward all, especially those in power.

On more in-depth analysis, such a possibility may not work at all. There are in fact many unanswered questions about the organizational, policy and technical provisions that would give us sufficient assurance that the powerful are NOT communicating privately (steganography, “code speak”, etc.) while the weak are all naked out there.

Ensuring transparency of the powerful, therefore, would probably require much the same extremely resilient, user-accountable organizational processes and technologies that are needed to attempt meaningful privacy …
