So one project that I started working on is called ‘GoogleSharing’. The basic premise of ‘GoogleSharing’ is that this choice we are given is a false choice, and that we shouldn’t accept it – we should just reject it. So what we should say is, you know, it’s not really possible for us to stop participating with Google, and so instead what we want to do is come up with technical solutions that allow us to continue to participate and still maintain our privacy.
The way it works is it’s a two-part system: a Firefox add-on as well as a custom proxy server. The add-on sits in your web browser and watches your web requests, and all of your non-Google traffic just goes directly to the Internet, totally unmolested, not modified or impacted at all. Then, if it sees a request to a Google service that does not require a login – these are things like ‘Google Search’, ‘Google News’, ‘Google Maps’, ‘Google Groups’, ‘Google Images’, ‘Google Shopping’, but not things like ‘Google Mail’ or ‘Google Checkout’ – then it shunts that traffic off to the ‘GoogleSharing’ proxy server. The ‘GoogleSharing’ proxy server maintains a collection of identities, and each identity is a unique set of HTTP headers – these are, you know, like the fingerprint of your web browser – as well as a cookie that was issued by Google. These are maintained in a pool, and every time a request comes in, one of these identities is randomly chosen from the pool, the identifying information from the request is stripped off and the stuff from the identity is tacked on, and then it’s forwarded on to Google, who processes the request, responds to the proxy, and the information is proxied back to you. So, you know, the upshot is that Google can’t track you, because these cookies are constantly moving around and your traffic does not come from your IP address. Additionally, we encrypt this first link, between your web browser and the proxy, using SSL1 – so that means you can actually get SSL protection for services that Google does not provide SSL access to. You know, Google now has SSL-protected search, but none of the other stuff like ‘Maps’ or ‘Shopping’ or ‘Groups’ or ‘Images’ is SSL protected. But you get that with ‘GoogleSharing’. And how does it look? It looks exactly the same.
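The identity-rotation scheme described above can be sketched roughly like this. This is a toy illustration, not the actual GoogleSharing code: the pool contents, the header names, and the `anonymize` function are all hypothetical stand-ins for the technique.

```python
import random

# Each pooled identity is a browser "fingerprint" (a set of HTTP headers)
# plus a cookie that was originally issued by Google. Values are made up.
IDENTITY_POOL = [
    {"User-Agent": "Mozilla/5.0 (X11; Linux) Firefox/3.6",
     "Accept-Language": "en-US,en;q=0.5",
     "Cookie": "PREF=ID=aaaa1111"},
    {"User-Agent": "Mozilla/5.0 (Windows NT 6.1) Firefox/3.6",
     "Accept-Language": "de-DE,de;q=0.8",
     "Cookie": "PREF=ID=bbbb2222"},
]

# Headers that could identify the real client and must be stripped
# before the request is forwarded to Google.
IDENTIFYING = {"User-Agent", "Accept-Language", "Cookie", "Referer"}

def anonymize(request_headers):
    """Strip the client's identifying headers, then substitute a randomly
    chosen identity from the shared pool."""
    identity = random.choice(IDENTITY_POOL)
    clean = {k: v for k, v in request_headers.items()
             if k not in IDENTIFYING}
    clean.update(identity)
    return clean
```

Because a fresh identity is drawn from the pool on every request, two consecutive searches by the same user can arrive at Google carrying different cookies and different fingerprints, while the requests of many users get blended together behind the same identities.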
You can use all their services – ‘Google Maps’, ‘Google News’, whatever you like – totally transparently; the only difference is that in the bottom right-hand corner there is a little status indicator telling you that ‘GoogleSharing’ is enabled. Additionally, Google has sort of given us a win here by allowing us to make SSL-protected searches, and so now what we can do is have the client pre-fetch cookies from the ‘GoogleSharing’ proxy server and then make an SSL connection directly to Google. And so now the ‘GoogleSharing’ proxy server is still sharing cookies and identifying information around, but it cannot actually see the requests, because they are SSL protected all the way to Google. So now you don’t have to trust us not to examine your requests. So anyway, this project is available online, it’s been active for about six months now, we have about 80,000 users, and you can get the add-on from Googlesharing.net.
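The cookie-prefetch mode might be sketched like this. Again purely illustrative: `prefetch_cookies` stands in for a real fetch of pooled cookies from the GoogleSharing proxy, and the request construction is hypothetical. The point is only that the shared cookie comes from the proxy, while the request itself goes straight to Google over SSL.

```python
import random

def prefetch_cookies():
    """Stand-in for fetching a batch of pooled, shared cookies from the
    GoogleSharing proxy ahead of time. Values here are canned."""
    return ["PREF=ID=cccc3333", "PREF=ID=dddd4444"]

def build_direct_request(query, cookie_pool):
    """Build an HTTPS request aimed directly at Google, carrying one of
    the prefetched shared cookies instead of the user's own. The proxy
    never sees this request."""
    return {
        "url": "https://www.google.com/search?q=" + query,
        "headers": {"Cookie": random.choice(cookie_pool)},
    }
```

The design choice worth noticing: once the cookies are prefetched, the trust requirement on the proxy drops from "won't read my searches" to merely "hands out cookies honestly".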
Another project that I think is interesting is this thing called ‘Facecloak’; it was developed by Prof. Urs Hengartner from the University of Waterloo. The idea, or his basic premise, is that when you use Facebook, what you’re trying to do is share information with your friends or friends of friends. But you’re not actually trying to share information with Facebook, and there’s no reason to give them the information. And so he developed this interesting sort of proof-of-concept Firefox add-on that sits in your web browser, and any time you type anything into Facebook you can prefix it with two ‘@’ symbols, and when you do that it will just transparently encrypt it before sending it off to Facebook. And then he has these mechanisms that allow you to really easily share keys with your friends. That way, if they are also running the add-on, it will transparently decrypt it before displaying it. So everything works just totally the same and everything appears as normal to everybody in your friends group, but Facebook never gets the data. So I think that this is, you know, an interesting project and an interesting theory that I would like to see more of. You could possibly apply it to things like Twitter. You know, Twitter is one part broadcast and one part conversation. For the conversation, there’s no reason for Twitter to actually have the data.
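The @@-prefix idea can be illustrated with a toy sketch. This is not Facecloak’s actual cipher or key-sharing mechanism: the SHA-256 counter-mode keystream below is for demonstration only and offers no real security. What it shows is the flow – marked text is rewritten to ciphertext on the way to the site, and a friend holding the shared key gets the plaintext back on display.

```python
import base64
import hashlib

MARKER = "@@"

def _keystream(key, n):
    """Toy keystream: SHA-256 over key + counter. Illustration only,
    not a real cipher."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def outgoing(text, key):
    """What the add-on sends to the site: text passes through unchanged
    unless the user prefixed it with the @@ marker."""
    if not text.startswith(MARKER):
        return text
    data = text[len(MARKER):].encode()
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))
    return MARKER + base64.b64encode(ct).decode()

def incoming(text, key):
    """What a friend's add-on displays: marked posts are decrypted with
    the shared key, everything else is shown as-is."""
    if not text.startswith(MARKER):
        return text
    ct = base64.b64decode(text[len(MARKER):])
    pt = bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct))))
    return pt.decode()
```

So the site only ever stores the base64 blob; friends in the key-sharing group see the original text, and everyone else (including the site) sees ciphertext.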
My second thesis there is that the crypto war was largely about data freedom. At that time, you were talking about trying to get information out into the world and other people trying to prevent you from getting that information out into the world. And so it was very easy to extrapolate a future of data control from that. In those moments, it was easy to think that, well, we’re gonna be dealing with this forever, you know. And so a lot of projects were born out of that reality. A lot of the anonymity and privacy projects that we have today are things like darknets, data havens, hidden services. Does anybody here use those things? Two people. Not many, right? Because I think that’s not really the future that we got. You know, we got this other future that we’re talking about, it’s somewhat more subtle, more complicated – this social democracy.
And one thing that I’ve noticed is that privacy advocates – the people who work on these privacy projects – are really in love with ‘the other’. These are people like Iranian dissidents or Chinese dissidents in far-away parts of the world. And I think the interesting thing about it to me is if you look closely – you know, Iranian dissidents or Chinese dissidents – they really have very little in common with what privacy advocates are doing. And yet, they’re still sort of obsessed with these struggles. I guess I would suggest that it seems like that’s because these are the few places in the world that are still speaking that language of data control, that language of information freedom that all of our projects were sort of born out of. And so they just happen to sort of dovetail even though they don’t really connect with the lives of the people that are working on the projects.
And I would say that even those places are beginning to realize that the strategy of data control is not entirely effective. Iran recently announced that they were launching a national email service. You know, unlimited storage, nice web interface… But I think that even they realize that the Google strategy is more effective than their strategy of deep packet inspection and trying to lock everything down.
I also think the loss of the crypto war was less about giving up and more about changing strategies; in particular, the strategy of key escrow has, I think, eventually become a strategy of key disclosure. We’ve seen laws like RIPA2 in the United Kingdom and in other parts of Europe that essentially say: “Okay, we’re gonna let you use cryptography, we’re not gonna try to regulate that, because that’s a really hard problem and we have failed at it in the past. And so instead we’re gonna let everybody use whatever cryptography they want, but if at any point in the future we want to see what the encrypted traffic is, we’ll come to you and we’ll ask for your key. And if you don’t give it to us, you go to jail.” So this is the problem of key disclosure, and I think it is really a problem of not having forward security in the secure protocols that we’re using today.
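The forward-security point can be made concrete with a toy ephemeral Diffie-Hellman exchange. The parameters here are illustrative only (a 127-bit Mersenne prime, nowhere near real-world sizes, and the function names are made up): the idea is that if both sides generate fresh key pairs per session and delete the private parts afterwards, a later key-disclosure demand has nothing left to recover past traffic with.

```python
import secrets

# Toy Diffie-Hellman group: a small Mersenne prime (2^127 - 1) and a
# fixed base. Real protocols use standardized, much larger groups.
P = 2 ** 127 - 1
G = 3

def ephemeral_keypair():
    """Generate a fresh per-session secret and its public value. The
    secret is meant to be discarded when the session ends, so there is
    nothing to hand over later."""
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def session_key(my_priv, their_pub):
    """Both parties derive the same shared session key."""
    return pow(their_pub, my_priv, P)
```

Both sides compute the same session key, yet neither side’s long-term identity key was involved in deriving it; compelled disclosure of a long-term key reveals nothing about sessions whose ephemeral secrets were already thrown away.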
Read previous: Changing threats to privacy: Moxie Marlinspike on privacy threats at Defcon. Part 2.
Read next: Changing threats to privacy 4: Moxie Marlinspike on PGP, OTR encryption and mobile privacy.
1 – SSL (Secure Sockets Layer) is a cryptographic protocol that provides communication security over the Internet. It encrypts the segments of network connections above the Transport Layer, using asymmetric cryptography for key exchange, symmetric encryption for privacy, and message authentication codes for message integrity.
2 – RIPA (Regulation of Investigatory Powers Act) is an Act of the Parliament of the United Kingdom (2000), regulating the powers of public bodies to carry out surveillance and investigation, and covering the interception of communications.