Charlie, Al, and I were just trying to maintain a sense of humor in fairly tough times, knowing that the first who would see the key would be the folks we were working with at the ministry.
With no judgment whatsoever, may I ask: why did you do the work? Why did you not walk away and stand on moral principle against such intervention? From reading your comments, I gather this isn’t something you agreed with.
The product we built (1984) was for team collaboration & business apps, and we placed a strong bet on crypto for end-to-end encryption rather than just the illusion of security via access control. It hadn’t been done before. Crypto-based products in that era were classified as munitions. The regulators didn’t know what to do with our request; initially they suggested granting us 20 bits for symmetric keys. We got them up to 32. Within a couple of years they went to 40. But we used 64 bits in our US products and wanted to deliver that level globally.
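To put those negotiated key lengths in perspective, here's a back-of-the-envelope sketch: each extra bit doubles the keyspace an attacker must search, so the gap between the 40-bit export limit and the 64-bit domestic key is enormous. (The key lengths are from the comment above; the arithmetic is just illustration, not a statement about any particular cipher.)

```python
# Keyspace sizes for the key lengths discussed above:
# 20 and 32 bits (offered/negotiated for export), 40 (later export
# limit), 64 (domestic product). Each added bit doubles the work of
# an exhaustive search.

def keyspace(bits: int) -> int:
    """Number of possible keys of the given length."""
    return 2 ** bits

for bits in (20, 32, 40, 64):
    print(f"{bits}-bit key: {keyspace(bits):,} possible keys")

# A 64-bit key is 2^24 (~16.7 million) times harder to exhaust
# than a 40-bit export-grade key.
print(f"64-bit vs 40-bit factor: {keyspace(64) // keyspace(40):,}")
```

The point of the exercise: the difference between "export grade" and "domestic" was not incremental, it was a factor of millions.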
The principled thing to do would be to have never tried to ship our product outside the US, knowing the export regime.
But we had customers who wanted to communicate and collaborate with teams globally. Their alternative was to send everything in the clear, and I wanted to deliver a secure comms system for my global customers.
Ultimately I felt that they were better off with something to protect them - even a compromise - rather than nothing. The bet was that we could get rid of the compromise over time. (Thankfully, the hack was only necessary for a few short years.)
While “working the issue” in DC, I personally came to the viewpoint that most of the policy makers were not inherently evil or stupid; they truly didn’t fully grasp the myriad implications of this new technology. The issues are complex, the pressures great, and it is difficult for them to know how to balance equities. And so, right or wrong, I felt my best bet to change the system was to use methods that I suppose Bruce Lee would call “fighting without fighting”, as opposed to purely principled extremism.
In the U.S., since the immediate post-WWII period, "crypto software was included as a Category XIII item into the United States Munitions List." That meant that exporting software with strong encryption was legally the same as exporting weapons.
What Ray Ozzie and colleagues implemented was, at that moment (1996), claimed to be a "superior exportable encryption technology when compared to other US products on the market":
To be able to export the Netscape web browser with SSL (a predecessor of TLS), "Netscape developed two versions of its web browser. The "U.S. edition" supported full size (typically 1024-bit or larger) RSA public keys in combination with full size symmetric keys (secret keys) (128-bit RC4 or 3DES in SSL 3.0 and TLS 1.0). The "International Edition" had its effective key lengths reduced to 512 bits and 40 bits respectively (RSA_EXPORT with 40-bit RC2 or RC4 in SSL 3.0 and TLS 1.0).[6] Acquiring the 'U.S. domestic' version turned out to be sufficient hassle that most computer users, even in the U.S., ended up with the 'International' version,[7] whose weak 40-bit encryption can currently be broken in a matter of days using a single computer."
Only later:
"In January 2000, the U.S. Government relaxed export regulations over certain classes of mass-market encryption products. In line with these changes, Netscape has made the strong-crypto versions of Communicator and Navigator available worldwide." (1)
I appreciate the illustrative nature of your comment, but I'm really looking to learn something from a rather influential mind in software engineering.
For example, did you really intend to yield your 4th Amendment rights when you granted a 3rd party access to your files as part of Mac Software Update, Windows Update, virus scanners, etc., or when you started using a service-tethered smartphone?
Anyway, unlike 'web tracking' issues which seem to be broadly ignored because of our love for ad-supported services, I hope we all (especially the young readers of reddit, hackernews, etc) wake up to the fact that these privacy and transparency issues are REAL, and that they truly will impact you and the country you live in, and that even if you don't consider yourself an activist you really should get informed and form an opinion. Again, this is a non-partisan issue, and let's all work to ensure that it stays this way.
Two great organizations where you can learn are EPIC and EFF. (Disclosure: I am on the board of EPIC.) Take it in, and think. Your contributions are needed and would of course be quite welcome.
That's a super comment, and yet not a day goes by here on HN without my reading the exact opposite: that privacy is dead and that we should all just roll over and enjoy our Amazon, Google, Apple, and Microsoft telemetry because it isn't going to change back.
Since I frequently enjoy reading your comments, I know you don't feel that way ;-) For the masses, though, it's probably generally true. We've kind of been here before. I mean, there was a time when free (as in freedom) software was regarded as a complete joke. Only long-bearded hippies used it, and doing so was considered the high-tech version of self-flagellation. Now, thanks to many people (with or without beards), we live in a world where choosing development tools that aren't free software is considered almost insane by most. OK, it's not Nirvana, but it's a heck of a lot closer than I ever expected we'd get.
Just as we did in the 1980s, I think we need to hold the fort and keep writing software. We need to write stand-alone applications that the user is in control of. We need to build alternatives to the stampede of needlessly cloud-based offerings. And we need to keep chipping away at building federated, distributed applications where isolation is not an option. We have a long way to go, but that's always been the case for those who value software freedom. We should be used to it by now :-)
One final point: I think a lot of people feel that we are a fringe community and can't possibly make an impact. Mastodon cannot topple Twitter. Riot will never touch Facebook. This is probably true, but the more code we write that adheres to our values, the closer we get. We just need to keep chipping away. It is possible that one day at least some significant portion of the population will consider it insane to use services controlled centrally by a single corporation. It might never happen, but if we don't keep building, it's guaranteed not to happen.
If you read the discussion here or the previous discussions, you’d see comments from Ray Ozzie, one of the folks behind it, with details on why they did it.
I guess that's...reassuring.