> What’s not going to sell, and what the tech industry needs to get over, is “lulz, it’ll be impossible to intercept military or terrorist information because I need absolute privacy for my saucy emails”. I think it’s been empirically demonstrated that won’t happen.
It's very much the other way. Strong encryption algorithms have been available to the public for a long time now. You can ban using them, but the only way to effectively enforce that ban would be for the government to require that all devices capable of running code from external sources run only code that's signed by that government.
Without that, you can ban all you want, but terrorists and others who need that stuff will have it anyway. So the only effect would indeed be no privacy for saucy emails. Of course, intelligence agencies would love that, since it would allow them to have a society-wide dragnet.
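To make the enforcement point concrete, here’s a rough sketch of a device that refuses to execute anything not signed by a central authority. This is a toy illustration, not real code signing: it uses HMAC-SHA256 as a stand-in for an asymmetric signature scheme (a real device would hold only a public verification key, e.g. for Ed25519, while the signer keeps the private key), and the key itself is made up.

```python
# Toy sketch of signature-gated execution: the device runs code only if
# it carries a valid signature from the signing authority. HMAC-SHA256
# stands in for a real asymmetric code-signing scheme.
import hashlib
import hmac

SIGNING_KEY = b"hypothetical-authority-signing-key"  # made up for illustration

def sign(code: bytes) -> bytes:
    """What the signing authority does to approved code."""
    return hmac.new(SIGNING_KEY, code, hashlib.sha256).digest()

def run_if_signed(code: bytes, signature: bytes) -> bool:
    """The device side: refuse to execute anything without a valid signature."""
    if not hmac.compare_digest(sign(code), signature):
        return False  # unsigned or tampered code never runs
    exec(code.decode())  # approved code runs
    return True

approved = b"print('hello')"
assert run_if_signed(approved, sign(approved))                 # runs
assert not run_if_signed(b"import mycrypto", sign(approved))   # rejected
```

The point of the sketch is the chokepoint: if every general-purpose device enforced `run_if_signed`, a ban on unapproved crypto would be enforceable; without that chokepoint, it isn’t.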
I disagree with your analysis — the way most people obtain encryption, including criminals and terrorists, is through a provider. Regulating providers’ behavior does change the general trend in security. Further, forcing criminals to implement their own encryption increases the likelihood they make a mistake, while also refocusing the NSA et al. on those homegrown algorithms instead.
What we’ve seen is governments subverting encryption and systems repeatedly, in ways they wouldn’t if they had other methods.
I’m not trying to stake out some absolute ideological position; I’m trying to shift the state of affairs to realign incentives for several players. If some people write their own encryption, or technologists use GPG everywhere, so be it.
> allow them to have a society-wide dragnet
I don’t think you even read my proposal: the mechanism I proposed makes that impossible, in contrast to the current state of affairs, where they subvert the security of the entire system instead of targeting specific people. Allowing targeted cracking at a certain level of expense, and requiring physical possession of the device, in no way enables mass dragnets — and in fact, it removes their legal cover by providing alternative means.
I’m not saying people can’t invent their own security — just that factory-made safes need to not be “unbreakable”, because that just incentivizes bad behavior when governments discover a flaw and/or subvert the integrity of the factory.
That’s clearly not how the US (or anywhere else) operates: the Constitution itself negotiates terms between privacy and security — privacy is not, and never has been, an absolute right.
Further, I’m actually trying to increase privacy, by negotiating a compromise that’s workable for society as a way to remove the excuses bad actors are using, and shift the legal framework around the topic. That’s not an absolute ideological position, by any definition.
By contrast, you do adopt such an absolutist position — one that isn’t grounded in law and fails to provide for other societal needs. Such stances lead to failure precisely because of their absolutism. Your stance is why Australia passed an internet wiretapping law, not mine — because you refused to acknowledge a societal need until they employed force.
If your approach worked, we wouldn’t have the state of things we do now.
> Further, I’m actually trying to increase privacy, by negotiating a compromise that’s workable for society as a way to remove the excuses bad actors are using, and shift the legal framework around the topic. That’s not an absolute ideological position, by any definition.
I appreciate the motivation, but it's very naive.
Sure, you give them access after 1 month. Next they'll say 1 month is too long, they need to be able to do it within hours to be able to catch criminals before they run to another city. Then it'll be minutes so they can stop crimes while they're happening. Then it'll be real-time on everyone so they can use machine learning to predict crimes minutes before they happen.
> Your stance is why Australia passed an internet wiretapping law, not mine — because you refused to acknowledge a societal need until they employed force.
> If your approach worked, we wouldn’t have the state of things we do now.
They would still have done it, and more. You think police and intelligence agencies will one day just say "yeah, that's enough, we're good"? No, they'll always want anything that makes their job easier and gives them more power.