The views, opinions and positions expressed by the author of this blog are hers alone, and do not necessarily reflect the views, opinions or positions of Blouin News or Louise Blouin Media.
There is unusual unity in the tech world in opposition to the F.B.I. forcing engineers to break their own product, despite the wide political differences in the tech community as a whole. Why is that?
Despite the woeful lack of gender diversity and the surreally bad statistics on African Americans in the tech community, there is political diversity. Certainly, there are plenty of Independents and Democrats. In part because of the deep historical connections between the military and computer security, there are quite a few Republicans. For reasons I cannot discern, the group that has built its wealth entirely upon the government-invented internet also tends to be disproportionately libertarian. So why the unity?
Engineers protect people. It is what the profession does. From slide rules to apps that make your phone a graphing calculator, engineering is about getting things right. Your car does not blow up. The arena does not collapse. The water mains – assuming that polluted water is not added – deliver water to your home free of disease. Causing the toaster to catch fire requires significant human intervention.
Law enforcement protects people. It stops the bad guys. It identifies the attackers. Usually when it is law enforcement versus some other group, the argument is safety as opposed to something else: safety vs. money. Safety vs. festival ticket sales. Safety vs. bankers. This time it is safety vs. safety. Engineers are not willing to yield the mantle of safety to law enforcement. We are not willing to destroy the roads to stop a drunk driver.
A famous phrase in engineering ethics is: take off your engineering hat and put on your management hat.
The result of this, of moving from the engineering perspective to the management point of view, is that when I reference the Space Shuttle Challenger we all recall the disaster that resulted from believing in the impossible. The temperature of 18 degrees was a real risk; the managers considered it a minor issue. The physical, technical reality was treated as a negotiable shade of grey.
Law enforcement first claimed to want a single back door, a specific key for one phone. They wanted a shade of grey that simply does not exist, ignoring the very real risks and consequences. What they demanded was a master key. That key would then exist, in code that can be copied. Yet there was always another way forward for that one phone, one that set no legal precedent. We already had this discussion, as a nation and as technical communities, in the 1990s. At that point, we were writing what seemed like crazy things about how someday people might buy so many things over the internet. Our current world of internet-enabled door locks, automobiles, and pacemakers was a gleam in Al Gore's eye. Yet even then, the risk of breaking the entire internet in order to provide specific content information was too great.
We decided not to break the system 20 years ago. We were right. We do not build roads that will collapse beneath our cars should we speed. We do not build medical equipment that fails to work if illegal drugs are detected. We build infrastructure and devices for the safe, reliable performance of their stated goals, not to risk life and limb in order to allow law enforcement ease of use.
This is not an argument of safety versus privacy. The engineers at Apple did their job. Law enforcement did not do its job. The evidence was badly handled. The phone was reset. This is an argument about safety to the extent that the F.B.I. is asking to create systemic risk in exchange for easier and cheaper investigations. Secure devices are, inevitably, also secure against law enforcement, particularly when proper police procedures are not followed. Even in the U.S., procedures such as due process sometimes get short shrift.
While the F.B.I. demands that Apple not ship secure phones, the international community is recognizing that surveillance software, as much as security software, is dangerous. The Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies is being renegotiated. Where the old fear was encryption, nations now realize that malware and surveillance tools are being exported to nations with records of human rights abuses. At the very moment the F.B.I. is demanding built-in surveillance powers, the U.S. is negotiating to prevent the spread of such technology.
If the F.B.I. broke a phone in a manner that could be used by criminals and malicious states to attack the phones of the innocent, it needs to disclose how that was done. It is more important that we be secure as a digital nation than that we be thoroughly policed. If the F.B.I. used a previously unknown vulnerability, disclosure should be the policy.
It is safer if the systems that guard our secrets, our photos, and our communication with our families, as well as important financial information, are secure against criminals and attackers. We need better cryptographic locks and stronger digital doors, not a prohibition against virtual curtains.