- The National Security Agency (NSA) has reportedly shied away from questions about tech “back doors” that it wants tech companies to build for it, according to Reuters.
- Back doors provide special access into software systems, bypassing typical security protocols and allowing a third party, like the NSA, to see encrypted data.
- Critics say the practice is a danger to U.S. national security.
On the morning of December 2, 2015, two domestic terrorists opened fire at the Inland Regional Center in San Bernardino, California, killing 14 people and injuring 22 others. It was one of the deadliest mass shootings in California history, second only to the San Ysidro McDonald’s massacre in 1984.
In the aftermath, the Federal Bureau of Investigation (FBI) wanted access to as much evidence as possible, remaining adamant that Apple should either fork over personal data from one of the assailants’ iPhones, or else create a so-called “back door” in the company’s iOS software. Apple refused, citing security concerns—such a workaround could be leaked or stolen, compromising iPhones around the globe, the company said.
The National Security Agency (NSA) has reportedly ducked questions about similar requests for special access into enterprise-level software systems. Last fall, Ron Wyden, a Democrat from Oregon who serves on the Senate Intelligence Committee, told Reuters the government “shouldn’t have any role in planting secret back doors in encryption technology used by Americans.”
Proponents of technology back doors say they’re useful in intelligence investigations, like intercepting communications within terrorist organizations. Critics, meanwhile, say this kind of access can lead to problematic data collection practices, like searching large amounts of information without a warrant. Plus, they say, back doors could expose domestic tech to espionage, which is already a major fear with Chinese software used on U.S. soil.
So, what exactly are software back doors, how do they work, and do they have a place in U.S. tech companies’ products?
What Is a Tech Back Door?
In computing, a back door is a general term for any mechanism built into a program that bypasses traditional security measures, allowing an individual to access encrypted data or sensitive business systems.
According to the cybersecurity marketing firm TechTarget, there are two primary kinds of back doors: intentional and unintentional.
The first may be an administrative choice. For example, a developer could knowingly create a back door so it’s simpler to test and troubleshoot an application. Or, a company could be working closely with a government bureau and want a ready mechanism for handing over access to a program’s encrypted data.
Unintentional back doors, meanwhile, could be the result of a genuine programming error, or a cybersecurity attack. Hackers could set up a two-pronged attack wherein they first create a vulnerability in an information system, and then later come back with a worm or virus that can take advantage of that new back door.
Regardless of whether the back door is intentional or unintentional, these workarounds are dangerous from a security perspective.
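To make the “intentional” case concrete, here is a minimal, purely illustrative sketch of what a developer-planted back door can look like: a login check with a hardcoded maintenance credential. Every name here (`authenticate`, `debug_admin`, and the credentials) is invented for illustration, not drawn from any real product.

```python
# Illustrative sketch of an *intentional* back door: a hypothetical login
# routine with a hardcoded developer bypass. All names are invented.

def authenticate(username: str, password: str, user_db: dict) -> bool:
    """Return True if the credentials are accepted."""
    # Hidden bypass: a hardcoded "maintenance" credential left in for
    # troubleshooting. Invisible to normal users, but a permanent hole
    # for anyone who discovers it -- by leak, theft, or reverse engineering.
    if username == "debug_admin" and password == "letmein":
        return True
    # Normal path: check against stored credentials.
    return user_db.get(username) == password

users = {"alice": "hunter2"}
print(authenticate("alice", "hunter2", users))        # legitimate login succeeds
print(authenticate("debug_admin", "letmein", users))  # back door succeeds too
```

This is exactly why such shortcuts are dangerous: the bypass works for whoever knows it, regardless of whether they’re the developer, a government agent, or an attacker.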
What Are the Stakes?
As it turns out, the U.S. isn’t alone in requesting that tech companies build software back doors into their products for them. Last year, Japan and India, plus the member nations within the Five Eyes intelligence-sharing alliance—which includes the U.S., Canada, Australia, New Zealand, and the United Kingdom—signed a public letter calling on tech companies to implement back doors that would help law enforcement to gain access to encrypted data.
In the letter, the consortium says encryption technology “plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cybersecurity,” but there are “particular implementations” of encryption tech that preclude law enforcement access, which can in turn threaten public safety and vulnerable populations, like exploited children.
“We urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content,” the alliance writes.
Specifically, these seven countries are calling for:
➡️ System designs that can facilitate government investigations.
➡️ Access to “readable and usable” content from such systems in cases where law enforcement is authorized.
➡️ Engagement between tech companies, governments, and other stakeholders, so design choices can be made in such a way that this access can be granted.
That sounds noble enough, but tech companies say back doors fly in the face of end-to-end encryption—essentially making that privacy framework null and void. As a refresher, end-to-end encryption is a protocol that scrambles the contents of a message or some other snippet of data, rendering it completely useless without its accompanying decryption key, which unscrambles the data.
Popular examples include the Facebook-owned WhatsApp, a messaging app that uses end-to-end encryption to protect users’ chat conversations from hackers, and Apple’s iCloud platform. Under a regular encryption framework, the company holds the decryption key, which opens up a sort of trap door for governments or adversaries to get their hands on the key and break the encryption.
With end-to-end encryption, though, only the endpoint device (the one the user in question accesses to send and receive data) holds the encryption key. By definition, if a company creates a back door for the government, this protection scheme is broken.
In a 2015 interview with the Washington Post, Alex Stamos, the former chief security officer for both Yahoo and Facebook, said it would be about the equivalent of “drilling a hole in the windshield” of a car.
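The key property described above can be shown with a toy sketch. The cipher here is a simple XOR one-time pad—illustrative only, not a real messaging protocol—but the structure is the point: the key lives only on the two endpoints, so a relaying server can only ever forward unreadable ciphertext.

```python
import secrets

# Toy sketch of end-to-end encryption using an XOR one-time pad.
# Illustrative only -- real apps like WhatsApp use far more elaborate
# protocols. The shared key exists ONLY on the two endpoint devices;
# the server in the middle relays ciphertext it cannot read.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the key; XOR is its own inverse."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # held only by the two endpoints

ciphertext = xor_cipher(message, key)    # all the server ever sees
recovered = xor_cipher(ciphertext, key)  # only a key-holding endpoint can do this
print(recovered == message)              # True
```

A government-mandated back door would mean either giving a third party a copy of that key or weakening the cipher itself—either way, the “only the endpoints can read it” guarantee no longer holds.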
What Does This Have To Do With China?
In this play, the main characters are the U.S. government, which is begging for special access into enterprise systems, and the tech companies, which are arguing that such a back door would be an overstep, with potentially disastrous consequences for user privacy. But there’s yet another star in this show, one that both the government and tech companies see as the antagonist: China.
Huawei, a mammoth Chinese tech firm known for its affordable smartphones and telecommunications equipment, can reportedly gain access to cell phone network infrastructure it has helped to build—namely, through software back doors intended for government use, according to a report in The Wall Street Journal.
This could allow Huawei to access sensitive information through base stations, antennas, and switching gear, a national security nightmare.
Naturally, Huawei isn’t a fan of this characterization of its back doors. In a 2020 YouTube video, the company specifically calls out Bojan Pancevski, the author of the aforementioned Wall Street Journal report, and U.S. National Security Advisor Robert O’Brien, as participating in an “ongoing campaign to discredit” Huawei.
Zachary Overline, a content creator for Huawei, says in the video that a lot of people really love the term “back door,” but “the people who use it the most, and they use it the loudest, they’re not really the ones who understand technology all that well … they like it because it sounds scary.”
He delineates between a few different types of doors:
➡️ Lawful intervention: Usually with a court order, governments can gain access to network communications and data traffic pertaining to a given criminal. Overline refers to this as a sort of “front door.”
➡️ Network service: Service providers can get authorized, one-time limited access to a network—sort of like a service door that a worker would use to make upgrades or complete maintenance. Monitoring software logs every keystroke workers make inside these systems.
➡️ Malicious backdoor: A back door can be installed either intentionally or unintentionally through a vulnerability. In 2013, Edward Snowden blew the whistle on the NSA for pressuring tech companies into installing these sorts of back doors, which can appear anywhere in a program.
It’s hard to say who to believe when these conversations are classified and not available to the general public. But one thing is for sure: governments aren’t stopping the pressure campaigns anytime soon.