LAST MONTH, I met Edward Snowden in a hotel in central Moscow, just blocks away from Red Square. It was the first time we’d met in person; he first emailed me nearly two years earlier, and we eventually created an encrypted channel to journalists Laura Poitras and Glenn Greenwald, to whom Snowden would disclose overreaching mass surveillance by the National Security Agency and its British equivalent, GCHQ.
This time around, Snowden’s anonymity was gone; the world knew who he was, much of what he’d leaked, and that he’d been living in exile in Moscow, where he’s been stranded ever since the State Department canceled his passport while he was en route to Latin America. His situation was more stable, the threats against him a bit easier to predict. So I approached my 2015 Snowden meeting with less paranoia than was warranted in 2013, and with a little more attention to physical security, since this time our communications would not be confined to the internet.
Our first meeting would be in the hotel lobby, and I arrived with all my important electronic gear in tow. I had powered down my smartphone and placed it in a “faraday bag” designed to block all radio emissions. This, in turn, was tucked inside my backpack next to my laptop (which I configured and hardened specifically for traveling to Russia), also powered off. Both electronic devices stored their data in encrypted form, but disk encryption isn’t perfect, and leaving these in my hotel room seemed like an invitation to tampering.
Most of the lobby seats were taken by well-dressed Russians sipping cocktails. I planted myself on an empty couch off in a nook hidden from most of the action and from the only security camera I could spot. Snowden had told me I’d have to wait awhile before he met me, and for a moment I wondered if I was being watched: A bearded man wearing glasses and a trench coat stood a few feet from me, apparently doing nothing aside from staring at a stained-glass window. Later he shifted from one side of my couch to the other, walking away just after I made eye contact.
Eventually, Snowden appeared. We smiled and said it was good to see each other again, and then walked up the spiral staircase near the elevator to the room where I would be conducting the interview before we really started talking.
It also turns out that I didn’t need to be quite so cautious. Later, he told me to feel free to take out my phone so I could coordinate a rendezvous with some mutual friends who were in town. Operational security, or “opsec,” was a recurring theme across our several chats in Moscow.
In most of Snowden’s interviews he speaks broadly about the importance of privacy, surveillance reform, and encryption. But he rarely has the opportunity to delve into the details and help people of all technical backgrounds understand opsec and begin to strengthen their own security and privacy. He and I mutually agreed that our interview would focus more on nerdy computer talk and less on politics, because we’re both nerds and not many of his interviews get to be like that. I believe he wanted to use our chats to promote cool projects and to educate people. For example, Snowden had mentioned prior to our in-person meeting that he had tweeted about the Tor anonymity system and was surprised by how many people thought it was some big government trap. He wanted to fix those kinds of misconceptions.
Our interview, conducted over room-service hamburgers, started with the basics.

Micah Lee: What are some operational security practices you think everyone should adopt? Just useful stuff for average people.
Edward Snowden: [Opsec] is important even if you’re not worried about the NSA. Because when you think about who the victims of surveillance are, on a day-to-day basis, you’re thinking about people who are in abusive spousal relationships, you’re thinking about people who are concerned about stalkers, you’re thinking about children who are concerned about their parents overhearing things. It’s to reclaim a level of privacy.
  • The first step that anyone could take is to encrypt their phone calls and their text messages. You can do that through the smartphone app Signal, by Open Whisper Systems. It’s free, and you can just download it immediately. And anybody you’re talking to now, their communications, if intercepted, can’t be read by adversaries. [Signal is available for iOS and Android, and, unlike a lot of security tools, is very easy to use.]
  • You should encrypt your hard disk, so that if your computer is stolen the information isn’t obtainable by an adversary — pictures, where you live, where you work, where your kids are, where you go to school. [I’ve written a guide to encrypting your disk on Windows, Mac, and Linux.]
  • Use a password manager. One of the main things that gets people’s private information exposed, not necessarily to the most powerful adversaries, but to the most common ones, are data dumps. Your credentials may be revealed because some service you stopped using in 2007 gets hacked, and your password that you were using for that one site also works for your Gmail account. A password manager allows you to create unique passwords for every site that are unbreakable, but you don’t have the burden of memorizing them. [The password manager KeePassX is free, open source, cross-platform, and never stores anything in the cloud.]
  • The other thing there is two-factor authentication. The value of this is if someone does steal your password, or it’s left or exposed somewhere … [two-factor authentication] allows the provider to send you a secondary means of authentication — a text message or something like that. [If you enable two-factor authentication, an attacker needs both your password as the first factor and a physical device, like your phone, as your second factor, to log in to your account. Gmail, Facebook, Twitter, Dropbox, GitHub, Battle.net, and tons of other services all support two-factor authentication. A rough sketch of how those time-based codes are generated follows this list.]
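[For readers curious about the mechanics: most authenticator apps generate time-based one-time passwords, or TOTP. The sketch below, using only Python’s standard library, shows roughly how those six-digit codes are derived from a shared secret and the current time; the base32 secret is a made-up example, not a real credential.]

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32: str, interval: int = 30, digits: int = 6) -> str:
    """Return the current time-based one-time password for a shared secret."""
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int(time.time()) // interval          # 30-second time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Made-up demo secret; a real one comes from the QR code a service shows you
# when you enable two-factor authentication.
print(totp("JBSWY3DPEHPK3PXP"))
```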
We should not live lives as if we are electronically naked.
We should armor ourselves using systems we can rely on every day. This doesn’t need to be an extraordinary lifestyle change. It doesn’t have to be something that is disruptive. It should be invisible, it should be atmospheric, it should be something that happens painlessly, effortlessly. This is why I like apps like Signal, because they’re low friction. It doesn’t require you to re-order your life. It doesn’t require you to change your method of communications. You can use it right now to talk to your friends.
Micah Lee and Edward Snowden, Moscow, Russia.
Photo: Sue Gardner
Lee: What do you think about Tor? Do you think that everyone should be familiar with it, or do you think that it’s only a use-it-if-you-need-it thing?
Snowden: I think Tor is the most important privacy-enhancing technology project being used today. I use Tor personally all the time. We know it works from at least one anecdotal case that’s fairly familiar to most people at this point. That’s not to say that Tor is bulletproof. What Tor does is it provides a measure of security and allows you to disassociate your physical location. …
But the basic idea, the concept of Tor that is so valuable, is that it’s run by volunteers. Anyone can create a new node on the network, whether it’s an entry node, a middle router, or an exit point, on the basis of their willingness to accept some risk. The voluntary nature of this network means that it is survivable, it’s resistant, it’s flexible.
[Tor Browser is a great way to selectively use Tor to look something up and not leave a trace that you did it. It can also help bypass censorship when you’re on a network where certain sites are blocked. If you want to get more involved, you can volunteer to run your own Tor node, as I do, and support the diversity of the Tor network.]
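[To make “using Tor” concrete at the application level: a program can route its traffic through a locally running Tor client’s SOCKS proxy. The sketch below is only an illustration of that idea; it assumes a standalone tor daemon listening on its default port 9050 and the third-party requests library installed with SOCKS support (pip install "requests[socks]").]

```python
import requests

# socks5h (rather than socks5) makes DNS resolution happen through Tor as well.
proxies = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# check.torproject.org reports whether the request arrived via a Tor exit node.
resp = requests.get("https://check.torproject.org/", proxies=proxies, timeout=60)
print("Congratulations" in resp.text)  # True if the request exited via Tor
```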
Lee: So that is all stuff that everybody should be doing. What about people who have exceptional threat models, like future intelligence-community whistleblowers, and other people who have nation-state adversaries? Maybe journalists, in some cases, or activists, or people like that?
Snowden: So the first answer is that you can’t learn this from a single article. The needs of every individual in a high-risk environment are different. And the capabilities of the adversary are constantly improving. The tooling changes as well.
What really matters is to be conscious of the principles of compromise. How can the adversary, in general, gain access to information that is sensitive to you? What kinds of things do you need to protect? Because of course you don’t need to hide everything from the adversary. You don’t need to live a paranoid life, off the grid, in hiding, in the woods in Montana.
What we do need to protect are the facts of our activities, our beliefs, and our lives that could be used against us in manners that are contrary to our interests. So when we think about this for whistleblowers, for example, if you witnessed some kind of wrongdoing and you need to reveal this information, and you believe there are people that want to interfere with that, you need to think about how to compartmentalize that.
Tell no one who doesn’t need to know. [Lindsay Mills, Snowden’s girlfriend of several years, didn’t know that he had been collecting documents to leak to journalists until she heard about it on the news, like everyone else.]
When we talk about whistleblowers and what to do, you want to think about tools for protecting your identity, protecting the existence of the relationship from any type of conventional communication system. You want to use something like SecureDrop, over the Tor network, so there is no connection between the computer that you are using at the time — preferably with a non-persistent operating system like Tails, so you’ve left no forensic trace on the machine you’re using, which hopefully is a disposable machine that you can get rid of afterward, that can’t be found in a raid, that can’t be analyzed or anything like that — so that the only outcome of your operational activities are the stories reported by the journalists. [SecureDrop is a whistleblower submission system. Here is a guide to using The Intercept’s SecureDrop server as safely as possible.]
And this is to be sure that whoever has been engaging in this wrongdoing cannot distract from the controversy by pointing to your physical identity. Instead they have to deal with the facts of the controversy rather than the actors that are involved in it.
Lee: What about for people who are, like, in a repressive regime and are trying to …
Snowden: Use Tor.
Lee: Use Tor?
Snowden: If you’re not using Tor you’re doing it wrong. Now, there is a counterpoint here where the use of privacy-enhancing technologies in certain areas can actually single you out for additional surveillance through the exercise of repressive measures. This is why it’s so critical for developers who are working on security-enhancing tools to not make their protocols stand out.
Lee: So you mentioned that what you want to spread are the principles of operational security. And you mentioned some of them, like need-to-know, compartmentalization. Can you talk more about the principles of operating securely?
Snowden: Almost every principle of operating security is to think about vulnerability. Think about what the risks of compromise are and how to mitigate them. In every step, in every action, in every point involved, in every point of decision, you have to stop and reflect and think, “What would be the impact if my adversary were aware of my activities?” If that impact is something that’s not survivable, either you have to change or refrain from that activity, you have to mitigate that through some kind of tools or system to protect the information and reduce the risk of compromise, or ultimately, you have to accept the risk of discovery and have a plan to mitigate the response. Because sometimes you can’t always keep something secret, but you can plan your response.
Lee: Are there principles of operational security that you think would be applicable to everyday life?
Snowden: Yes, that’s selective sharing. Everybody doesn’t need to know everything about us. Your friend doesn’t need to know what pharmacy you go to. Facebook doesn’t need to know your password security questions. You don’t need to have your mother’s maiden name on your Facebook page, if that’s what you use for recovering your password on Gmail. The idea here is that sharing is OK, but it should always be voluntary. It should be thoughtful, it should be things that are mutually beneficial to people that you’re sharing with, and these aren’t things that are simply taken from you.
If you interact with the internet … the typical methods of communication today betray you silently, quietly, invisibly, at every click. At every page that you land on, information is being stolen. It’s being collected, intercepted, analyzed, and stored by governments, foreign and domestic, and by companies. You can reduce this by taking a few key steps. Basic things. If information is being collected about you, make sure it’s being done in a voluntary way.
For example, if you use browser plugins like HTTPS Everywhere by EFF, you can try to enforce secure encrypted communications so your data is not being passed in transit electronically naked.
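[As a rough illustration of what “enforcing secure encrypted communications” means in practice, the sketch below, using only Python’s standard library, upgrades a plain-http URL to https before fetching it, so the request and response are encrypted in transit; example.com is purely an illustrative host.]

```python
from urllib.parse import urlsplit, urlunsplit
import urllib.request

def fetch_over_https(url: str) -> bytes:
    """Rewrite a plain-http URL to https, then fetch it over an encrypted connection."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")  # refuse to send the request in the clear
    with urllib.request.urlopen(urlunsplit(parts)) as resp:
        return resp.read()

body = fetch_over_https("http://example.com/")
print(len(body), "bytes fetched over HTTPS")
```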
Lee: Do you think people should use adblock software?
Snowden: Yes.
Everybody should be running adblock software, if only from a safety perspective …
We’ve seen internet providers like Comcast, AT&T, or whoever it is, insert their own ads into your plaintext http connections. … As long as service providers are serving ads with active content that require the use of JavaScript to display, that have some kind of active content like Flash embedded in it, anything that can be a vector for attack in your web browser — you should be actively trying to block these. Because if the service provider is not working to protect the sanctity of the relationship between reader and publisher, you have not just a right but a duty to take every effort to protect yourself in response.

Lee: Nice. So there’s a lot of esoteric attacks that you hear about in the media. There’s disk encryption attacks like evil maid attacks, and cold-boot attacks. There’s all sorts of firmware attacks. There’s BadUSB and BadBIOS, and baseband attacks on cellphones. All of these are probably unlikely to happen to many people very often. Is this something people should be concerned about? How do you go about deciding if you personally should be concerned about this sort of attack and try to defend against it?
Snowden: It all comes down to personal evaluation of your personal threat model, right? That is the bottom line of what operational security is about. You have to assess the risk of compromise, and on the basis of that, determine how much effort needs to be invested in mitigating that risk.
Now in the case of cold-boot attacks and things like that, there are many things you can do. For example, cold-boot attacks can be defeated by never leaving your machine unattended. This is something that is not important for the vast majority of users, because most people don’t need to worry about someone sneaking in when their machine is unattended. … There is the evil maid attack, which can be protected against by keeping your bootloader physically on you, by wearing it as a necklace, for example, on an external USB device.
You’ve got BadBIOS. You can protect against this by dumping your BIOS, hashing it (hopefully not with SHA1 anymore), and simply comparing your BIOS. In theory, if it’s owned badly enough you need to do this externally. You need to dump it using a JTAG or some kind of reader to make sure that it actually matches, if you don’t trust your operating system.
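[A concrete sketch of that comparison step, hashing a dumped firmware image and checking it against a known-good value, with SHA-256 rather than SHA-1: the file name and reference hash below are illustrative placeholders, not real values.]

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so even large firmware dumps fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder values: supply your own dump file and the hash you recorded
# when the firmware was known to be good.
KNOWN_GOOD_HASH = "0" * 64
current = sha256_of("bios_dump.bin")
print("match" if current == KNOWN_GOOD_HASH
      else "WARNING: firmware dump does not match the known-good hash")
```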
There’s a counter to every attack. The idea is you can play the cat-and-mouse game forever.
You can go to any depth, you can drive yourself crazy thinking about bugs in the walls and cameras in the ceiling. Or you can think about what are the most realistic threats in your current situation? And on that basis take some activity to mitigate the most realistic threats. In that case, for most people, that’s going to be very simple things. That’s going to be using a safe browser. That’s going to be disabling scripts and active content, ideally using a virtual machine or some other form of sandboxed browser, where if there’s a compromise it’s not persistent. [I recently wrote about how to set up virtual machines.] And making sure that your regular day-to-day communications are being selectively shared through encrypted means.

Lee: What sort of security tools are you currently excited about? What are you finding interesting?
Snowden: I’ll just namecheck Qubes here, just because it’s interesting. I’m really excited about Qubes because the idea of VM-separating machines, requiring expensive, costly sandbox escapes to get persistence on a machine, is a big step up in terms of burdening the attacker with greater resource and sophistication requirements for maintaining a compromise. I’d love to see them continue this project. I’d love to see them make it more accessible and much more secure. [You can read more about how to use Qubes here and here.]
Something that we haven’t seen that we need to see is a greater hardening of the overall kernels of every operating system through things like grsecurity [a set of patches to improve Linux security], but unfortunately there’s a big usability gap between the capabilities that are out there, that are possible, and what is attainable for the average user.
Lee: People use smartphones a lot. What do you think about using a smartphone for secure communications?
Snowden: Something that people forget about cellphones in general, of any type, is that you’re leaving a permanent record of all of your physical locations as you move around. … The problem with cellphones is they’re basically always talking about you, even when you’re not using them. That’s not to say that everyone should burn their cellphones … but you have to think about the context for your usage. Are you carrying a device that, by virtue of simply having it on your person, places you in a historic record in a place that you don’t want to be associated with, even if it’s something as simple as your place of worship?
Lee: There are tons of software developers out there that would love to figure out how to end mass surveillance. What should they be doing with their time?
Snowden: Mixed routing is one of the most important things that we need in terms of regular infrastructure because we haven’t solved the problem of how to divorce the content of communication from the fact that it has occurred at all. To have real privacy you have to have both. Not just what you talked to your mother about, but the fact that you talked to your mother at all. …
The problem with communications today is that the internet service provider knows exactly who you are. They know exactly where you live. They know what your credit card number is, when you last paid, how much it was.
You should be able to buy a pile of internet the same way you buy a bottle of water.
We need means of engaging in private connections to the internet. We need ways of engaging in private communications. We need mechanisms affording private associations. And ultimately, we need ways to engage in private payment and shipping, which are the basis of trade. These are research questions that need to be resolved.

We need to find a way to protect the rights that we ourselves inherited for the next generation. If we don’t, today we’re standing at a fork in the road that divides between an open society and a controlled system. If we don’t do anything about this, people will look back at this moment and they’ll say, why did you let that happen? Do you want to live in a quantified world? Where not only is the content of every conversation, not only are the movements of every person known, but even the location of all the objects are known? Where the book that you lent to a friend leaves a record that they have read it? These things might be useful capabilities that provide value to society, but that’s only going to be a net good if we’re able to mitigate the impact of our activity, of our sharing, of our openness.
Lee: Ideally, governments around the world shouldn’t be spying on everybody. But that’s not really the case, so where do you think — what do you think the way to solve this problem is? Do you think it’s all just encrypting everything, or do you think that trying to get Congress to pass new laws and trying to do policy stuff is equally as important? Where do you think the balance is between tech and policy to combat mass surveillance? And what do you think that Congress should do, or that people should be urging Congress to do?
Snowden: I think reform comes with many faces. There’s legal reform, there’s statutory reform more generally, there are the products and outcomes of judicial decisions. … In the United States it has been held that these programs of mass surveillance, which were implemented secretly without the knowledge or the consent of the public, violate our rights, that they went too far, that they should end. And they have been modified or changed as a result. But there are many other programs, and many other countries, where these reforms have not yet had the impact that is so vital to free society. And in these contexts, in these situations, I believe that we do — as a community, as an open society, whether we’re talking about ordinary citizens or the technological community specifically — we have to look for ways of enforcing human rights through any means.
That can be through technology, that can be through politics, that can be through voting, that can be through behavior. But technology is, of all of these things, perhaps the quickest and most promising means through which we can respond to the greatest violations of human rights in a manner that is not dependent on every single legislative body on the planet to reform itself at the same time, which is probably somewhat optimistic to hope for. We would be instead able to create systems … that enforce and guarantee the rights that are necessary to maintain a free and open society.
Lee: On a different note — people said I should ask about Twitter — how long have you had a Twitter account for?
Snowden: Two weeks.
Lee: How many followers do you have?
Snowden: A million and a half, I think.
Lee: That’s a lot of followers. How are you liking being a Twitter user so far?
Snowden: I’m trying very hard not to mess up.
Lee: You’ve been tweeting a lot lately, including in the middle of the night Moscow time.
Snowden: Ha. I make no secret about the fact that I live on Eastern Standard Time. The majority of my work and associations, my political activism, still occurs in my home, in the United States. So it only really makes sense that I work the same hours.
Lee: Do you feel like Twitter is sucking away all your time? I mean I kind of have Twitter open all day long and I sometimes get sucked into flame wars. How is it affecting you?
Snowden: There were a few days when people kept tweeting cats for almost an entire day. And I know I shouldn’t, I have a lot of work to do, but I just couldn’t stop looking at them.
Lee: The real question is, what was your Twitter handle before this? Because you were obviously on Twitter. You know all the ins and outs.
Snowden: I can neither confirm nor deny the existence of other Twitter accounts.
Disclosure: Snowden and I are both directors of Freedom of the Press Foundation.