This story was updated with further information about the user data collected by the app.
Opera Software takes its VPN campaign to iOS with a free, unlimited virtual private network app. Launched Monday, the new app follows Opera’s debut in late April of a free, built-in virtual private network in the beta version of its PC and Mac browsers. Opera’s VPN services are offered by SurfEasy, a Canadian VPN provider that Opera acquired in early 2015.
Opera says one reason it decided to offer the app was to help people get around corporate and school firewalls. “Every day, millions of people, from students to working people, find that social-media sites…are blocked when they surf on their campus or workplace Wi-Fi…we help people to break down the barriers of the web,” SurfEasy president Chris Houston said in the iOS app’s announcement.
Opera’s new VPN app will find a formidable opponent in Netflix, however. Since its expansion to pretty much every country on the planet, Netflix has cracked down on VPN use. In my tests, the new Opera app didn’t get around the “great firewall of Netflix.”
That said, if you do run up against other regionally restricted sites you can always give Opera’s VPN a try. Currently, Opera VPN for iOS offers exit servers (where websites think you are) in the U.S., Canada, Germany, the Netherlands, and Singapore.
The new Opera VPN app also includes ad-blocking features that kill online ads and block the web trackers that follow your browsing habits to better target advertising.
Once the app is set up, it starts working automatically. If you want to change exit locations, just tap the lightning bolt icon (upper-right corner on the iPad version), then choose the country you’d like to “appear” in, and that’s it.
The fine print on privacy
In its fine print, Opera says it “may” collect usage data while you’re connected to its VPN, including the web addresses you visit. It’s not clear whether the company is actually collecting your web history while you use this VPN, or if it’s just reserving the right to do so in some unforeseen future.
We contacted Opera for comment, and received this response from SurfEasy’s Houston: “While the Opera VPN is completely free to the user we do use anonymous market insights derived from customer usage to help support the service. We make this information available to third parties who are interested in better understanding the mobile ecosystem and how it’s evolving.”
In other words, it sells user data to marketers. Houston added, however, that the data is aggregated and not focused on individual activity. “It’s important to understand that this is not data about what you do with your phone, but rather this is data about how a large group of people use their phones.”
We now know the tradeoff for free Windows 10: Microsoft wants data about what you do with your device. But you don’t have to send everything you do back to Redmond.
You can control the data you send back, and how often, by delving into Windows 10’s privacy settings (we’ve taken you here before) and looking specifically at Feedback frequency and Diagnostic and usage data. The former is typically just an automated survey, but the diagnostic component actually peers into your machine.
These features comprised the Customer Experience Improvement Program, or CEIP, in previous versions of Windows—and they were voluntary. In Windows 10 they’ve become mandatory, but you can control some aspects.
Start by going to Settings > Privacy > Feedback & diagnostics in Windows 10.
Changing the Feedback frequency
Every so often, Microsoft gets curious: Did you like this new version of an app? Would you recommend Windows 10 to a friend? Microsoft typically asks these sorts of questions of Insiders who’ve signed up to test Microsoft’s beta software, but regular Windows 10 users may be quizzed as well.
Solicitations for feedback are infrequent. In fact, if you leave the Feedback frequency setting at Automatic, you’ll rarely see a popup. But you may set Feedback to Never if you’re dead-set against ever receiving the prompts.
If, on the other hand, you can’t wait to tell Microsoft what you really think, you can adjust the setting to Once a week, or Once a day, or even Always, so that presumably anything Microsoft has a question about will be flagged for your attention. You can also go to Start > Windows Feedback and use that app to send feedback on a specific issue.
What’s collected for diagnostic and usage data
The diagnostic and usage data that Microsoft wants to collect, however, is much more intrusive. Microsoft won’t know who you are by name, but it does track your device using a unique ID.
“As you use Windows, we collect diagnostic and usage data that helps us identify and troubleshoot problems, improve our products and services, and provide you with personalized experiences,” Microsoft explains in a FAQ. “This data is transmitted to Microsoft and stored with one or more unique identifiers that can help us recognize an individual user on an individual device and understand the device’s service issues and use patterns.”
Here’s the bad news: You can’t turn off diagnostic data in the Settings menu. By default, it’s set to Full, which sends pretty much everything; however, you do have two dialed-back choices called Basic and Enhanced.
The Basic data setting collects the configuration data of your device (device name and model, as well as the hardware and software, including third-party apps and drivers); performance data, including how quickly programs respond to input; network data, including details of the networks you connect to and what radios you’re using; and details of other hardware that’s connected to your device.
Enhanced adds the ability to log “how frequently or how long you use certain features or apps, which apps and features you use most often, how often you use Windows Help and Support, and which services you use to sign into apps,” according to Microsoft. It will also report the memory state of an app when it crashes, helping Microsoft improve the Windows 10 experience. Microsoft cautions that it may collect parts of a document stored in that memory data.
Finally, the Full setting peers even deeper into your PC, but only in certain cases. When devices experience problems that are difficult to diagnose or replicate with Microsoft’s internal testing, Microsoft will randomly select a small number of devices set to the Full level that are also exhibiting the problem, and gather all of the data needed to diagnose and fix the problem. (Note that if you’re a Windows Insider, your Diagnostic setting is automatically set to Full.)
Microsoft apparently doesn’t even anonymize any personal data it collects via its Full diagnostics; it simply won’t use that data for any sales purposes. “If an error report contains personal data, we won’t use that information to identify, contact, or target advertising to you,” Microsoft says.
The data may also travel further than you’d like. Microsoft says its own employees use it, but the company also shares the data with third-party affiliates and hardware partners where relevant.
After successfully launching a version of its browser that offered ad blocking, Opera just won’t quit. On Wednesday night, the company released a free VPN service with unlimited bandwidth, built right into its latest beta. The Opera release is developer edition version 38.0.2204.0 for the Mac and the PC.
Opera also won’t make you pay for the amount of bandwidth that you route through the VPN—which would normally cost you about $48 per year.
A virtual private network spoofs your IP address, pretending that your PC is actually physically located in London, for example, when it’s actually sitting in Los Angeles. That offers all sorts of possibilities: It helps hide your identity when surfing, or allows you access to a website that you normally wouldn’t be able to see. VPNs are also common in countries like China, whose so-called “Great Firewall” insulates the Chinese Internet from the rest of the world.
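The geo-restriction mechanics described above can be sketched in a few lines. This is a purely hypothetical illustration, not any real site's logic: a geo-restricted service keys its response off the country it resolves from the connecting IP address, which behind a VPN is the exit server's country rather than yours.

```python
# Hypothetical sketch: a geo-restricted site decides what to serve
# based on the country of the connecting IP address. Behind a VPN,
# that IP belongs to the exit server, not to your actual location.
GEO_CATALOG = {
    "US": {"Show A", "Show B"},
    "UK": {"Show A", "Show C"},
}

def visible_catalog(exit_country: str) -> set:
    # The site never sees your real location -- only the exit node's.
    return GEO_CATALOG.get(exit_country, set())

# A PC in Los Angeles connecting directly vs. through a London exit:
direct = visible_catalog("US")
via_vpn = visible_catalog("UK")
```

The same lookup is also why VPN detection works in the other direction: once a service blacklists the exit server's IP range, every user behind that exit is refused regardless of where they really are.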
Of course, a VPN may also enable illicit activities. For years, international users watched Netflix via VPN so they could see movies that weren’t available in their country—until Netflix cracked down. And, of course, people use VPNs to evade the prying eyes of government watchdogs when downloading data via BitTorrent.
Why this matters: Free, unlimited VPN is an enormous coup for Opera. There are two major questions that Opera will need to answer, though: First, what are the terms of service of the VPN, and the acceptable use policy? “Unlimited” services rarely are. Second: What will the performance of the VPN network (and the browser, too) be under load?
No surprise to Opera watchers
The integrated VPN may not be that surprising if you’ve been watching Opera for long. About a year ago, Opera bought SurfEasy, a Canadian VPN provider whose network Opera is apparently using as the backbone of its services. (A few days ago, SurfEasy promised to protect BitTorrent downloads, possibly preparing for the Opera launch.)
Today, you can take advantage of SurfEasy’s network through downloadable plugins for Chrome and the release version of Opera. Just by signing up with an email address, you’ll receive 500MB of secured data per month, for free. Confirm your email, and you’ll receive 250MB more. Follow SurfEasy on Twitter, and it’s 100MB more, and so on.
Normally, SurfEasy’s unlimited VPN service costs $3.99 per month and includes support for up to five devices—including Mac and Android devices. Now that the service has been integrated into the developer edition of the Opera browser, however, all of those limitations have apparently gone away.
This is the sound of a beta crashing
Unfortunately, I had one heck of a time getting the developer edition—which, obviously, is far less stable than the release version—to work.
Opera provided me with a test build of the browser, which downloaded and installed just fine. To enable the VPN function, click the Opera (“O”) menu, then scroll down to Settings. Under Privacy & Security, you’ll need to click the checkbox to enable the VPN function. When I did so, I didn’t notice any differences—though I hadn’t tried to surf anywhere yet. I then turned off the ads using the native ad blocker that Opera had installed in a previous edition of the browser, and tested everything on PCWorld’s homepage.
I didn’t notice anything within the interface that signaled whether the VPN service was working. (A popup window in the ad-blocking edition, on the other hand, alerted me that the feature was there, and how and why to take advantage of it.) I closed and restarted the browser.
Unfortunately, that was a mistake. I haven’t been able to open it since, as it promptly crashes on launch. I tried uninstalling it, and received error messages. I tried manually cleaning out the files, removing most of them. After re-downloading and re-installing the browser, though, I still experienced the crash-on-launch bug.
What Opera tells me, however, is that the VPN encrypts data with 256-bit encryption, hiding your actual IP address behind a virtual one. You can select an IP address in the United States, Canada, or Germany. More location options will be available as Opera rolls out this feature in release form.
I did download the SurfEasy VPN plugin for Chrome, and I can report that the service works as advertised, though slowly. Don’t expect to be able to watch an overseas version of Netflix, though: The service reportedly uses rotating IP addresses, and though I was able to log into Netflix UK, only one of three shows actually began playing. (Otherwise, Netflix kindly informed me I was using a VPN, and to cut it out.) The movie that did play was somewhere below 1080p resolution, though the audio was perfectly acceptable.
For now, it’s not clear what will happen for those users who have already signed up for a SurfEasy subscription, and whether Opera will crack down on users who try to download hundreds of gigabytes through the service. Fortunately, though, it won’t cost you a dime to find out. Once Opera fixes a few bugs, download the beta and try it for yourself.
Not wanting to be left behind in the pursuit of enhanced user security, Viber is adding end-to-end encryption (E2EE) following WhatsApp’s E2EE roll out earlier in April. Viber announced on Tuesday that E2EE would roll out to its users globally over the next two weeks. The new encryption will cover text, voice, and group chats, and will work across mobile and PC versions of Viber.
The new feature will be made available to users automatically. You’ll know you have it when you see a lock icon in the text entry box in chats. But Viber’s implementation won’t be as behind-the-scenes as WhatsApp’s is. Instead, the company has added a few extra features for those who want added protection.
When you see a gray lock icon, that means your communication is being protected using the service’s standard E2EE. In addition, each user also has a cryptographic key associated with their device that can be used to authenticate your identity to other Viber users. When this feature’s in use the lock turns green. If it turns red instead, that can mean someone is trying to listen in on your conversation through a man-in-the-middle attack.
However, you’ll probably see a red lock more often when the person you’re talking to switches to a new device. When that happens you’ll need to re-authenticate each other to get the lock icon back to green. We haven’t had a look at Viber’s new encrypted app yet, so we can’t comment on how easy it is to use the service’s new authentication feature.
In addition to E2EE, Viber also introduced a new hidden chats feature that removes chats from your regular logs and protects them behind a PIN lock.
Why this matters: Blame it on the Snowden revelations, the increasing secret demands for personal data by law enforcement, or just plain old hacking. Whatever the reason, more people are concerned about personal online security, and at least some messaging companies would rather not be involved in demands for user data. Apple’s iMessage also offers E2EE, as does Signal, while Line and Telegram offer it as an option. Many other services don’t offer E2EE at all, including major ones like Facebook Messenger, Google Hangouts, Kik, and Snapchat. With so many holdouts we’re not quite at the tipping point for universal E2EE, but it’s getting there.
As if summoned by the Bat-Signal, U.S. Senator Al Franken is seeking answers on Oculus’ privacy policies after some users expressed concerns.
Those concerns appear to have prompted Franken’s inquiry: on Thursday he sent and published a letter to Oculus CEO Brendan Iribe. In that letter, Franken asks whether Oculus services require the collection of location data, physical movement data, and communication among Oculus users, and he asks whether Oculus shares this information with third parties for anything other than the provision of services. Franken also asks whether Oculus sells aggregate user data, and what sort of safeguards the company uses to keep user data secure.
“Oculus’ creation of an immersive virtual reality experience is an exciting development, but it remains important to understand the extent to which Oculus may be collecting Americans’ personal information, including sensitive location data, and sharing that information with third parties,” Franken wrote.
Franken has a long history of sending these types of letters to technology companies, including Apple, Google, Uber, and Samsung. But these companies aren’t obligated to respond, and even when they do, their answers aren’t always particularly insightful. Franken has also tried to introduce location privacy bills several times throughout his tenure, but hasn’t succeeded at passing them into law.
Why this matters: Privacy was a major concern for Oculus’ fans when Facebook acquired the VR firm in 2014, so it’s understandable that they’d be hypersensitive about the Rift’s terms of service. Now that the Rift is a real product, it’s reasonable to expect a plain-English explanation of what Oculus will do with all the data it’s able to collect.
Oculus has basically responded already
Although Oculus has not yet answered Franken’s letter, the company has responded directly to the VR community, so it seems likely that Franken will get a similar response.
In a statement to UploadVR earlier this week, Oculus said it is “thinking about privacy every step of the way,” adding that it collects user data to check device stability, address technical issues, and improve the experience overall.
As for advertising, Oculus said it is relying on Facebook for some infrastructure elements, but is not sharing information with the social networking giant, at least for now. “We don’t have advertising yet and Facebook is not using Oculus data for advertising—though these are things we may consider in the future,” the company said.
Slinging your credit card information all over the web may be the norm when you’re online shopping, but playing fast and loose with those precious numbers is just begging for identity theft to happen. A new company dubbed Privacy.com thinks it has a solution to the problem. Instead of handing out your actual debit and credit card numbers, Privacy.com lets you create “virtual” debit cards that are locked for use with a single vendor, or “burner” cards that are valid only for one-time use.
If no one has your actual credit card, the thinking goes, then your credentials are safe from the next major database breach—or the one after that.
That basic idea has already gained interest from investors. The company announced in October that it had raised $1.2 million from investors, including Jim Messina, former White House deputy chief of staff and main driver of President Obama’s 2012 re-election campaign. And the company’s founders include Andy Roth, the former chief privacy officer for American Express.
Privacy.com is free to use and makes its money by taking a cut from the interchange fees that merchants pay to Visa and the banks. It works primarily as a web app in Chrome and Firefox (Safari and Internet Explorer support is coming soon), but there’s an iOS app too. There’s also a handy Chrome extension that can auto-detect payment forms to create a new temporary card in a few clicks without leaving the page.
Why this matters: Privacy.com is another example of the Internet coming up with solutions that just aren’t practical in the physical world. Having multiple cards linked to your bank account and locked to specific vendors is a good way of reducing credit card fraud. An individual card is far less useful to thieves if all it can do is buy Netflix subscriptions or video games on Steam. Creating that system with plastic cards would be far too costly. A computer, however, can generate a card number and get into the payment system in seconds.
How it works
The sign-up process for Privacy.com is very simple. You start with an email and password, then add your name, address, and date of birth on the next screen. Finally, you connect your bank account to your Privacy.com account by handing over your online banking username and password. Once that’s done you’re on your way.
You read that correctly. Right now, you can’t use Privacy.com by connecting it to your debit card or using details from the bottom of a check. Only your bank login credentials will do.
“We’re planning to add [debit card and check sign-ups] as funding options later,” Privacy.com CEO Bo Jiang told PCWorld via email. “But instant account verification (bank login) was the fastest and lowest friction way of doing so. It also helps us reduce fraud.”
The company says your login details are “passed to your bank over a secure TLS (SSL) connection.” The company also says it is Payment Card Industry (PCI) compliant and uses a 256-bit encryption key to secure your details.
Requesting your bank login isn’t uncommon among financial services. Intuit’s Mint.com, for example, also asks for your bank login details when adding an account.
Bottom line: If you’re not comfortable handing over your bank credentials to Privacy.com then this service is not for you.
Once you’re up and running, it’s simple to create virtual debit cards. First, you’ll be prompted to install the browser extension and sign in using your Privacy.com credentials. Right now, Privacy.com only has a Chrome extension, but one for Firefox is coming.
Before you start creating cards, I’d strongly advise clicking Account at the top of the Privacy.com dashboard and enabling Two-Factor Authentication. For that, you’ll need an authenticator app on your smartphone, such as Google Authenticator or the recently released LastPass Authenticator. This adds an extra layer of security to your account that makes it much harder for hackers to break in.
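Authenticator apps like the ones mentioned implement TOTP (RFC 6238): your phone and the service share a secret, and both independently derive a short code from the current time, so a stolen password alone isn't enough to log in. A minimal sketch using only Python's standard library:

```python
import hmac
import hashlib
import struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    # Both sides HMAC the shared secret with the current 30-second
    # time window; the code changes every window and can't be
    # reproduced without the secret stored on your phone.
    counter = struct.pack(">Q", timestamp // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238's published test vector: this secret at T=59 yields "94287082".
print(totp(b"12345678901234567890", 59, digits=8))
```

Real authenticator apps layer a QR-code enrollment step and base32-encoded secrets on top of this, but the six digits you type are computed exactly this way.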
Now, it’s time to create a card. When I first tried the service, I only had an option to create my first virtual card using the browser extension. Once I’d done that I could create cards using the web app. Jiang says this is a bug that should already be fixed.
To make a card, click the Create Card button on the web app or select Create a New Card in the browser extension. If you have two-factor authentication enabled—and again, you should—then you’ll be asked to enter a TFA token.
Once that’s done, you’ll hit the interface for creating your single-merchant card with several options. Click the dollar sign icon to set a purchase limit. If you don’t set a limit the card will top out at $1,000 for the day and $2,000 for the month. Click the flame icon and you’ve created a one-time use burner card. You’ll also want to select the text cursor and give your card a memorable name like “Netflix ‘n’ chill.”
Once you’ve adjusted the card to your liking, click Create card and it will be ready in a few seconds, complete with expiry date and three-digit security code.
That’s about all there is to Privacy.com. It’s a fast, simple way to keep your actual debit cards out of the hands of online retailers with virtual plastic that is locked to a specific merchant. Handy!
As the discussion focuses on privacy and crime, what is mostly lost is an analysis of the potential business and government implications—not merely the impact on Apple, technology vendors, and law enforcement agencies, but the effects on the wider business community and the daily operation of thousands of agencies at all levels of government. Taken from that point of view, the President’s statement could become, “… it’s fetishizing the investigation of a limited set of highly serious crimes above every other value.”
Day to day I work as an IT security industry analyst. Formerly a research vice president at Gartner, where I was the lead analyst for datacenter encryption, I now run my own firm. For the past 15 years, I have advised some of the largest companies and government agencies in the world on using encryption systems. I’ve written multiple research papers, and I continue to work with most of the major encryption technology vendors.
Knowing how encryption is used throughout the business world, it is clear that one of our most fundamental security tools is at the center of a civil rights debate, and the slightest misstep could set back corporate and government security by decades.
Encryption is technology’s backbone, and we break it all the time
Encryption is ubiquitous in the digital world. We use it for every credit card transaction, every time we unlock a car with a key fob, every time we log into nearly anything with a password, visit a secure website, connect to a wireless network, update software, or do pretty much anything with a bank. Society relies on encryption for far more than merely protecting our phones and online chats.
Encryption is merely math, not sorcery. It is a heavily studied field of math with an extensive body of work in the public domain. The U.S. government once restricted the export of strong encryption products, forcing companies to use weaker versions overseas and support the weaker encryption here at home since the Internet doesn’t respect national boundaries. It’s a decision we still pay the price for daily, as earlier this year researchers discovered yet another vulnerability in about a third of the Internet directly due to this deliberate weakening back in the 1990s.
The U.S. government backed down on the battle for encryption because it was essential to running businesses and government services over the Internet. Attempts to allow encryption outside the country only in a weakened state left everyone vulnerable to attack since domestic systems also needed to support the lower security levels. The remnants of those early attempts are still having repercussions decades later.
Even without restrictions on encryption, the proper implementation is difficult. When I authored a paper on defending enterprise data on iOS 7, I had to describe how to best work around Apple’s incomplete encryption—the very holes that started this debate, and were later closed in iOS 8.
The Department of Justice, in their latest brief, states, “This burden, which is not unreasonable, is the direct result of Apple’s deliberate marketing decision to engineer its products so that the government cannot search them, even with a warrant.” That statement is an outright falsehood disguised as wishful thinking. Improving the encryption of iOS 8 was a security decision, one lauded by IT security departments everywhere, who had long been encrypting laptops to an equal standard.
Every golden key is a skeleton key
In his South by Southwest speech, President Obama stated, “I suspect the answer will come down to how we create a system where the encryption is as strong as possible, the key is as secure as possible, it’s accessible by the smallest number of people possible, for the subset of issues that we agree is important.”
There are existing techniques to enable third-party access to strongly encrypted systems. One widely used method uses an alternate key to decrypt data. Businesses will often support more than one key for a piece of data or a computer for various reasons, such as ensuring an IT department can still recover a corporate system if an employee tries to lock them out.
Apple and other technology providers could use this well-known method to allow government access to systems. The truth is this can be done relatively securely. We know how to keep incredibly sensitive encryption keys secure. It typically involves multiple people holding only fragments of the total key, extensive physical security, and non-networked systems. Ignoring the international privacy considerations, and the impact on these technology providers’ international business operations, if such a system was created and used in rare circumstances, it is highly unlikely to be broken.
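The "fragments of the total key" approach described above can be illustrated with the simplest such scheme, an n-of-n XOR split: each custodian's share is indistinguishable from random noise on its own, and only all shares combined reconstruct the key. (Real escrow and HSM deployments typically use threshold schemes such as Shamir's secret sharing, which tolerate lost shares; this sketch is illustrative only.)

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, holders: int) -> list[bytes]:
    # The first n-1 shares are pure randomness; the last is chosen so
    # that XORing all n shares recovers the key. Any subset short of
    # all n holders learns nothing at all about the key.
    shares = [secrets.token_bytes(len(key)) for _ in range(holders - 1)]
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def recombine(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)
```

The security properties the author describes follow directly: stealing one custodian's fragment, or even all but one, yields only random bytes, which is why multi-person control plus physical isolation can keep a single master key defensible in rare-use scenarios.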
The problem is it is impossible to scale this kind of system. First, if the FBI truly wants to eliminate warrant-proof (properly encrypted) storage and communications they would need the key for every encrypted product and service on the Internet. They would need highly secure mechanisms for every software developer and hardware manufacturer to provide their keys. Since that is completely unworkable, perhaps only major manufacturers and developers over a certain size would have to participate.
Then there’s the issue of access. Does only the FBI get to use the system for terrorism cases? Do local law enforcement officers get access to catch child predators? Drug dealers? Could this be limited only to the U.S.? Or would other countries, including ones, like China, that the U.S. government itself publicly accuses of hacking corporate systems, also gain access or require their own alternate keys? These are legitimate and complex questions, not mere aggrandized slippery slope arguments. The more access there is to a key, the more often it is used, the less secure it is by definition.
Ignoring the privacy concerns, the impact on business and government systems (and thus operations) could become crippling.
The impact on devices
When I advise companies on properly encrypting laptops, aside from the complexities in key management, I have to guide them through all the potential weaknesses. For example, I tell them that if they are crossing certain international borders, or keep highly sensitive information on a Mac they might lose physical control of, to ensure the system is always shut down, not put to sleep, because encryption keys often remain in volatile RAM while the machine sleeps, leaving the Mac vulnerable.
This isn’t paranoia. We know for a fact that certain governments hack corporations (and other governments), and a stolen laptop can be a great source of information. The same is true for industrial espionage (it’s real) or targeted criminal attacks. Corporations spend many millions of dollars to secure mobile computers using enterprise encryption software, and millions more on managing secure phones and tablets.
If the FBI mandates alternate decryption keys for all devices, those keys would potentially need to be generated for all corporate systems, not just consumer phones. If such a law didn’t apply to laptops, that would be an easy way to skirt the requirement. If it does, then the government gains direct access to all those systems, and complex key-exchange mechanisms would need to be created and every business or government agency that encrypts would have to provide recovery keys.
Then how would companies handle international operations? Or international companies with workers in the U.S.? This is before we even get into the issue of other nations requiring their own access keys. One outcome could be that internationally encrypted devices are inaccessible by the U.S., and U.S. systems are safe in other countries—unless the governments cooperate in major cases and exchange evidence, which isn’t unprecedented.
If the scope is limited to just phones, and only in the U.S., and only for terrorism and a few other cases, the risk and burden to U.S. companies would possibly be manageable. But based on the stated objectives of the FBI and President Obama, it is reasonable to assume the scope is wider, and it is hard to imagine that only the U.S. would mandate a golden key, and only for phones. Even without some malicious hackers stealing the keys, the end result is corporate devices, especially those used with international travel, could no longer be considered secure in many real-world situations.
The impact on communications and the Internet
In previous statements, FBI director James Comey also expressed concern with encrypted communications, like iMessage, where the government can’t access the key. Businesses depend on secure communications on multiple levels, ranging from employee communications to secure transactions with partners and services.
With some of these systems the government can mandate backdoor access, forcing a provider like Apple or Facebook to keep records of communications, or at least to have the ability to sniff communications when required.
But not all these systems are centralized. Enterprises commonly set up their own hosted communications systems because they don’t trust external service providers, or for regulatory reasons. If a tool like iMessage requires access, what about VPNs? Secure connections to websites and email servers? Secure messaging systems? Secure file transfer systems? Financial transaction systems that run over the Internet?
All of these rely on the exact same set of foundational technologies, and all are abused by criminals every day. Worrying they may be within regulatory scope isn’t much of a mental stretch.
There are thousands of systems and technologies out there, and few lines between those used by businesses and the general public. If the bad guys switch from the providers known to work with the government to the open source and commercial technologies used by business, those systems will likely also have to support government access. That means backdoors and recovery keys, since there isn’t any known alternative.
This brings us back to the same problems we have with devices. We simply don’t have scalable mechanisms to support lawful access without reducing security. There is a very real risk that secure communications on multiple levels could be deeply compromised and result in real criminal losses. And that’s before we start worrying about foreign governments.
The impact on data centers and applications
The strongest encryption in the corporate world isn’t found in phones, but in data centers. Enterprises commonly use specialized security appliances designed as unbreakable safes for encryption keys and operations. These Hardware Security Modules, or HSMs, secure banks, retailers, and even your iCloud Keychain backups. Access requires smart cards (sometimes multiple cards held by different employees), and physical tampering can trigger failsafe deletion of all the stored keys.
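The multi-card access control these appliances enforce is typically built on threshold secret sharing: a key is split into n shares such that any k of them can reconstruct it, while k−1 shares reveal nothing at all. Here is a minimal sketch of Shamir’s secret sharing in Python. This is illustrative only; the field size and share format are simplified assumptions, and real HSMs use vetted, hardened implementations.

```python
import secrets

# A prime field large enough to hold a 128-bit key (the Mersenne prime 2^127 - 1).
PRIME = 2**127 - 1

def split_secret(secret: int, k: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any k of them can reconstruct it."""
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover_secret(shares: list[tuple[int, int]]) -> int:
    """Lagrange-interpolate the polynomial at x=0 to recover the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```

With k=3 and n=5 smart cards, any three cardholders together can unlock the master key, while two colluding insiders, or two stolen cards, learn nothing about it.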
If you don’t want to buy an HSM, you can always rent one from one of multiple major cloud providers. They aren’t cheap, but provide the ultimate in security since not even the cloud provider can access your data.
That’s merely one example of the strong encryption tools absolutely essential for secure data centers and applications. This equipment and these tools aren’t the kinds of things you can pick up at Best Buy, but they are certainly within the budgets of terrorists and a range of criminals. They are more secure than iPhones and can easily be used to build storage and communications systems. We use them for encrypted financial and medical databases, for secure file storage, and even to keep those little CVV codes on the back of your credit card safe.
If these tools remain legal for enterprise, the odds are they will be used by nefarious groups to avoid government monitoring of consumer tech. If businesses are required to add back doors and golden keys too, we once again undermine the foundation for digital security.
The decision is binary, not absolutist
The President and the director of the FBI have portrayed this conflict as one between privacy absolutists and government compromise. The issue is that the technology itself forces us to make a binary decision. There are no known techniques for providing lawful access to encrypted communications and storage at scale. The only way to allow government access is to reduce the security of foundational technologies used by business and government agencies, not merely individual citizens. That is math, not politics.
Further complicating the situation is that security constantly evolves, and we continue to adopt ever stronger technologies in more situations simply to stop attackers, from ordinary criminals to hostile governments. These aren’t outlandish movie scenarios; they are the painful, expensive reality for every business in the world. The only difference between consumer, corporate, and government technologies are the price tags. Restrictions on these improvements could be catastrophic.
A group of leading cryptographers and security researchers reached the same conclusion in a 2015 report on exceptional access: “Even as citizens need law enforcement to protect themselves in the digital world, all policy-makers, companies, researchers, individuals, and law enforcement have an obligation to work to make our global information infrastructure more secure, trustworthy, and resilient. This report’s analysis of law enforcement demands for exceptional access to private communications and data shows that such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend. The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict.”
Everything in my experience supports their findings. I can’t think of any way to allow government access for criminal and national security situations that wouldn’t undermine the foundations of digital security across the board. Even ignoring the massive complexities if these requirements were instituted globally, unless the government required access to every possible encryption technology, it would be trivial for criminals and terrorists to hide, while dramatically increasing the risks to nearly all businesses and government agencies.
President Barack Obama can’t comment on the specifics of the ongoing feud between Apple and the FBI, but he did sit down with Texas Tribune editor-in-chief Evan Smith at South by Southwest Interactive on Friday to weigh in on one of the most pressing issues facing American society today: Is national security more important than privacy in the digital age?
“The question we now have to ask is if technologically it is possible to make an impenetrable device or system where the encryption is so strong there’s no key, there’s no door at all, then how do we apprehend the child pornographer? How do we disrupt a terrorist plot?” Obama said. “If you can’t crack that [device] at all, if government can’t get in, everybody’s walking around with a Swiss bank account in their pocket.”
Obama is the first sitting president to take the stage at South by Southwest, the annual convergence of tech, music, and film in Austin, Texas. He appeared at the festival to urge tech companies, engineers, and the creative thinkers drawn to SXSW to work on innovative solutions to the problems plaguing American democracy, like making it easier to vote and bringing Internet access to more people.
Those are important issues, of course, but with the Department of Justice pressing Apple to help unlock an iPhone 5c used in the San Bernardino terrorist plot, Obama’s feel-good message on civic engagement took a backseat to the question of whose side he’s on: Apple’s or the FBI’s. He wouldn’t say directly, of course, but he indicated that he comes down on the side of civil liberties, with a caveat.
“I suspect the answer will come down to how we create a system where the encryption is as strong as possible, the key is as secure as possible, it’s accessible by the smallest number of people possible for the subset of issues that we agree is important.”
The Edward Snowden effect
Obama realizes that Edward Snowden’s NSA surveillance leaks have made the American people skeptical about the government’s intentions when it comes to our devices.
“There are very real reasons why we want to make sure the government cannot just willy-nilly go into everyone’s iPhones—smartphones—that are full of personal data,” he said. “The whole Snowden disclosure episode elevated people’s suspicions of this.”
Obama said, “The Snowden issue vastly overstated the dangers to U.S. citizens in terms of spying,” but also said encryption is essential to keep hackers from destroying digital systems like banks or air traffic control.
“We’re going to have to make some decisions about how we balance those respective risks,” he said. “We’ve engaged the tech community aggressively to help solve this problem. You can’t take an absolutist stance on this. It’s fetishizing our phones above every other value, and that can’t be the right answer.”
Meanwhile, Apple faces off with the FBI in court on March 22 for the first hearing in the case. The Department of Justice filed a response on Thursday to Apple’s argument against complying with the court order and slammed the company for its “corrosive” rhetoric. Expect this fight to stay heated.
When it comes to online privacy, Mozilla’s open-source Firefox browser is probably the best choice for keeping your data away from prying eyes. Even though Mozilla does have some behavior-based advertising on its new tab page, it’s still by far the browser maker that most respects your right to browse unmolested.
Nevertheless, Firefox does require several tweaks if you want to avoid privacy-invading tactics like ad tracking. Here’s a rundown of the basic steps you can take in this browser.
Do not track and tracking protection
To get started, open the preferences tab by typing about:preferences#privacy into the address bar. Or type about:preferences and choose Privacy in the left-hand navigation panel.
First up in the privacy section is tracking. By default, Firefox does not enable the do-not-track feature. You turn it on by clicking the checkbox labeled “Request that sites not track you.”
With this feature enabled, Firefox will make a request to every website you visit that they do not track you. Unfortunately sites don’t have to honor the request, and few do. To enforce your intentions you need to use an add-on such as Ghostery or the Electronic Frontier Foundation’s Privacy Badger. Be further warned, however, that some sites are choosing not to allow people to access content with add-ons like these enabled.
Returning to the tracking section in Firefox, there’s a relatively new feature enabled by default called “tracking protection in private windows.” Leave this setting turned on: it blocks known online trackers, and many of the ads they serve, whenever you’re in private browsing mode.
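If you prefer to manage these settings in a file, Firefox also reads a user.js file placed in your profile folder at startup. A minimal sketch covering the tracking settings above (pref names as of Firefox 45; verify yours in about:config, since preferences change between releases):

```js
// user.js — place in your Firefox profile folder.
// Send the do-not-track header with every request.
user_pref("privacy.donottrack.header.enabled", true);
// Keep tracking protection on in private windows (the default)...
user_pref("privacy.trackingprotection.pbmode.enabled", true);
// ...or, optionally, enable it in all windows, not just private ones.
// user_pref("privacy.trackingprotection.enabled", true);
```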
Reconciling with history
By default, Firefox remembers your history, which makes it easier to return to a site you visited a day, a week, or even a month ago. Click the drop-down menu labeled “Firefox will:” to tell the browser instead to never remember your history (the scorched-earth option) or to use custom settings. Selecting the latter brings up several new options. At the top is a checkbox for “Always use private browsing mode,” another hardcore privacy choice. You can read about the full implications of private browsing mode on Mozilla’s support pages.
Below that are a variety of options that are pretty straightforward, but here’s how I would suggest setting it up.
Check the box for remembering your browsing and download history, un-check remembering search and form history, and leave the box checked for “Accept cookies from sites.”
Then under “Accept third-party cookies” leave it set as Always, but change “Keep until:” from “they expire” to “I close Firefox.” Finally check the box that says “Clear history when Firefox closes.”
This combination of settings allows Firefox to behave normally, but it erases most of your activity once you close the program. It offers some measure of privacy without sacrificing functionality.
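The same combination can be expressed as user.js preferences. A sketch, again assuming Firefox 45-era pref names (double-check in about:config):

```js
// Accept all cookies, including third-party ("Always")...
user_pref("network.cookie.cookieBehavior", 0);
// ...but keep them only until Firefox closes (2 = session-only).
user_pref("network.cookie.lifetimePolicy", 2);
// Clear history automatically when Firefox shuts down.
user_pref("privacy.sanitize.sanitizeOnShutdown", true);
// Don't remember what you type into search boxes and forms.
user_pref("browser.formfill.enable", false);
```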
Finally we get to the Location Bar settings. The Location Bar is just Mozilla’s special name for the browser’s address bar.
If you don’t want Mozilla to suggest sites based on your history, bookmarks, or open tabs, un-check the corresponding boxes.
Those are the main settings you’ll want to take care of, but there are a few more options you should tweak before we’re done. Click Search in the left-hand navigation panel of the Preferences tab.
At the top of the search section is a drop-down menu for your default search engine. Choose whatever you’d like, but the most privacy-conscious search provider is DuckDuckGo. It’s worth trying for a few days—you can always switch back later.
Back at the main Settings menu, go to Advanced > Data Choices. These options control whether Firefox shares information with Mozilla about your browser’s performance and any crash reports. It’s up to you whether to share this data with Mozilla.
Finally, open a new tab and click the settings cog in the upper right corner. By default, you’re using the New Tab page, which has a small amount of advertising on it. The easiest option is to choose to show a blank page. For something a little more personal, install the New Tab Override add-on.
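The remaining tweaks (address bar suggestions, data sharing, and the New Tab page) also map to preferences you can set in a user.js file. As above, these pref names are Firefox 45-era assumptions; verify them in about:config:

```js
// Stop suggesting sites from history, bookmarks, and open tabs in the address bar.
user_pref("browser.urlbar.suggest.history", false);
user_pref("browser.urlbar.suggest.bookmark", false);
user_pref("browser.urlbar.suggest.openpage", false);
// Opt out of sharing performance data and crash reports with Mozilla.
user_pref("toolkit.telemetry.enabled", false);
user_pref("datareporting.healthreport.uploadEnabled", false);
// Show a blank New Tab page instead of the tiled one with advertising.
user_pref("browser.newtabpage.enabled", false);
```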
Most Americans think that Apple should help the FBI unlock a smartphone used by one of the terrorists in the San Bernardino mass shooting, according to a study released Monday by the Pew Research Center.
Fifty-one percent of those asked said they think Apple should unlock the iPhone to help the FBI with its investigation, while 38 percent said it should not unlock the phone to protect the security of its other users. Eleven percent of respondents had no opinion either way.
Depending on how you look at it, that could suggest only a small majority side with the FBI (51 percent versus 49 percent who oppose it or are undecided), or it could suggest a clear majority in the FBI’s favor (51 percent to 38 percent).
Pew tends to favor the 51 percent to 38 percent comparison, said Alec Tyson, a senior researcher at Pew. “It’s a fairly complex issue, and replying ‘I don’t know’ is a perfectly legitimate response,” he said. Of those who do have an opinion, most clearly side with the FBI.
The survey quizzed 1,002 adults by telephone between Feb. 18 and Feb. 21, half via cell phone and half on a landline.
Early last week, a magistrate judge ordered Apple to modify its iOS software so that the FBI can bypass security protections on the phone’s lock screen to access the data inside. Investigators say the phone could possibly hold clues to finding more terrorists.
Apple is fighting back and will appeal the order. Modifying the software would weaken security for all users, it said, putting them at risk of data theft and other crimes.
The FBI insists the modification would apply only to the iPhone used in San Bernardino, but Apple says the order could open the door to more invasive requests in the future.
“Should the government be allowed to order us to create other capabilities for surveillance purposes, such as recording conversations or location tracking? This would set a very dangerous precedent,” CEO Tim Cook wrote in a Q&A on Apple’s website.
The stand-off has been making headline news, and the Pew survey found that 75 percent of those asked had heard either a lot (39 percent) or a little (36 percent) about the issue.
That’s a high level of awareness compared to other studies, Tyson said, suggesting people are paying attention.
If the public is leaning in the FBI’s favor, it shouldn’t be a surprise, he said. “In general over recent years when it comes to anti-terrorism efforts, we find the public tends to prioritize keeping the country safe over concerns about civil liberties.”
That wasn’t so clearly the case following the 2013 Edward Snowden revelations about National Security Agency surveillance, but the pendulum has swung the other way since the rise of ISIS, Tyson said.
To illustrate, he pointed to another Pew study conducted in December. “Public concerns that anti-terrorism policies have gone too far in restricting civil liberties have fallen to their lowest level in five years (28 percent),” Pew said then. “Twice as many (56 percent) now say their greater concern is that these policies have not gone far enough to adequately protect the country.”
Given that reality, a showing of 38 percent in support of Apple might not be a bad result for the company, especially since there are nuances to the argument Apple is trying to make.
Another factor is that while many people favor security in the abstract, they might be less willing to see their own personal data put at risk, Tyson said. According to the latest study, among those who personally own an iPhone, the views were more evenly divided, with 47 percent saying Apple should help unlock the phone, and 43 percent saying it should not.
Whatever the public’s view, it shouldn’t influence the outcome of Apple’s legal case. “The courts should not be swayed at all,” said Susan Hennessey, managing editor of Lawfare and a former attorney in the Office of General Counsel at the NSA.
But it could influence future legislation in this area.
“Certainly public opinion is enormously important for future legislative efforts,” she said.