Summary of “Supreme Court surveillance opinion nudges us to think nationally, act locally”

After deliberating the decision for months, the Supreme Court handed down its opinion in Carpenter v. United States, a case in which the court was asked to answer the question: is it OK for police to obtain 127 days' worth of someone's cell-site location information without a warrant?
In a 5-4 decision, the court found that the answer was “No.” This is clearly a landmark step toward stronger privacy protections, and the opinion builds on two other related cases that the court unanimously decided in 2012 and 2014.
With the court clearly imposing a warrant standard, police just have to do a little more legwork ahead of time, but getting a warrant is not difficult.
Now, while the Supreme Court plays a critical role in helping all of us understand what the law is, it is equally important to remember that privacy advocates big and small cannot afford to wait.
The wheels of justice famously move slowly, and cases often take several years to reach the Supreme Court, if they ever do.
As Tim Carpenter’s prosecution unfolded, he was eventually convicted at trial, lost on appeal, and finally got to the Supreme Court, which heard oral arguments in October 2017.
Put another way: how much has surveillance technology improved since Tim Carpenter perpetrated his armed robbery several years ago? With increasingly inexpensive police drones and the advent of companies that are literally called "Persistent Surveillance Systems," this problem will only get worse.
“A person does not surrender all Fourth Amendment protection by venturing into the public sphere,” a majority of the Supreme Court concluded in Carpenter.

The original article.

Summary of “Why Do We Care So Much About Privacy?”

“The Right to Privacy” is where Sarah Igo begins “The Known Citizen”, her mighty effort to tell the story of modern America as a story of anxieties about privacy.
As in Douglas’s dissent, privacy functions as a kind of default right when an injury has been inflicted and no other right seems to suit the case.
Douglas got a second crack at applying his theory of privacy as a constitutional right in 1965, in the case of Griswold v. Connecticut.
"Specific guarantees in the Bill of Rights," Douglas wrote for the Court, "have penumbras, formed by emanations from those guarantees that help give them life and substance." The right to privacy was formed out of such emanations.
People invoke their right to privacy when it serves their interests.
How far the constitutional right to privacy can be made to stretch is the subject of Cyrus Farivar’s lively history of recent Fourth Amendment jurisprudence, “Habeas Data: Privacy vs. the Rise of Surveillance Tech”.
The right to privacy does not attach to property, the Court now said; it attaches to persons.
The Supreme Court found that the use of the device violated Jones’s right to privacy.

The original article.

Summary of “Apple Isn’t Your Friend”

There’s likely no other company on Earth that has defined its brand more deftly than Apple.
Back in 2016, Apple took a stand against the FBI's effort to break into the San Bernardino shooter's encrypted iPhone and the agency's insistence that Apple build a backdoor into its encryption.
One reason Apple has been relatively good over the years is that it’s a smart company that has stayed in its lane.
In its own deliberate fashion, Apple appears to see a market opportunity in the privacy debate that goes beyond polishing its own image.
What’s troubling is that Apple holds a lot of power as the manufacturer of the devices millions of us use to see ads online, in its app store, and through its Safari browser.
The Wall Street Journal claims that Apple has been meeting with companies like Snap and Pinterest to discuss distributing its ads in their apps.
Given enough leverage, Apple could theoretically make its ads a prerequisite for inclusion in its app store.
If Apple is giving Facebook and Google headaches, we say that’s great.

The original article.

Summary of “Edward Snowden: ‘The people are still powerless, but now they’re aware'”

Edward Snowden has no regrets five years on from leaking the biggest cache of top-secret documents in history.
In response to a question from the Guardian about the anniversary, Fleming said GCHQ’s mission was to keep the UK safe: “What Edward Snowden did five years ago was illegal and compromised our ability to do that, causing real and unnecessary damage to the security of the UK and our allies. He should be accountable for that.”
Others in the intelligence community, especially in the US, will grudgingly credit Snowden for starting a much-needed debate about where the line should be drawn between privacy and surveillance.
The former GCHQ director Sir David Omand shared Fleming’s assessment of the damage but admitted Snowden had contributed to the introduction of new legislation.
Ross Anderson, a leading academic specialising in cybersecurity and privacy, sees the Snowden revelations as a seminal moment.
Anderson, a professor of security engineering at Cambridge University's computer laboratory, said: "Snowden's revelations are one of these flashbulb moments which change the way people look at things. They may not have changed things much in Britain because of our culture of adoring James Bond and all his works. But round the world it brought home to everyone that surveillance really is an issue."
“The Snowden revelations were a huge shock but they have led to a much greater transparency from some of the agencies about the sort of the things they were doing,” he said.
Developers at major technology companies, outraged by the Snowden disclosures, started pushing back.

The original article.

Summary of “Why Is Your Location Data No Longer Private?”

The past month has seen one blockbuster revelation after another about how our mobile phone and broadband providers have been leaking highly sensitive customer information, including real-time location data and customer account details.
In mid-2016, the FCC adopted new privacy rules for all Internet providers that would have required providers to seek opt-in permission from customers before collecting, storing, sharing and selling anything that might be considered sensitive – including Web browsing, application usage and location information, as well as financial and health data.
Worse, the mobile and broadband providers themselves are failing to secure their own customers’ data.
It’s difficult to think of a bigger violation of those principles than the current practice by the major mobile providers of sharing real-time location data on customers with third parties, without any opportunity for customers to opt-in or opt-out of such sharing.
T-Mobile US Inc. in late 2013 announced that its GoSmart Mobile brand had "become the first wireless provider to offer free access to Facebook and Facebook Messenger for all of its wireless customers, even those without monthly data service." The GoSmart Mobile plans started at $25 a month for "unlimited talk" with no other data service.
AT&T Mobility offers a zero-rating plan called “Sponsored Data” that allows content providers to pay up front to have streaming of that content allowed without counting against the provider’s monthly data caps.
SHOCK AND YAWN. When a Carnegie Mellon University researcher showed me last week that he could look up the near-exact location of any mobile number in the United States, I sincerely believed the public would be amazed and horrified that mobile providers are sharing this real-time data with third-party companies, and that those third parties in turn weren't doing anything to prevent the abuse of their own systems.
While you're at it, tell your lawmakers what you think about mobile providers giving or selling third parties real-time access to customer location information, and let them know that this is no longer okay.

The original article.

Summary of “You Can’t Opt Out Of Sharing Your Data, Even If You Didn’t Opt In”

We’re used to thinking about privacy breaches as what happens when we give data about ourselves to a third party, and that data is then stolen from or abused by that third party.
“One of the fascinating things we’ve now walked ourselves into is that companies are valued by the market on the basis of how much user data they have,” said Daniel Kahn Gillmor, senior staff technologist with the ACLU’s Speech, Privacy and Technology Project.
The privacy of the commons is how the 270,000 Facebook users who actually downloaded the “Thisisyourdigitallife” app turned into as many as 87 million users whose data ended up in the hands of a political marketing firm.
Even if you do your searches from a specialized browser, tape over all your webcams and monitor your privacy settings without fail, your personal data has probably still been collected, stored and used in ways you didn’t intend – and don’t even know about.
The information collected every time they scan that loyalty card adds up to something like a medical history, which could later be sold to data brokers or combined with data bought from brokers to paint a fuller picture of a person who never consented to any of this.
The privacy of the commons means that, in some cases, your data is collected in ways you cannot reasonably prevent, no matter how carefully you or anyone you know behaves.
Our digital commons is set up to encourage companies and governments to violate your privacy.
Almost all of our privacy law and policy is framed around the idea of privacy as a personal choice, Cohen said.

The original article.

Summary of “One 30-page document contains everything you need to know about AI”

Congress' complete lack of understanding when it comes to things like the internet, cellphones, and technology in general is hilarious…for about a minute.
The Senate's Zuckerberg hearing was the Congressional equivalent of your parents' calls to IT, with Zuck throwing out buzzwords like "data privacy" and "artificial intelligence" to ensure no one would ask a legitimate follow-up.
It was painfully clear that Congress' understanding of AI doesn't go very far beyond the general "smart but scary robot computer" presented in mainstream movies and TV shows.
We’re at a crossroads when it comes to this sort of technology, and it has the potential to fundamentally change the world.
This is where “Privacy and Freedom of Expression In the Age of Artificial Intelligence” comes in.
Written by the members of Article 19, a global human rights organization dedicated to promoting freedom of expression, and Privacy International, an NGO focused on privacy, the paper explains basically everything an informed citizen should know about AI as it relates to democracy, in simple, easy-to-understand language.
It continues on to describe how AI affects freedom of expression and personal privacy, outlines the state of things currently, and provides suggestions as to where we should go from here.
Definitely pull it out on the subway or during your next existential crisis; the future will thank you.

The original article.

Summary of “Facebook moves 1.5bn users out of reach of new European privacy law”

Facebook has moved more than 1.5 billion users out of reach of European privacy law, despite a promise from Mark Zuckerberg to apply the "spirit" of the legislation globally.
A week later, during his hearings in front of the US Congress, Zuckerberg was again asked if he would promise that GDPR’s protections would apply to all Facebook users.
Worldwide, Facebook has rolled out a suite of tools to let users exercise their rights under GDPR, such as downloading and deleting data, and the company’s new consent-gathering controls are similarly universal.
Facebook told Reuters “We apply the same privacy protections everywhere, regardless of whether your agreement is with Facebook Inc or Facebook Ireland”.
It said the change was only carried out "because EU law requires specific language" in mandated privacy notices, which US law does not.
"This is a major and unprecedented change in the data privacy landscape. The change will amount to the reduction of privacy guarantees and the rights of users, with a number of ramifications, notably for consent requirements. Users will clearly lose some existing rights, as US standards are lower than those in Europe."
"Data protection authorities from the countries of the affected users, such as New Zealand and Australia, may want to reassess and analyse the situation. Even if their data privacy regulators are less rapid than those in Europe, this event is giving them a chance to act. Although it is unclear how active they will choose to be, the global privacy regulation landscape is changing, with countries in the world refining their approach. Europe is clearly on the forefront of this competition, but we should expect other countries to eventually catch up."
That means users will exist in a state of legal superposition: for tax purposes, Facebook will continue to book their revenue through Facebook’s Irish office, but for privacy protections, they will deal with the company’s headquarters in California.

The original article.

Summary of “Meet the Woman Who Leads NightWatch, Google’s Internal Privacy Strike Force”

Kissner’s responsibilities include making sure that Google’s infrastructure behaves the way it’s supposed to, transmitting user data securely and not leaving bits of data hanging around in the wrong spots.
Kissner leads a team of 90 employees called NightWatch, which reviews almost all of the products that Google launches for potential privacy flaws.
Products just need a bit of work to pass muster: to meet the standard of what a former colleague of Kissner's, Yonatan Zunger, calls "respectful computing."
Getting a read on people and gauging their trustworthiness was a skill Kissner learned later in life, she says, an uncomfortable experience, but one she immediately applied to her work.
“I am extremely aware that not everybody experiences the world the way I do. I’m actually surprised when I meet somebody who experiences the world the way I do,” Kissner says.
Kissner certainly isn’t the only woman working in cybersecurity who would probably prefer to be evaluated on the merits of her work.
Kissner will chair one of the conference tracks, Practical Privacy Protection.
Kissner has worked to make sure that diversity is reflected not just at OURSA but at NightWatch.

The original article.

Summary of “How to save your privacy from the Internet’s clutches – TechCrunch”

There are some practical steps you can take to limit day-to-day online privacy risks by reducing third party access to your information and shielding more of your digital activity from prying eyes.
Every data misuse scandal shines a bit more light on some very murky practices – which will hopefully generate momentum for rule changes to disinfect data handling processes and strengthen individuals’ privacy by spotlighting trade-offs that have zero justification.
Tell me more: Keyboard apps are a potential privacy minefield given that, if you allow cloud-enabled features, they can be in a position to suck out all the information you’re typing into your device – from passwords to credit card numbers to the private contents of your messages.
Tell me more: Choosing friends based on their choice of messaging app isn’t a great option so real world network effects can often work against privacy.
Tell me more: No connected technology is 100% privacy safe but Apple’s hardware-focused business model means the company’s devices are not engineered to try to harvest user data by default.
Android is a more open platform than iOS, and it's possible to configure it in many different ways, some of which can be more locked down as regards privacy than others.
Action: Say no to always-on voice assistants. Who is this for: Anyone who values privacy more than gimmickry.
So it’s a great time to write to your reps reminding them you’re far more interested in your privacy being protected than Facebook winning some kind of surveillance arms race with the Chinese.

The original article.