Summary of “Report: big tech is collecting children’s data at an alarming rate”

Along with those adorable photos, parents are sharing crucial data about their children that big tech companies are harvesting.
In late November, Anne Longfield, England’s children’s commissioner – tasked with promoting and protecting the rights of children – published a report titled “Who Knows What About Me,” which examines how big tech collects data on children and what the potential dangers can be.
In the report, Longfield argues that parents are exposing their children’s data at an alarming rate.
The report calls on parents and schools to examine the type of gadgets children play with, like smart speakers, wifi-connected toys, and gaming apps, all of which are collecting data on kids.
Data shared by parents about children is collected at an alarming rate. Potential dangers for children no longer just entail speeding cars and strangers with candy.
Smart devices are watching children too – and collecting their data. Smart toys have already garnered plenty of criticism for leaving children’s data, like location, vulnerable.
Longfield writes in the report that “the amount of data inferred about children was of real concern.” Families are now being targeted with products because they are essentially being watched every time they’re online.
What will all this data on children mean for their future? While the report highlights current safety concerns for children’s data privacy, it also mentions some troubling future possibilities.

The original article.

Summary of “Google’s “Smart City” in Toronto Faces New Resistance”

Quayside represents a joint effort by the Canadian government agency Waterfront Toronto and Sidewalk Labs, which is owned by Google’s parent company Alphabet Inc., to develop 12 acres of the valuable waterfront just southeast of downtown Toronto.
An applicant for a position at Sidewalk Labs in Toronto was shocked when he was asked in an interview to imagine how, in a smart city, “voting might be different in the future.”
Toronto Open Smart Cities Forum is taking the lead in the local fight against the commodification of its city’s data.
Cavoukian’s decision to quit represents only the most recent resignation in a series of departures that Wylie has referred to as an “ongoing bulldozing of stakeholders.” In addition to Cavoukian, a Waterfront Toronto board member and two Waterfront Toronto digital advisers have also resigned in the last five months.
The Sidewalk Labs spokesperson said that the company’s “relationship with its contractors does not impact its agreements with Waterfront Toronto in any way, including its commitment to the process laid out in the PDA, which says that in the future Waterfront Toronto may have rights to certain Sidewalk Labs IP. Of course, if Sidewalk Labs does not own the IP created by the planning process, it would not have the power to share or convey that IP to Waterfront Toronto or anyone else.”
Most significant among these plans was the suggestion that all data be placed in a “civic data trust.” On the company’s blog, Alyssa Harvey Dawson, Sidewalk Labs’ Head of Data Governance, explained that with the proposed creation of a civic data trust, no one would have the “right to own information collected from Quayside’s physical environment – including Sidewalk Labs.” This would represent, she wrote, “a new standard for responsible data use that protects personal privacy and the public interest while enabling companies, researchers, innovators, governments, and civic organizations to improve urban life using urban data.”
“It is as if Uber were to propose regulations on ride-sharing, or Airbnb were to tell city council how to govern short-term rentals. By definition, there is a conflict of interest,” writes Nabeel Ahmed, a smart city expert and member of the Toronto Open Smart Cities Forum.
Part of the mission of the new Toronto Open Smart Cities Forum is to shift the public conversation away from debating the latest minutiae of the company’s proposed terms and toward a broader consideration of whether the project should move forward under any terms at all.

The original article.

Summary of “How your privacy gets cooked by those restaurant waitlist apps”

With big companies sucking up customer info on a massive scale, it’s no wonder local restaurants want a slice of that pie.
Thousands of restaurants across the country, as well as breweries and pharmacies, are all using waitlist apps, many of which collect data on you and track you to varying degrees.
Now restaurants know how often a specific customer comes to eat, when peak traffic is and how long people are willing to wait for tables.
Like Nowait, Table’s Ready keeps your phone number in its own databases, but restaurant staff can obtain it, Errecart said.
Without a request to Table’s Ready, restaurants still have the last four digits of customers’ phone numbers, which serve as an ID. “The phone number is the unique identifier of a person to see how often they visited, all this information that restaurants really value,” Errecart said.
Myer sees the future of waitlisting apps as a win-win for restaurants and customers, without any concerns of a privacy trade-off.
Restaurants are able to collect data, including how frequently you eat there, and in exchange, you don’t need to wait at the venue for 20 to 30 minutes.
The extra convenience while waiting for a table at the restaurant isn’t worth that privacy trade-off, he added.

The original article.

Summary of “Big tech must not reframe digital ethics in its image – TechCrunch”

Ethics are needed to fill the gaps where new uses of data keep pushing in.
So the Facebook founder seized on the conference’s discussion topic of big data ethics and tried to zoom right back out again.
The ‘we’re not perfect and have lots more to learn’ line that also came from both CEOs seems mostly intended to manage regulatory expectation vis-à-vis data protection – and indeed on the wider ethics front.
The growing public and political alarm over how big data platforms stoke addiction and exploit people’s trust and information – and the idea that an overarching framework of not just laws but digital ethics might be needed to control this stuff – dovetails neatly with the alternative track that Apple has been pounding for years.
Though only a handful of tech giants have built unchallengeably massive tracking empires via the systematic exploitation of other people’s data.
“You have to do your homework as a company to think about fairness,” said Elizabeth Denham, when asked ‘who decides what’s fair’ in a data ethics context.
The closed session of the conference produced a declaration on ethics and data in artificial intelligence – setting out a list of guiding principles to act as “core values to preserve human rights” in the developing AI era – which included concepts like fairness and responsible design.
The consensus from the event is it’s not only possible but vital to engineer ethics into system design from the start whenever you’re doing things with other people’s data.

The original article.

Summary of “Midterm elections: How politicians know exactly how you’re going to vote”

Thanks to an army of data crunchers who marry that information with data you drop at a clothing or automobile site, many candidates often have intimate knowledge of who you are and whether you’re likely to support them.
Facebook’s data scandal involving consultancy Cambridge Analytica shed light on how companies can take personal information we give away and transform it into highly effective targeted ads.
While you may be aware your data is being used, you might not know the full extent of the process.
So we dug in to find out how data goes from your voter registration form to data brokers and back to you in the form of a political ad.
The rules differ when it comes to uses of voter data for purposes other than elections.
The Federal Trade Commission, the US government’s consumer watchdog, produced a report on data brokers in 2014 and recommended that Congress require greater transparency from the data industry.
A RoboCent spokesperson told CNET in July that the company partners with data firms NationBuilder, Aristotle and i360 for voter data.
“The very politicians who fight for consumer data are also using it and not responsible for [where] that data goes to after campaigns,” said Kim Alexander, founder and president of California Voter Foundation.

The original article.

Summary of “Inside Europe’s quest to build an unhackable quantum internet”

The laws of quantum physics, on the other hand, allow a particle (for example, an atom, an electron, or a photon of light) to occupy a quantum state that represents a combination of 1 and 0 simultaneously.
Wehner, Hanson, and their colleagues at QuTech aim to overcome these limitations to build a completely secure quantum internet.
Entanglement means creating a pair of qubits (photons of light, for this purpose) in a single quantum state, so that even if they travel off in opposite directions, they retain a quantum connection.
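As a toy illustration of the state-vector picture behind this description (not QuTech’s actual hardware or software), a maximally entangled pair, a so-called Bell state, can be written out in a few lines of NumPy:

```python
import numpy as np

# A qubit is a unit vector in C^2: alpha|0> + beta|1>.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Superposition: an equal mix of 0 and 1.
plus = (zero + one) / np.sqrt(2)

# Entanglement: a two-qubit Bell state (|00> + |11>) / sqrt(2).
# It cannot be factored into two independent single-qubit states.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# The only possible joint measurement outcomes are 00 and 11,
# each with probability 1/2 - the outcomes on the two qubits are
# perfectly correlated no matter how far apart they travel.
probs = np.abs(bell) ** 2
print(probs)
```

This correlation, combined with the fact that measuring a qubit disturbs its state, is what makes eavesdropping on an entanglement-based link detectable.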
Software used to control classical communication networks can’t cope with things like entanglement, so Wehner has been working on a novel architecture that will make it possible to control the new quantum network efficiently and build applications for it.
At a recent hackathon that QuTech organized jointly with Europe’s regional internet registry, the applications suggested included secure voting, digital signatures, and even a quantum chat service.
The QuTech team seems determined to meet its target of completing the four-city network by the end of 2020, though Wehner admits that deadline is “super tight.” What they learn will inform a recently launched European project, the Quantum Internet Alliance.
Wehner is coordinating the alliance, whose goal is to “build a quantum internet that enables quantum communication applications between any two points on Earth.”
Bigger networks are likely to require “quantum repeaters.” Unlike the “trusted nodes” in China’s network, which turn quantum information into classical form and then back again, these repeaters, or way stations with quantum processors, will be needed to extend entanglement over thousands of miles so that networks remain impervious to hackers.

The original article.

Summary of “Digital immortality: How your life’s data means a version of you could live forever”

Rahnama is creating a digital avatar for the CEO that they both hope could serve as a virtual “consultant” when the actual CEO is gone.
Some future company executive deciding whether to accept an acquisition bid might pull out her cell phone, open a chat window, and pose the question to the late CEO. The digital avatar, created by an artificial-intelligence platform that analyzes personal data and correspondence, might detect that the CEO had a bad relationship with the acquiring company’s execs.
An entrepreneur and researcher based at Ryerson University in Toronto, and a visiting faculty member at MIT’s Media Lab, he’s building an application called Augmented Eternity; it lets you create a digital persona that can interact with people on your behalf after you’re dead. While most older people haven’t amassed enough digital detritus to build a working artificial intelligence, Rahnama posits that in the next few decades, as we continue to create our digital footprints, millennials will have generated enough data to make it feasible.
You have to know the context in which it was said. Was the person joking? Annoyed? Reacting to today’s news? These same kinds of clues end up being crucial when piecing together a digital personality, which is why the Augmented Eternity platform takes data from multiple sources (Facebook, Twitter, messaging apps, and others) and analyzes it for context, emotional content, and semantics.
Should we treat digital remains by the same code that museums use for human remains? Doing so would severely limit the ways in which companies can use our data.
If digital remains are like “the informational corpse of the deceased,” they write, they “may not be used solely as a means to an end, such as profit, but regarded instead as an entity holding an inherent value.”
Just about every discussion of the digital afterlife, Öhman points out, mentions “Be Right Back,” an episode of the British show Black Mirror, in which a bereaved young widow interacts with a digital avatar of her late husband.
The power of the digital dead to manipulate the living is enormous; who better to sell us a product than someone we’ve loved and lost? Thus our digital representations might be more talkative, pushy, and flattering than we are-and if that’s what their makers think is best, who’s going to stop them?

The original article.

Summary of “Which Data Skills Do You Actually Need? This 2×2 Matrix Will Tell You.”

Data skills – the skills to turn data into insight and action – are the driver of modern economies.
Which skills should you focus on? Can most of us expect to keep pace with this trend ourselves, or would we be better off retreating to shrinking areas of the economy, leaving data skills to the specialists?
We applied a time-utility analysis to the field of data skills.
In order to help you decide where to focus your development effort, we have plotted key data skills against this framework.
We longlisted skills associated with roles such as business analyst, data analyst, data scientist, machine learning engineer, and growth hacker.
Finally, we coupled this with information on how difficult the skills are to learn – using time to competence as a metric and assessing the depth and breadth of each skill.
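The summary doesn’t reproduce Filtered’s actual scores, but the mechanics of such a 2×2 matrix are easy to sketch. The skill names, utility scores, time-to-competence figures, and cut-offs below are invented for illustration only:

```python
# Illustrative sketch of a time-utility matrix; all values are
# made up, not taken from the Filtered study.
skills = {
    # skill: (utility on a 0-10 scale, months to competence)
    "spreadsheet modelling":  (8, 2),
    "data visualization":     (7, 3),
    "SQL querying":           (8, 4),
    "machine learning":       (6, 12),
}

def quadrant(utility, months, u_cut=7, m_cut=6):
    """Place a skill in one of the four quadrants of the matrix."""
    if utility >= u_cut:
        return "learn now" if months < m_cut else "plan to learn"
    return "quick win" if months < m_cut else "deprioritize"

for name, (u, m) in skills.items():
    print(f"{name}: {quadrant(u, m)}")
```

Plotting each skill on utility versus learning time, then reading off the quadrant, is what turns a long list of “all seemingly valuable” skills into a prioritized learning plan.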
At Filtered, we found that constructing this matrix helped us to make hard decisions about where to focus: at first sight all the skills in our long-list seemed valuable.
Try the matrix in your own company to help your team determine which data skills are most important for them to start learning now.

The original article.

Summary of “The smartphone app that can tell you’re depressed before you know it yourself”

There is something most of those people have in common: a smartphone.
Mindstrong Health is using a smartphone app to collect measures of people’s cognition and emotional health as indicated by how they use their phones.
With details gleaned from the app, Mindstrong says, a patient’s doctor or other care manager gets an alert when something may be amiss and can then check in with the patient by sending a message through the app.
Subjects went home with an app that measured the ways they touched their phone’s display, which Dagum hoped would be an unobtrusive way to log these same kinds of behavior on a smartphone.
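Mindstrong has not published its feature set or model, but the general idea of turning timestamped touch events into behavioural signals can be sketched as follows (function names and features are hypothetical):

```python
# Hypothetical sketch: derive simple behavioural features from
# timestamped touch events, in the spirit of (not identical to)
# what a passive phone-monitoring app might compute.
from statistics import mean, stdev

def touch_features(timestamps):
    """Summarize inter-touch latencies from a sorted list of
    touch-event timestamps (in seconds)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "mean_latency": mean(gaps),
        "latency_variability": stdev(gaps) if len(gaps) > 1 else 0.0,
    }

# A short, illustrative typing burst.
feats = touch_features([0.0, 0.4, 0.9, 1.2, 2.0])
print(feats)
```

Shifts in aggregates like these over days or weeks, rather than any single reading, are the kind of signal that could flag a change in cognition or mood for a clinician to follow up on.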
Brain-disorder treatment has stalled in part because doctors simply don’t know that someone’s having trouble until it’s well advanced; Dagum believes Mindstrong can figure it out much sooner and keep an eye on it 24 hours a day.
In its current form, the Mindstrong app that patients see is fairly sparse.
“There are people who are high utilizers of health care and they’re not getting the benefits, so we’ve got to figure out some way to get them something that works better.” Actually predicting that a patient is headed toward a downward spiral is a harder task, but Dagum believes that having more people using the app over time will help cement patterns in the data.
About 1,500 of the 2,000 participants also let a Mindstrong keyboard app run on their smartphones to collect data about the ways they type and figure out how their cognition changes throughout the year.

The original article.

Summary of “3 Ways to Build a Data-Driven Team”

Foster critical thinking: While much of the current discussions around data focus on the role of technology and AI, it is really the human side of the equation that will remain the biggest differentiator for teams and organizations.
As organizations turbocharge their ability to gather more and more data – and it’s not so much about size, but rather about quality – what matters most is having people who can ask the right questions of the data.
Although people will differ in their general predisposition towards critical thinking, you can help them develop whatever potential they have if you put in place the right incentives, give people accurate feedback, and establish an informal and non-hierarchical learning culture where people can share views and ideas.
The implications are obvious: if you want your team to embrace, or at least keep up with, the current data revolution, and approach work in a more evidence-based way, you will need to train them.
Many top universities – including those in the Ivy League – offer free online courses on AI, data visualization, and data science, and leading corporations in this space, such as Google, offer a wide range of free resources and online courses on AI, analytics, and big data.
Hire the right people: When it comes to the training of quantitative, data-driven, or fact-based reasoning skills, there is well-established evidence for the competencies that predict individuals’ likelihood to learn and display these skills.
More specifically, individuals with higher quantitative or numerical ability levels will find it much easier to pick up any training related to data analytics.
This may sound obvious, but the practical implication is that if you want your team to be quantitatively skilled, your best bet is to avoid hiring people with lower levels of numerical reasoning ability.

The original article.