Summary of “Facebook security chief rants about misguided ‘algorithm’ backlash”

The issue isn’t that Facebook doesn’t know algorithms can be biased, or that people don’t appreciate how tough these problems are. It’s that the company failed to anticipate abuses of the platform and to work harder on algorithms and human moderation processes that could have blocked fake news and fraudulent ad buys before they impacted the 2016 U.S. presidential election, rather than only after.
Stamos was the Chief Information Security Officer at Yahoo before taking the CSO role at Facebook in mid-2015.
The sprawling response to recent backlash comes right as Facebook starts making the changes it should have implemented before the election.
Yesterday, Facebook updated an October 2nd blog post about disclosing Russian-bought election interference ads to Congress to note that “Of the more than 3,000 ads that we have shared with Congress, 5% appeared on Instagram. About $6,700 was spent on these ads”, implicating Facebook’s photo-sharing acquisition in the scandal for the first time.
An understanding of the risks of algorithms is what has kept Facebook from implementing them over-aggressively in ways that could have led to censorship. That caution is responsible, but it doesn’t solve the urgent problem of abuse at hand.
Even though Facebook prints money, some datasets are simply too big to review manually no matter how many people the company hires, so Stamos believes algorithms are an unavoidable tool.
No one is calling for Facebook to be haphazard with the creation of these algorithms.
Facebook needed to think long and hard about how its systems could be abused if speech wasn’t controlled in any way and fake news or ads were used to sway elections.

The original article.

Summary of “‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia”

Facebook’s “Like” feature was, Rosenstein says, “wildly” successful: engagement soared as people enjoyed the short-term boost they got from giving or receiving social affirmation, while Facebook harvested valuable data about the preferences of users that could be sold to advertisers.
Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
A graduate of Stanford University, Harris studied under BJ Fogg, a behavioural psychologist revered in tech circles for mastering the ways technological design can be used to persuade people.
“A handful of people, working at a handful of technology companies, through their choices will steer what a billion people are thinking today,” he said at a recent TED talk in Vancouver.
Tech companies can exploit such vulnerabilities to keep people hooked: manipulating, for example, when people receive “Likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, in need of approval, or maybe just bored.
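To make the mechanism Harris describes concrete, here is a minimal, purely hypothetical Python sketch of notification batching: “Like” notifications are held back and released in a burst when a wholly invented receptivity signal is high. Nothing here reflects Facebook’s actual systems; every name and threshold is an assumption for illustration.

```python
from collections import deque

class LikeBatcher:
    """Hypothetical sketch of delayed "Like" delivery: notifications are
    queued instead of sent immediately, then flushed in a burst when an
    invented engagement signal says the user is most receptive."""

    def __init__(self, receptivity_threshold=0.7):
        self.queue = deque()  # held-back notifications
        self.receptivity_threshold = receptivity_threshold

    def on_like(self, like_event):
        # Hold the notification rather than delivering it right away.
        self.queue.append(like_event)

    def maybe_flush(self, predicted_receptivity):
        # predicted_receptivity in [0, 1]: imagine it spikes when the
        # user is idle, bored, or has just opened the app.
        if predicted_receptivity >= self.receptivity_threshold and self.queue:
            burst = list(self.queue)
            self.queue.clear()
            return burst  # delivered together as one burst
        return []
```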
A friend at Facebook told Harris that designers initially decided the notification icon, which alerts people to new activity such as “Friend requests” or “Likes”, should be blue.
He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention.
“The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says.

The original article.

Summary of “Facebook’s Harm Is Taking Life Out of Context”

These incidents no longer seem like accidents, but what’s the right framework for thinking about the underlying failings of Facebook? I have a nomination: For all the wonders of contemporary technology, it is not so good at producing social context.
It probably wouldn’t go well, in large part because the surrounding social context would make it clear what was going on, namely a clumsy attempt to boost a foreign cause for self-serving reasons.
So how do these recent incidents tie into the longstanding complaints from the tech critics? Arguably Facebook is making it too easy for us to be superficially sociable, at the expense of deeper social and cultural context.
People have hardly stopped listening to music, but music is less moored to our social attachments, and it doesn’t seem to have the cultural force or social influence or political meaning of earlier times.
What we’ve done is strip away a lot of the social context and broader meaning surrounding those connections, in part because we no longer need music to signal our aspirations and our social standing.
As with the Russian propaganda, that too is a problem of missing social context.
When social context was front and center, as in the older world of mainstream media, fake news was harder to pull off.
In essence, Facebook makes it too easy for us to communicate without the background social production of context.

The original article.

Summary of “Facebook’s war on free will”

In the hands of Google and Facebook, these algorithms grew ever more powerful.
Here’s a brief explanation for the sliver of humanity who have apparently resisted Facebook: the news feed provides a reverse chronological index of all the status updates, articles and photos that your friends have posted to Facebook.
So Facebook makes its own choices about what should be read. The company’s algorithms sort the thousands of things a Facebook user could possibly see down to a smaller batch of choice items.
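As the next sentence notes, the real ranking model is opaque; still, the general shape of such a system can be sketched: score each candidate story, sort by score, keep the top few. The features and weights below are invented for illustration and are not Facebook’s actual signals.

```python
from dataclasses import dataclass

@dataclass
class Story:
    author_affinity: float       # how often the viewer interacts with the poster
    predicted_engagement: float  # a model's guess that the viewer will click/like
    age_hours: float             # how old the post is

def score(story: Story) -> float:
    # Invented weights; a real ranker would learn these from behavioural data.
    recency = 1.0 / (1.0 + story.age_hours)
    return (0.5 * story.author_affinity
            + 0.4 * story.predicted_engagement
            + 0.1 * recency)

def rank_feed(candidates: list[Story], k: int = 20) -> list[Story]:
    """Sort the thousands of candidate stories down to a small ordered batch."""
    return sorted(candidates, key=score, reverse=True)[:k]
```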
Facebook’s algorithm couldn’t be more opaque.
There’s no doubting the emotional and psychological power possessed by Facebook – or, at least, Facebook doesn’t doubt it.
Facebook has even touted the results from these experiments in peer-reviewed journals: “It is possible that more of the 0.60% growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook,” said one study published in Nature in 2012.
In the meantime, Facebook will keep probing – constantly testing to see what we crave and what we ignore, a never-ending campaign to improve Facebook’s capacity to give us the things that we want and things we don’t even know we want.
Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction.

The original article.

Summary of “The Super-Aggregators and the Russians – Stratechery by Ben Thompson”

Mark Warner, the senior Senator from Virginia, is referring to a Russian company, thought to be linked to the Kremlin’s propaganda efforts, having bought $100,000 worth of political ads on Facebook, some number of which directly mentioned 2016 presidential candidates Donald Trump and Hillary Clinton.
Facebook has released limited details about the ads, likely due to its 2012 consent decree with the FTC, which bars the company from unilaterally making private information public, as well as the problematic precedent of releasing information without a clear order compelling said release.
What makes Facebook and Google unique is that not only do they have zero transaction costs when it comes to serving end users, they also have zero transaction costs when it comes to both suppliers and advertisers.
There is still one more thing that separates Facebook and Google from the rest: advertisers.
Rather, the companies that will be hurt are those seeking to knock Google and Facebook off their perch; given that they are not yet super-aggregators, they will not have the feedback loops in place to overcome overly prescriptive regulation such that they can seriously challenge Google and Facebook.
The reality is that given that Google and Facebook make most of their money on their own sites, they will be hurt far less than competitive ad networks that work across multiple sites; that means that even more digital advertising money – which will continue to grow, regardless of regulation – will flow to Google and Facebook.
To be sure, that doesn’t mean regulation isn’t appropriate – it should be far more obvious to everyone that Russians were purchasing election-related ads on Facebook – but rather that it be expressly designed to limit the worst abuses and enable meaningful competitors, even if they accept payment in Russian Rubles.
For what it’s worth, Stratechery has never actually taken out a Facebook ad, or any ad for that matter.

The original article.

Summary of “Facebook Faces a New World as Officials Rein In a Wild Web”

The scale of the Chinese government’s use of Facebook to communicate abroad offers a notable sign of Beijing’s understanding of Facebook’s power to mold public opinion.
Facebook was finalizing plans, more than two years in the making, for WhatsApp, the messaging app it had bought in 2014, to start sharing data on its one billion users with its new parent company.
A month after the new data-sharing deal started in August 2016, German privacy officials ordered WhatsApp to stop passing data on its 36 million local users to Facebook, claiming people did not have enough say over how it would be used.
The goal of European regulators, officials said, is to give users greater control over the data from social media posts, online searches and purchases that Facebook and other tech giants rely on to monitor our online habits.
As a tech company whose ad business requires harvesting digital information, Facebook has often underestimated the deep emotions that European officials and citizens have tied into the collection of such details.
On Sept. 12, Spain’s privacy agency fined the company 1.2 million euros for not giving people sufficient control over their data when Facebook collected it from third-party websites.
“Facebook simply can’t stick to a one-size-fits-all product around the world,” said Max Schrems, an Austrian lawyer and Facebook critic who filed the case that eventually overturned the 15-year-old data-transfer deal.
“I prefer using Facebook because that’s where my customers are. The first thing people want to do when they buy a smartphone is to open a Facebook account.”

The original article.

Summary of “The Fake-News Fallacy”

On the evening of October 30, 1938, a seventy-six-year-old millworker in Grover’s Mill, New Jersey, named Bill Dock heard something terrifying on the radio.
After investigating, as a newspaper later reported, he “didn’t see anybody he thought needed shooting.” In fact, he’d been duped by Orson Welles’s radio adaptation of “The War of the Worlds.” Structured as a breaking-news report that detailed the invasion in real time, the broadcast adhered faithfully to the conventions of news radio, complete with elaborate sound effects and impersonations of government officials, with only a few brief warnings throughout the program that it was fiction.
Newspapers wanted to show that radio was irresponsible and needed guidance from its older, more respectable siblings in the print media; such “guidance” mostly took the form of lucrative licensing deals and increased ownership of local radio stations.
The Nazi Ministry for Public Enlightenment and Propaganda deployed a force called the Funkwarte, or Radio Guard, that went block by block to insure that citizens tuned in to Hitler’s major broadcast speeches, as Tim Wu details in his new book, “The Attention Merchants.” Meanwhile, homegrown radio demagogues like Father Charles Coughlin and the charismatic Huey Long made some people wonder about a radio-aided Fascist takeover in America.
Although radio can seem like an unremarkable medium (audio wallpaper pasted over the most boring parts of your day), the historian David Goodman’s book “Radio’s Civic Ambition: American Broadcasting and Democracy in the 1930s” makes it clear that the birth of the technology brought about a communications revolution comparable to that of the Internet.
John Dewey called radio “the most powerful instrument of social education the world has ever seen.” Populist reformers demanded that radio be treated as a common carrier and give airtime to anyone who paid a fee.
A “vague claim of exclusion” sharpened into a “powerful and effective ideological arrow in the conservative quiver,” Hemmer argues, through battles that conservative radio broadcasters had with the F.C.C. in the nineteen-fifties and sixties.
A fund-raising letter for a prominent conservative radio show railed against the doctrine, calling it “the most dastardly collateral attack on freedom of speech in the history of the country.” Thus was born the character of the persecuted truthteller standing up to a tyrannical government, a trope on which a billion-dollar conservative-media juggernaut has been built.

The original article.

Summary of “We need to nationalise Google, Facebook and Amazon. Here’s why”

Ello’s rapid rise and fall is symptomatic of our contemporary digital world and the monopoly-style power accruing to the 21st century’s new “platform” companies, such as Facebook, Google and Amazon.
The platform – an infrastructure that connects two or more groups and enables them to interact – is crucial to these companies’ power.
Platforms, as spaces in which two or more groups interact, provide what is in effect an oil rig for data.
Every interaction on a platform becomes another data point that can be captured and fed into an algorithm.
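In code terms, the “oil rig” metaphor is mundane: each user action is serialized as an event record and appended to a log that downstream models consume. A minimal sketch, with hypothetical field names:

```python
import json
import time

def capture_interaction(user_id, action, target, log_path="events.log"):
    """Append one user interaction to a log as a single data point.
    Downstream, records like these would be fed into ranking and
    ad-targeting models; all field names here are hypothetical."""
    event = {
        "user": user_id,
        "action": action,   # e.g. "like", "share", "dwell"
        "target": target,   # e.g. a post or page identifier
        "ts": time.time(),  # when the interaction happened
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(event) + "\n")
```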
At the heart of platform capitalism is a drive to extract more data in order to survive.
Facebook is a master at using all sorts of behavioural techniques to foster addictions to its service: how many of us scroll absentmindedly through Facebook, barely aware of it?
Others have simply bought up smaller companies: Facebook has swallowed Instagram, WhatsApp, and Oculus, while investing in drone-based internet, e-commerce and payment services.
All the dynamics of platforms are amplified once AI enters the equation: the insatiable appetite for data, and the winner-takes-all momentum of network effects.

The original article.

Summary of “The Messy, Confusing Future of TV? It’s Here”

One, an ESPN-branded streaming sports service, will be available early next year, while the other, focusing on Disney movies and shows, will go live in 2019. A day later, Facebook announced Watch, a tab inside the main Facebook app that will soon host a slate of professionally produced video series.
The company says people will be able to enjoy premium fare like “Returning the Favor,” starring the “Dirty Jobs” host Mike Rowe; a reality show about tiny houses; and “Bae or Bail,” which Facebook describes thusly: “Unsuspecting couples put their relationship and wits to the test as they’re thrown into terrifying scenarios.”
The theory is that the more time Facebook users spend watching video, the more ads they’ll see.
Facebook doesn’t have a huge library of popular content like Disney, but it does have a treasure trove of data about the personal tastes and preferences of its more than two billion registered users, and presumably plans to use that data to target ads at exactly the people companies want to reach.
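Mechanically, targeting “exactly the people companies want to reach” can be pictured as intersecting an advertiser’s criteria with per-user preference profiles. The sketch below is a toy illustration with invented profile fields, not a description of Facebook’s ad system.

```python
def matches(profile, criteria):
    """True if a user's (hypothetical) preference profile satisfies
    every attribute the advertiser asked for."""
    return all(profile.get(key) == value for key, value in criteria.items())

def build_audience(profiles, criteria):
    """Select the user ids whose profiles match the advertiser's criteria."""
    return [uid for uid, profile in profiles.items() if matches(profile, criteria)]

# Toy example: target users interested in tiny houses in the US.
profiles = {
    "u1": {"interest": "tiny houses", "country": "US"},
    "u2": {"interest": "sports", "country": "US"},
}
print(build_audience(profiles, {"interest": "tiny houses", "country": "US"}))  # ['u1']
```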
More than $70 billion per year is spent on traditional television advertisements, and as that pile of money shifts to digital video, Facebook presumably wants to make sure a hefty chunk ends up in its own pockets.
On the surface, these seem like very different strategies – selling premium video to an existing audience of fans, versus giving away premium video in an effort to sell hyper-targeted ads and attract a network of amateurs.
There are already Netflix and Hulu, single-company services like CBS All Access, and “skinny bundles” such as PlayStation Vue and Sling TV, not to mention the endless amateur video available from Facebook, Twitter and YouTube.
“One of the barriers to entry for the consumer right now is simply confusion,” said Paul Verna, the principal video analyst at eMarketer, a media research firm.

The original article.

Summary of “Facebook knew about Snap’s struggles months before the public”

This isn’t the first time Facebook has used Onavo’s app usage data to make major decisions.
The info reportedly influenced the decision to buy WhatsApp, as Facebook knew that WhatsApp’s dominance in some areas could cut it out of the loop.
To be clear, Facebook isn’t grabbing this data behind anyone’s back.
The revelation here is more about how Facebook uses that information rather than the collection itself.
Former Federal Trade Commission CTO Ashkan Soltani tells the WSJ that Facebook is turning customers’ own data against them by using it to snuff out competitors.
Tech lawyer Adam Shevell is concerned that Facebook might be violating Apple’s App Store rules by collecting data that isn’t directly relevant to app use or ads.
No matter what, the news underscores just how hard it is for upstarts to challenge Facebook’s dominant position.
How do you compete with an internet giant that can counter your app’s features the moment they become popular? This doesn’t make Facebook immune to competition, but app makers definitely can’t assume that they’ll catch the firm off-guard.

The original article.