Summary of “Quitting Instagram: She’s one of the millions disillusioned with social media. But she also helped create it.”

Richardson isn’t a bystander reckoning with the ills of technology: She was one of the 13 original employees working at Instagram in 2012 when Facebook bought the viral photo-sharing app for $1 billion.
With the founders’ exit, Richardson and other former Instagram employees worried Facebook would squash whatever independent identity the company had managed to retain.
Three of the early Instagram employees, including Richardson, have deleted the app – some permanently, others periodically – comparing it to a drug that produces a diminishing high.
Ian Spalter, Instagram’s head of design, said in an interview that experiences on Instagram are subjective – one person’s frustration may be another person’s pleasure – and that the app was not designed to be a time-suck.
Three of the original 13 employees are still at Instagram or Facebook, according to Facebook.
Spalter, the Instagram design chief, pointed out that Instagram’s rapid growth has required the company to build tools that help people find posts and other users.
Instagram is aware that its software was offering up too much celebrity content and content from people with large followings at the expense of posts from people whom users know personally, according to Spalter, who joined Instagram in 2015.
She called up a friend from her Instagram days, and they concluded that Instagram no longer had value in their lives.

The original article.

Summary of “Big tech must not reframe digital ethics in its image – TechCrunch”

Ethics are needed to fill the gaps where the law has not yet caught up with new uses of data.
The Facebook founder seized on the conference’s discussion topic of big data ethics and tried to zoom right back out again.
The ‘we’re not perfect and have lots more to learn’ line, which came from both CEOs, seems mostly intended to manage regulatory expectations vis-à-vis data protection – and indeed on the wider ethics front.
The growing public and political alarm over how big data platforms stoke addiction and exploit people’s trust and information – and the idea that an overarching framework of not just laws but digital ethics might be needed to control this stuff – dovetails neatly with the alternative track that Apple has been pounding for years.
Only a handful of tech giants, though, have built unchallengeably massive tracking empires via the systematic exploitation of other people’s data.
“You have to do your homework as a company to think about fairness,” said Elizabeth Denham, when asked ‘who decides what’s fair’ in a data ethics context.
The closed session of the conference produced a declaration on ethics and data in artificial intelligence – setting out a list of guiding principles to act as “Core values to preserve human rights” in the developing AI era – which included concepts like fairness and responsible design.
The consensus from the event is that it’s not only possible but vital to engineer ethics into system design from the start whenever you’re doing things with other people’s data.

The original article.

Summary of “Meet the 23-year-old engineering detective behind the biggest leaks in tech”

Jane Wong is a 23-year-old studying software engineering at UMass Dartmouth, currently taking a gap year in Hong Kong.
Scroll through Wong’s Twitter and you’ll see feature after feature, leak after leak – Gboard’s new Material Design for search cards, a redesign of Facebook’s notification page, a dark mode for Messenger.
There are many reverse-engineering hobbyists out there, but Wong works independently and for no money.
Wong reverse-engineers apps to find out what features tech giants have been testing recently. Most famously, she was the first to leak Facebook’s dating feature – publications such as Engadget, TechCrunch, and The Verge were all quick to pick up the story after her.
While Wong remains relatively low-key in the tech community – she has around 6,000 Twitter followers – this can open her up to others trying to steal her work: “There were a few incidents where I caught individuals plagiarizing/freebooting my scoops – freebooting as in downloading the screenshots and reposting it, with little to no citations, and without adding any additional contribution on top of it.”
Wong doesn’t let it dissuade her from continuing her work: “I think there are better things to focus on than those individuals, but I called some out on Twitter. Some would add only the minimum amount of citation possible, and then tell people to follow their profiles or join their Facebook Groups for scoops they did not find.”
Wong is currently still a student, but when she graduates she hopes to find a job at one of the platforms she reverse-engineers.
It may be just a hobby, but reverse engineers like Wong are changing the tech industry.

The original article.

Summary of “Here’s How A Handful Of American Tech Companies Radicalized The World”

Why is an American company like Facebook placing ads in newspapers in countries like India, Italy, Mexico, and Brazil, explaining to local internet users how to look out for abuse and misinformation? Because our lives, societies, and governments have been tied into invisible feedback loops, online and off.
The Philippines’ 2016 election is seen as the country’s first Facebook election.
Two months prior, Facebook declares the gun-toting former mayor the “undisputed king of Facebook conversations,” and a cast of far-right internet celebrities begins creating an ad hoc propaganda network around him.
It’s June 2018, and British far-right influencer Tommy Robinson is in jail after going live on Facebook outside Leeds Crown Court, violating British contempt-of-court laws.
Inside, López Obrador says, “The transformation we will carry out will basically consist of kicking out corruption from our country.” Online, thousands of bots are pushing pro-AMLO trending topics on Twitter and flooding Facebook news feeds with fake news about the new president.
In a Facebook video several days later, he says that if he becomes president, he aims to change a rule created by WhatsApp that limits the number of messages a user can send at once.
Your trolls will probably have been radicalized online via some kind of community for young men like #GamerGate, jeuxvideo.com in France, ForoCoches in Spain, Ilbe Storehouse in South Korea, 2chan in Japan, or banter Facebook pages in the UK. Then far-right influencers start appearing, aided by algorithms recommending content that increases user watch-time.
Alex Stamos, Facebook’s former chief security officer, published a piece in August saying that it was already too late for Facebook to protect the 2018 US midterm elections from misinformation campaigns from Russia and Iran.

The original article.

Summary of “Facebook Is Full of Emotional-Support Groups”

It’s not surprising that Facebook has turned into a gathering place for strangers sharing their deepest secrets.
Emotional-support groups have sprung up around topics broad and narrow: diabetes, addiction, egg donation, a specific birth-control device now pulled from the U.S. market, parenting children who might grow up to be psychopaths, rare diseases that affect only a few dozen patients in the whole world.
The internet has always promised to connect people by common interest rather than geography, and with its 2-billion-user base, Facebook is where those connections are often being made.
“For people searching for support, [Facebook] is a one-stop shop,” says Andrea Downing, a moderator for BRCA Sisterhood, a support group for women who have tested positive for breast-cancer mutations.
Might is a member of multiple Facebook groups for NGLY1 and related diseases, where members support one another through health crises and share hard-won medical information about the rare disease.
Since its pivot to groups, Facebook has added several tools for group admins, including ways to filter membership requests and delete content from banned members.
Most important, perhaps, it made the membership of closed groups private.
Last year, Catherine St Clair decided to start a support group for people whose DNA tests revealed unexpected biological parents, after meeting another woman in the same situation on Facebook.

The original article.

Summary of “The Problem with Facebook and Virtual Reality – Stratechery by Ben Thompson”

No one plans to visit Facebook: who among us has “Facebook Time” set on our calendar? And yet the vast majority of people who are able – over 2 billion worldwide – visit Facebook every single day, for minutes at a time.
More striking is Zuckerberg’s assessment that Facebook was now in a position to focus elsewhere: after the revelations of state-sponsored interference and legitimate questions about Facebook’s broader impact on society, it seems rather misguided.
The problem for Facebook is that the fundamental nature of the company – not to mention Zuckerberg’s platform ambitions – relies on serving as many customers as possible.
What is inevitable though – what was always inevitable, from the day Facebook bought Oculus – is that this acquisition will turn out to be a mistake.
If Facebook wanted a presence in virtual reality the best possible route was the same it took in mobile: to be an app-exposed service, available on all devices, funded by advertising.
Make no mistake, Zuckerberg gave an impressive demo of what can happen when Facebook controls your eyes in virtual reality; what concerns me is the real world results of Facebook controlling everyone’s attention with the sole goal of telling each of us what we want to hear.
Again Facebook aside, virtual reality is more compelling than you might think.
To that end, you can be sure that any Facebook executive would be happy to explain why virtual reality and Oculus is a step in that direction.

The original article.

Summary of “The Facebook-Driven Video Push May Have Cost 483 Journalists Their Jobs”

First was Upworthy, once a beneficiary of Facebook’s algorithmic largesse, which rang in 2016 with 14 layoffs, part of a move into “original video content.” Four months later, Mashable laid off 30 employees in a pivot to “non-news video content.” That November, Fusion laid off 70 people, in part because big bets within social video did not generate enough revenue.
In February 2017, Thrillist’s parent company let more than 20 people go, but was “continuing to dream in video.” In June 2017, Vocativ laid off 20 editorial staff members “in an organizational shift to an entirely video-first strategy.” Later that month, MTV News laid off at least nine employees and freelancers, “with an eye toward creating more video.” Fox Sports also released 20 writers and editors on the same day, “replacing them with a similar number of jobs in video.” The next month, Vice fired 60 employees while promising to focus on video production.
In August, Mic dismissed 25 people from its news and editorial departments to refocus on “new mixed-media formats in social video.”
That month, Digiday reported that a “side effect of the pivot to video” was “audience shrinkage,” citing similar declines at Mic and Vocativ.
“Facebook does not offer a viable path to monetize our in-depth video work,” its chief executive lamented in a memo.
CNN Digital eliminated “fewer than 50” positions, including in the video department.
Significantly, this number doesn’t include local newspapers that dropped staff while chasing video dollars.
Media executives ultimately made these decisions, and journalism was an unstable industry long before the first Facebook video.

The original article.

Summary of “I fell for Facebook fake news. Here’s why millions of you did, too.”

Fake news creators “aren’t loyal to any one ideology or geography,” said Tessa Lyons, the product manager for Facebook’s News Feed tasked with reducing misinformation.
Thanks in part to those efforts, independent fact-checkers and some new technologies, Facebook user interaction with known fake news sites has declined by 50 percent since the 2016 election, according to a study by Stanford and New York University.
The crazy plane video first appeared Sept. 13 on a Facebook page called Time News International.
Why would someone turn Tsirbas’ airplane video into a fake news report?
The Time News International page doesn’t regularly link to outside articles, though it posts a lot of outrageous photos and videos about topics in the news.
Facebook’s response to the plane video shows how far it’s come in the fight with fake news – and how far we have to go.
On Sept. 17, a few days after it was posted, the video was detected by Facebook’s machine-learning systems, programs that try to automatically detect fake news.
After Snopes labeled it as “False,” Facebook made it show up less often in News Feeds.

The original article.