echo: coffee_klatsch
to: All
from: Roger Nelson
date: 2019-04-11 12:14:22
subject: FB Part 2

* Copied (from: COFFEE_KLATSCH) by Roger Nelson using timEd/386 1.10.y2k+.

"They rolled out end-to-end encryption and made it happen for a billion
people in WhatsApp," Pfefferkorn said. "It's not necessarily
impossible."
WhatsApp's past is now Facebook's future
 
In looking to the future, Zuckerberg first looks back.
 
To lend some authenticity to this new-and-improved private Facebook,
Zuckerberg repeatedly invokes a previously acquired company's reputation
to bolster Facebook's own.
 
WhatsApp, Zuckerberg said, should be the model for the all-new Facebook.
 
"We plan to build this [privacy-focused platform] the way we've
developed WhatsApp: focus on the most fundamental and private use
case, messaging; make it as secure as possible, and then build more
ways for people to interact on top of that," Zuckerberg said.
 
The secure messenger, which Facebook purchased in 2014 for $19 billion, is
a privacy exemplar. It developed default end-to-end encryption for users in
2016 (under Facebook's ownership), refuses to store keys that would grant
access to users' messages, and tries to limit user data collection as much
as possible.
 
Still, many users believed that WhatsApp joining Facebook represented a
death knell for user privacy. One month after the sale, WhatsApp
co-founder Jan Koum tried to dispel fears that WhatsApp's vision had
been compromised.
 
"If partnering with Facebook meant that we had to change our values,
we wouldn't have done it," Koum wrote.
 
Four years after the sale, something changed.
 
Koum left Facebook in March 2018, reportedly troubled by Facebook's
approach to privacy and data collection. Koum's departure followed that of
his co-founder Brian Acton the year before.
 
In an exclusive interview with Forbes, Acton explained his decision to
leave Facebook. It was, he said, very much about privacy.
 
"I sold my users' privacy to a larger benefit," Acton said.
"I made a choice and a compromise. And I live with that every
day."
 
Strangely, in defending Facebook's privacy record, Zuckerberg avoids a
recent pro-encryption episode. Last year, Facebook fought, and
prevailed, against a US government request to reportedly "break the
encryption" in its Facebook Messenger app. Zuckerberg also neglects to
mention Facebook's successful roll-out of optional end-to-end encryption
in that same app.
 
Further, relying so heavily on WhatsApp as a symbol of privacy is tricky.
After all, Facebook didn't purchase the company because of its philosophy.
Facebook purchased WhatsApp because it was a threat.
Facebook's history of missed promises
 
Zuckerberg's statement promises users an entirely new Facebook, complete
with end-to-end encryption, ephemeral messages and posts, less intrusive
and less permanent data collection, and no data storage in countries that
have abused human rights.
 
These are strong ideas. End-to-end encryption is a crucial security measure
for protecting people's private lives, and Facebook's promise to refuse to
store encryption keys only further buttresses that security. Ephemeral
messages, posts, photos, and videos give users the opportunity to share
their lives on their own terms. Refusing to put data in known
human-rights-abusing regimes could represent a potentially significant
market share sacrifice, giving Facebook a chance to prove its commitment to
user privacy.
 
But Facebook's promise-keeping record is far lighter than its
promise-making record. In the past, whether Facebook promised a new product
feature or better responsibility to its users, the company has repeatedly
missed its own mark.
 
In April 2018, TechCrunch revealed that, as far back as 2010, Facebook
deleted some of Zuckerberg's private conversations and any record of his
participation, retracting his sent messages from both his own inbox and
the inboxes of his friends. The company also performed this deletion, a
capability unavailable to ordinary users, for other executives.
 
Following the news, Facebook announced a plan to give its users an
"unsend" feature.
 
But nearly six months later, the company had failed to deliver on that
promise. It wasn't until February of this year that Facebook produced a
half-measure: instead of giving users the ability to actually delete sent
messages, as Facebook did for Zuckerberg, users could "unsend"
an accidental message in the Messenger app within 10 minutes of sending
it.
 
Gizmodo labeled it a "bait-and-switch."
 
In October 2016, ProPublica purchased an advertisement in Facebook's
"housing categories" that excluded groups of users who were
potentially African-American, Asian American, or Hispanic. One civil rights
lawyer called this exclusionary function "horrifying."
 
Facebook quickly promised to improve its advertising platform by removing
exclusionary options for housing, credit, and employment ads, and by
rolling out better auto-detection technology to stop potentially
discriminatory ads before they published.
 
One year later, in November 2017, ProPublica ran its experiment again.
Discrimination, again, proved possible. The anti-discrimination tools
Facebook had announced the year before caught nothing.
 
"Every single ad was approved within minutes," the article said.
 
This time, Facebook shut the entire functionality down, according to a
letter from Chief Operating Officer Sheryl Sandberg to the Congressional
Black Caucus. (Facebook also announced the changes on its website.)
 
More recently, Facebook failed to deliver on a promise that users' phone
numbers would be protected from search. Today, through a strange
workaround, users can still be "found" through the phone number
that Facebook asked them to provide specifically for two-factor
authentication.
 
Away from product changes, Facebook has repeatedly told users that it would
commit itself to user safety, security, and privacy. The actual track
record following those statements tells a different story, though.
 
In 2013, an Australian documentary filmmaker met with Facebook's public
policy and communications lead and warned him of the rising hate speech
problem on Facebook's platform in Myanmar. The country's ultranationalist
Buddhists were making false, inflammatory posts about the local Rohingya
Muslim population, sometimes demanding violence against them. Riots had
taken 80 people's lives the year before, and thousands of Rohingya were
forced into internment camps.
 
Facebook's public policy and communications lead, Elliot Schrage, sent the
Australian filmmaker, Aela Callan, down a dead end.
 
"He didn't connect me to anyone inside Facebook who could deal with
the actual problem," Callan told Reuters.
 
By November 2017, the problem had exploded, with Myanmar torn and its
government engaging in what the United States called "ethnic
cleansing" against the Rohingya. In 2018, investigators from the
United Nations placed blame on Facebook.
 
"I'm afraid that Facebook has now turned into a beast," said one
investigator.
 
During the years before, Facebook made no visible effort to fix the
problem. By 2015, the company employed just two content moderators who
spoke Burmese, the primary language in Myanmar. By mid-2018, the company's
content reporting tools had still not been translated into Burmese,
handicapping the population's ability to protect itself online. Facebook
had also not hired a single employee in Myanmar at that time.
 
In April 2018, Zuckerberg promised to do better. Four months later, Reuters
discovered that hate speech still ran rampant on the platform and that
hateful posts dating back as far as six years had not been removed.
 
The international crises continued.
 
In March 2018, The Guardian revealed that a European data analytics company
had harvested the Facebook profiles of tens of millions of users. This was
the Cambridge Analytica scandal, and, for the first time, it directly
implicated Facebook in an international campaign to sway the US
presidential election.
 
Buffeted on all sides, Facebook released ... an ad campaign. Drenched in
sentimentality and barren of culpability, a campaign commercial vaguely
said that "something happened" on Facebook: "spam,
clickbait, fake news, and data misuse."
 
"That's going to change," the commercial promised. "From now
on, Facebook will do more to keep you safe and protect your privacy."
 
Here's what happened since that ad aired in April 2018.
 
The New York Times revealed that, throughout the past 10 years, Facebook
shared data with at least 60 device makers, including Apple, Samsung,
Amazon, Microsoft, and Blackberry. The New York Times also published an
investigatory bombshell into Facebook's corporate culture, showing that,
time and again, Zuckerberg and Sandberg responded to corporate crises with
obfuscation, deflection, and, in the case of one transparency-focused
project, outright anger.
 
A British parliamentary committee released documents that showed how
Facebook gave some companies, including Airbnb and Netflix, access to its
platform in exchange for favors. (More documents released this year showed
prior attempts by Facebook to sell user data.) Facebook's Onavo app was
removed from Apple's App Store for gathering user data. Facebook also
reportedly paid users as young as 13 to install the
"Facebook Research" app on their own devices, an app intended
strictly for Facebook employee use.
 
Oh, and Facebook suffered a data breach that affected up to 50 million
users.
 
While the substance of Zuckerberg's promises could protect user privacy,
the execution of those promises is still up in the air. It's not that users
don't want what Zuckerberg is describing-it's that they're burnt out on
him. How many times will they be forced to hear about another change of
heart before Facebook actually changes for good?
 
Tomorrow's Facebook
 
Changing the direction of a multibillion-dollar, international company is
tough work, though several experts sound optimistic about Zuckerberg's
privacy roadmap. But just as many experts have lost their faith in the
company. If anything, public pressure on Facebook might be at its lowest
point: detractors have removed themselves from the platform entirely, and
supporters will continue to dig deep into their reserves of good will.
 
What Facebook does with this opportunity is entirely under its own control.
Users around the world will be better off if the company decides that, this
time, it's serious about change. User privacy is worth the effort.
 
 
Regards,
 
Roger

--- D'Bridge (SR41)
* Origin: NCS BBS - Houma, LoUiSiAna (1:3828/7)

SOURCE: echomail via fidonet.ozzmosis.com
