Why Zuckerberg's 14-Year Apology Tour Hasn't Fixed Facebook

In 2003, one year before Facebook was founded, a website called Facemash began nonconsensually scraping pictures of students at Harvard from the school’s intranet and asking users to rate their hotness. Obviously, it caused an outcry. The website’s developer quickly proffered an apology. “I hope you understand, this is not how I meant for things to go, and I apologize for any harm done as a result of my neglect to consider how quickly the site would spread and its consequences thereafter,” wrote a young Mark Zuckerberg. “I definitely see how my intentions could be seen in the wrong light.”

In 2004 Zuckerberg cofounded Facebook, which rapidly spread from Harvard to other universities. And in 2006 the young company blindsided its users with the launch of News Feed, which collated and presented in one place information that people had previously had to search for piecemeal. Many users were shocked and alarmed that there was no warning and that there were no privacy controls. Zuckerberg apologized. “This was a big mistake on our part, and I'm sorry for it,” he wrote on Facebook’s blog. “We really messed this one up,” he said. “We did a bad job of explaining what the new features were and an even worse job of giving you control of them.”

Zeynep Tufekci (@zeynep) is an associate professor at the University of North Carolina and an opinion writer for The New York Times. She recently wrote about the (democracy-poisoning) golden age of free speech.

Then in 2007, Facebook’s Beacon advertising system, which was launched without proper controls or consent, ended up compromising user privacy by making people’s purchases public. Fifty thousand Facebook users signed an e-petition titled “Facebook: Stop invading my privacy.” Zuckerberg responded with an apology: “We simply did a bad job with this release and I apologize for it." He promised to improve. “I'm not proud of the way we've handled this situation and I know we can do better,” he wrote.

By 2008, Zuckerberg had written only four posts on Facebook’s blog: Every single one of them was an apology or an attempt to explain a decision that had upset users.

In 2010, after Facebook violated users' privacy by making key types of information public without proper consent or warning, Zuckerberg again responded with an apology—this time published in an op-ed in The Washington Post. “We just missed the mark,” he said. “We heard the feedback,” he added. “There needs to be a simpler way to control your information.” “In the coming weeks, we will add privacy controls that are much simpler to use,” he promised.

I’m going to run out of space here, so let’s jump to 2018 and skip over all the other mishaps and apologies and promises to do better—oh yeah, and the consent decree that the Federal Trade Commission made Facebook sign in 2011, charging that the company had deceptively promised privacy to its users and then repeatedly broken that promise—in the intervening years.

Last month, Facebook once again garnered widespread attention with a privacy-related backlash when it became widely known that, between 2008 and 2015, it had allowed hundreds, maybe thousands, of apps to scrape voluminous data from Facebook users—not just from the users who had downloaded the apps, but detailed information from all their friends as well. One such app was run by a Cambridge University academic named Aleksandr Kogan, who apparently siphoned up detailed data on up to 87 million users in the United States and then surreptitiously forwarded the loot to the political data firm Cambridge Analytica. The incident caused a lot of turmoil because it connects to the rolling story of distortions in the 2016 US presidential election. But in reality, Kogan’s app was just one among many, many apps that amassed a huge amount of information in a way most Facebook users were completely unaware of.

At first Facebook indignantly defended itself, claiming that people had consented to these terms; after all, the disclosures were buried somewhere in the dense language surrounding obscure user privacy controls. People were asking for it, in other words.

But the backlash wouldn’t die down. Attempting to respond to the growing outrage, Facebook announced changes. “It’s Time to Make Our Privacy Tools Easier to Find,” the company announced without a hint of irony—or any other kind of hint—that Zuckerberg had promised to do just that in the “coming weeks” eight full years ago. On the company blog, Facebook’s chief privacy officer wrote that instead of being “spread across nearly 20 different screens” (why were they ever spread all over the place?), the controls would now finally be in one place.

Zuckerberg again went on an apology tour, giving interviews to The New York Times, CNN, Recode, WIRED, and Vox (but not to the Guardian and Observer reporters who broke the story). In each interview he apologized. “I’m really sorry that this happened,” he told CNN. “This was certainly a breach of trust.”

But Zuckerberg didn’t stop at an apology this time. He also defended Facebook as an “idealistic company” that cares about its users and spoke disparagingly about rival companies that charge users money for their products while maintaining a strong record in protecting user privacy. In his interview with Vox’s Ezra Klein, Zuckerberg said that anyone who believes Apple cares more about users than Facebook does has “Stockholm syndrome”—the phenomenon whereby hostages start sympathizing and identifying with their captors.

This is an interesting argument coming from the CEO of Facebook, a company that essentially holds its users' data hostage. Yes, Apple charges handsomely for its products, but it also includes advanced encryption hardware on all its phones, delivers timely security updates to its whole user base, and has largely locked itself out of user data—to the chagrin of many governments, including that of the United States, and of Facebook itself.

Most Android phones, by contrast, gravely lag behind in receiving security updates, have no specialized encryption hardware, and often handle privacy controls in a way that is detrimental to user interests. Few governments or companies complain about Android phones. After the Cambridge Analytica scandal, it came to light that Facebook had been downloading and keeping all the text messages of its users on the Android platform—their content as well as their metadata. “The users consented!” Facebook again cried out. But people were soon posting screenshots that showed how difficult it was for a mere mortal to discern that’s what was going on, let alone figure out how to opt out, on the vague permission screen that flashed before users.

On Apple phones, however, Facebook couldn’t harvest people’s text messages because the permissions wouldn’t allow it.

In the same interview, Zuckerberg took wide aim at the oft-repeated notion that, if an online service is free, you—the user—are the product. He said that he found the argument that “if you’re not paying that somehow we can’t care about you, to be extremely glib and not at all aligned with the truth.” His rebuttal to that accusation, however, was itself glib; and as for whether it was aligned with the truth—well, we just have to take his word for it. “To the dissatisfaction of our sales team here,” he said, “I make all of our decisions based on what’s going to matter to our community and focus much less on the advertising side of the business.”

As far as I can tell, not once in his apology tour was Zuckerberg asked what on earth he means when he refers to Facebook’s 2 billion-plus users as “a community” or “the Facebook community.” A community is a set of people with reciprocal rights, powers, and responsibilities. If Facebook really were a community, Zuckerberg would not be able to make so many statements about unilateral decisions he has made—often, as he boasts in many interviews, in defiance of Facebook’s shareholders and various factions of the company’s workforce. Zuckerberg’s decisions are final, since he controls a majority of the voting power in Facebook, and always will until he decides not to—it’s just the way he has structured the company.

Facebook’s 2 billion users are not Facebook’s “community.” They are its user base, and they have been repeatedly carried along by the decisions of the one person who controls the platform. These users have invested time and money in building their social networks on Facebook, yet they have no means to port the connectivity elsewhere. Whenever a serious competitor to Facebook has arisen, the company has quickly copied it (Snapchat) or purchased it (WhatsApp, Instagram), often at a mind-boggling price that only a behemoth with massive cash reserves could afford. Nor do people have any means to completely stop being tracked by Facebook. The surveillance follows them not just on the platform, but elsewhere on the internet—some of them apparently can’t even text their friends without Facebook trying to snoop in on the conversation. Facebook doesn’t just collect data itself; it has purchased external data from data brokers; it creates “shadow profiles” of nonusers and is now attempting to match offline data to its online profiles.

Again, this isn’t a community; this is a regime of one-sided, highly profitable surveillance, carried out on a scale that has made Facebook one of the largest companies in the world by market capitalization.

There is no way to interpret Facebook’s privacy-invading moves over the years—even if it’s time to simplify! finally!—as anything other than decisions driven by a combination of self-serving impulses: namely, profit motives, the structural incentives inherent to the company’s business model, and the one-sided ideology of its founders and some executives. All these are forces over which the users themselves have little input, aside from the regular opportunity to grouse through repeated scandals. And even the ideology—a vague philosophy that purports to prize openness and connectivity with little to say about privacy and other values—is one that does not seem to apply to the people who run Facebook or work for it. Zuckerberg buys the houses surrounding his and tapes over his computer’s camera to preserve his own privacy, and company employees were up in arms when a controversial internal memo arguing for growth at all costs was recently leaked to the press—a nonconsensual, surprising, and uncomfortable disclosure of the kind that Facebook has routinely imposed upon its billions of users over the years.

This isn’t to say Facebook doesn’t provide real value to its users, even as it locks them in through network effects and by crushing, buying, and copying its competition. I wrote a whole book in which I document, among other things, how useful Facebook has been to anticensorship efforts around the world. It doesn’t even mean that Facebook executives make all decisions merely to increase the company valuation or profit, or that they don’t care about users. But multiple things can be true at the same time; all of this is quite complicated. And fundamentally, Facebook’s business model and reckless mode of operating are a giant dagger threatening the health and well-being of the public sphere and the privacy of its users in many countries.

So, here’s the thing. There is indeed a case of Stockholm syndrome here. There are very few other contexts in which a person would be allowed to make a series of decisions that have obviously enriched them while eroding the privacy and well-being of billions of people; to make basically the same apology for those decisions countless times over the space of just 14 years; and then to profess innocence, idealism, and complete independence from the obvious structural incentives that have shaped the whole process. This should ordinarily cause all the other educated, literate, and smart people in the room to break into howls of protest or laughter. Or maybe tears.

Facebook has tens of thousands of employees, and reportedly an open culture with strong internal forums. Insiders often talk of how free employees feel to speak up, and indeed I’ve repeatedly been told how they are encouraged to disagree and discuss all the key issues. Facebook has an educated workforce.

By now, it ought to be plain to them, and to everyone, that Facebook’s 2 billion-plus users are surveilled and profiled, that their attention is then sold to advertisers and, it seems, practically anyone else who will pay Facebook—including unsavory dictators like the Philippines’ Rodrigo Duterte. That is Facebook’s business model. That is why the company has an almost half-a-trillion-dollar market capitalization, along with billions in spare cash to buy competitors.

These are such readily apparent facts that any denial of them is quite astounding.

And yet, it appears that nobody around Facebook’s sovereign and singular ruler has managed to convince their leader that these are blindingly obvious truths whose acceptance may well provide us with some hints of a healthier way forward. That the repeated use of the word “community” to refer to Facebook’s users is not appropriate and is, in fact, misleading. That the constant repetition of “sorry” and “we meant well” and “we will fix it this time!” to refer to what is basically the same betrayal over 14 years should no longer be accepted as a promise to do better, but should instead be seen as but one symptom of a profound crisis of accountability. When a large chorus of people outside the company raises alarms on a regular basis, it’s not a sufficient explanation to say, “Oh, we were blindsided (again).”

Maybe, just maybe, that is the case of Stockholm syndrome we should be focusing on.

Zuckerberg’s outright denial that Facebook’s business interests play a powerful role in shaping its behavior doesn’t bode well for Facebook’s chances of doing better in the future. I don’t doubt that the company has, on occasion, held itself back from bad behavior. That doesn’t make Facebook that exceptional, nor does it excuse its existing choices, nor does it alter the fact that its business model is fundamentally driving its actions.

At a minimum, Facebook has long needed an ombudsman’s office with real teeth and power: an institution within the company that can act as a check on its worst impulses and protect its users. And it needs a lot more employees whose task is to keep the platform healthier. But what would truly be disruptive and innovative would be for Facebook to alter its business model. Such a change could come from within, or it could be driven by regulations on data retention and opaque, surveillance-based targeting—regulations that would make such practices less profitable or even forbidden.

Facebook will respond to the latest crisis by keeping more of its data within its own walls (of course, that fits well with the business of charging third parties for access to users based on extensive profiling with data held by Facebook, so this is no sacrifice). Sure, it’s good that Facebook is now promising not to leak user data to unscrupulous third parties; but it should finally allow truly independent researchers better (and secure, not reckless) access to the company’s data in order to investigate the true effects of the platform. Thus far, Facebook has not cooperated with independent researchers who want to study it. Such investigation would be essential to informing the kind of political discussion we need to have about the trade-offs inherent in how Facebook, and indeed all of social media, operate.

Even without that independent investigation, one thing is clear: Facebook’s sole sovereign is neither equipped to, nor should he be in a position to, make all these decisions by himself, and Facebook’s long reign of unaccountability should end.



Read more: https://www.wired.com/story/why-zuckerberg-15-year-apology-tour-hasnt-fixed-facebook/

Facebook drops no-vote stock plan, Zuck will sell shares to fund philanthropy

Mark Zuckerberg has gotten so rich that he can fund his philanthropic foundation and retain voting control without Facebook having to issue a proposed non-voting class of stock that faced shareholder resistance. Today Facebook announced that it’s withdrawn its plan to issue Class C no-vote stock and has resolved the shareholder lawsuit seeking to block the corporate governance overhaul.

Instead, Zuckerberg says that because Facebook has become so valuable, he can sell a smaller allotment of his stake in the company to deliver plenty of capital to his Chan Zuckerberg Initiative foundation that aims to help eradicate disease and deliver personalized education to all children.

“Over the past year and a half, Facebook’s business has performed well and the value of our stock has grown to the point that I can fully fund our philanthropy and retain voting control of Facebook for 20 years or more,” Zuckerberg writes. Facebook’s share price has increased roughly 45 percent, from $117 to $170, since the Class C stock plan was announced, with Facebook now valued at $495 billion.

Mark Zuckerberg, Priscilla Chan and their daughters Max and August

“We are gratified that Facebook and Mr. Zuckerberg have agreed not to proceed with the reclassification we were challenging,” writes Lee Rudy, the partner at Kessler Topaz Meltzer & Check LLP that was representing the plaintiffs in the lawsuit seeking to block the no-vote share creation. Zuckerberg was slated to testify in the suit later this month, but now won’t have to. “This result is a full victory for Facebook’s stockholders, and achieved everything we could have hoped to obtain by winning a permanent injunction at trial.”

“I want to be clear: this doesn’t change Priscilla and my plans to give away 99% of our Facebook shares during our lives. In fact, we now plan to accelerate our work and sell more of those shares sooner,” Zuckerberg wrote. “I anticipate selling 35-75 million Facebook shares in the next 18 months to fund our work in education, science, and advocacy.” That equates to $5.95 billion to $12.75 billion worth of Facebook shares Zuckerberg will liquidate.

When Zuckerberg announced the plan in April 2016, he wrote that being a founder-led company where he controls enough votes to always steer Facebook’s direction rather than bowing to public shareholders lets Facebook “resist the short term pressures that often hurt companies.” By issuing the non-voting shares, “I’ll be able to keep founder control of Facebook so we can continue to build for the long term, and Priscilla and I will be able to give our money to fund important work sooner.”

A spokesperson for the Chan Zuckerberg Initiative told TechCrunch that this outcome is very good for the foundation, because it provides more predictability to its funding. The plan will also allow Zuckerberg to deliver cash to the CZI sooner, which its new CFO Peggy Alford will be able to allocate between its health, education and advocacy projects.

With the new plan to sell shares, it’s unclear what might happen to Zuckerberg’s iron grip on Facebook’s future in “20 years or more.”

Dropping the Class C shares plan may be seen as a blow to Facebook board member Marc Andreessen, who Bloomberg revealed had coached Zuckerberg through pushing the proposed plan through the rest of the board. But given Zuckerberg’s power, Andreessen is unlikely to be ousted unless the Facebook CEO wants him gone.

Zuckerberg strolls through the developer conference of Oculus, the VR company he pushed Facebook to acquire

For the foreseeable future, though, Zuckerberg will have the power to shape Facebook’s decisions. His business instincts have proven wise over the years. Acquisitions he orchestrated that seemed pricey at first — like Instagram and WhatsApp — have been validated as their apps grow to multiples of their pre-buy size. And Zuckerberg’s relentless prioritization of the user experience over that of advertisers and outside developers has kept the Facebook community deeply engaged instead of pushed away with spam.

Zuckerberg’s ability to maintain power could allow him to continue to make bold or counter-intuitive decisions without shareholder interference. But the concentration of power also puts Facebook in a precarious position if Zuckerberg were to be tarnished by scandal or suddenly unable to continue his duties as CEO.

Zuckerberg warned investors when Facebook went public that “Facebook was not originally created to be a company. It was built to accomplish a social mission.” And yet Facebook has flourished into one of the world’s most successful businesses in part because shareholders weren’t allowed to sell its ambitions short.

Read more: https://techcrunch.com/2017/09/22/facebook-sharing/

Watch Mark Zuckerberg's Harvard commencement speech here

College dropout-turned-Facebook CEO Mark Zuckerberg finally got his degree today, and now he’s about to give Harvard’s 366th commencement speech.

You can watch him speak here now, where we’ll embed the Facebook Live broadcast on TechCrunch and provide frequent updates on any news or insights he mentions.

For a deeper look at the substance of his talk, read our follow-up: Zuckerberg tells Harvard we need a new social contract of equal opportunity

“I’ll share what I’ve learned about our generation and the world we’re all building together,” Zuckerberg writes. “This is personally important to me and I’ve been writing it for a while.”

Live Updates From Zuckerberg’s Speech

Zuckerberg began his speech by calling Harvard “the greatest university in the world” and cracking a couple of corny jokes, like telling students, “You accomplished something I never could.”

He described how he met Priscilla Chan at the going-away party friends threw for him when the university threatened to kick him out for creating Facebook predecessor FaceMash. In a touching moment, he said that because it led him to meet his future wife, FaceMash is actually the most important thing he built at Harvard.

Then Zuckerberg got into the focus of his speech: purpose. He described how, through his travels around the country, people have told him they’re trying to fill a void in their lives as jobs and community become less important in modern society. Zuckerberg foreshadowed how these problems could worsen as technology replaces jobs.

His first strategy for the world to find purpose is for people to make the hard choice to get started on big projects. For example, it might be tough to start fighting climate change, but we can put people to work installing solar panels, or we can start ending disease by getting people to contribute their health data and genomes.

His second strategy revolves around equal opportunity. He believes it’s time for our generation to define a new social contract in which we measure progress by everyone having a role and a purpose. Zuckerberg suggested universal basic income, affordable childcare, flexible healthcare, prison reform, and continuous education as ways to provide this equal opportunity.

Finally, he believes that we need to build community, both locally with our neighbors, and between nations to unite the globe.

To learn how Zuckerberg plans to fix the world’s problems without just saying Facebook is the solution, read our follow-up: Zuckerberg tells Harvard we need a new social contract of equal opportunity.

Read more: https://techcrunch.com/2017/05/25/watch-mark-zuckerberg-speech/