I hadn't really thought of it until now, and that sounds pretty good actually, heh.
In any case, Jacob Weisberg reviews two books that I've promoted here: Siva Vaidhyanathan's Antisocial Media: How Facebook Disconnects Us and Undermines Democracy and Jaron Lanier's Ten Arguments for Deleting Your Social Media Accounts Right Now.
At the New York Review, "The Autocracy App":
"By training its users to elevate feelings of agreement and belonging over truth, Facebook has created a gigantic 'forum for tribalism.'" https://t.co/Ll1dnaTTtw via @nybooks — Sarah V. Schweig (@sarahvschweig) October 16, 2018
Facebook is a company that has lost control—not of its business, which has suffered remarkably little from its series of unfortunate events since the 2016 election, but of its consequences. Its old slogan, “Move fast and break things,” was changed a few years ago to the less memorable “Move fast with stable infra.” Around the world, however, Facebook continues to break many things indeed.

Keep reading.
In Myanmar, hatred whipped up on Facebook Messenger has driven ethnic cleansing of the Rohingya. In India, false child abduction rumors on Facebook’s WhatsApp service have incited mobs to lynch innocent victims. In the Philippines, Turkey, and other receding democracies, gangs of “patriotic trolls” use Facebook to spread disinformation and terrorize opponents. And in the United States, the platform’s advertising tools remain conduits for subterranean propaganda.
Mark Zuckerberg now spends much of his time apologizing for data breaches, privacy violations, and the manipulation of Facebook users by Russian spies. This is not how it was supposed to be. A decade ago, Zuckerberg and the company’s chief operating officer, Sheryl Sandberg, championed Facebook as an agent of free expression, protest, and positive political change. To drive progress, Zuckerberg always argued, societies would have to get over their hang-ups about privacy, which he described as a dated concept and no longer the social norm. “If people share more, the world will become more open and connected,” he wrote in a 2010 Washington Post Op-Ed. This view served Facebook’s business model, which is based on users passively delivering personal data. That data is used to target advertising to them based on their interests, habits, and so forth. To increase its revenue, more than 98 percent of which comes from advertising, Facebook needs more users to spend more time on its site and surrender more information about themselves.
The import of a business model driven by addiction and surveillance became clearer in March, when The Observer of London and The New York Times jointly revealed that the political consulting firm Cambridge Analytica had obtained information about 50 million Facebook users in order to develop psychological profiles. That number has since risen to 87 million. Yet Zuckerberg and his company’s leadership seem incapable of imagining that their relentless pursuit of “openness and connection” has been socially destructive. With each apology, Zuckerberg’s blundering seems less like naiveté and more like malignant obliviousness. In an interview in July, he contended that sites denying the Holocaust didn’t contravene the company’s policies against hate speech because Holocaust denial might amount to good faith error. “There are things that different people get wrong,” he said. “I don’t think that they’re intentionally getting it wrong.” He had to apologize, again.
It’s not just external critics who see something fundamentally amiss at the company. People central to Facebook’s history have lately been expressing remorse over their contributions and warning others to keep their children away from it. Sean Parker, the company’s first president, acknowledged last year that Facebook was designed to cultivate addiction. He explained that the “like” button and other features had been created in response to the question, “How do we consume as much of your time and conscious attention as possible?” Chamath Palihapitiya, a crucial figure in driving Facebook’s growth, said he feels “tremendous guilt” over his involvement in developing “tools that are ripping apart the social fabric of how society works.” Roger McNamee, an early investor and mentor to Zuckerberg, has become a full-time crusader for restraining a platform that he calls “tailor-made for abuse by bad actors.”
Perhaps even more damning are the recent actions of Brian Acton and Jan Koum, the founders of WhatsApp. Facebook bought their five-year-old company for $22 billion in 2014, when it had only fifty-five employees. Acton resigned in September 2017. Koum, the only Facebook executive other than Zuckerberg and Sandberg to sit on the company’s board, quit at the end of April. By leaving before November 2018, the WhatsApp founders walked away from $1.3 billion, according to The Wall Street Journal. When he announced his departure, Koum said that he was “taking some time off to do things I enjoy outside of technology, such as collecting rare air-cooled Porsches, working on my cars and playing ultimate Frisbee.”
However badly he felt about neglecting his Porsches, Koum was thoroughly fed up with Facebook. He and Acton are strong advocates of user privacy. One of the goals of WhatsApp, they said, was “knowing as little about you as possible.” They also didn’t want advertising on WhatsApp, which was supported by a 99-cent annual fee when Facebook bought it. From the start, the pair found themselves in conflict with Zuckerberg and Sandberg over Facebook’s business model of mining user data to power targeted advertising. (In late September, the cofounders of Instagram also announced their departure from Facebook, reportedly over issues of autonomy.)
At the time of the acquisition of WhatsApp, Zuckerberg had assured Acton and Koum that he wouldn’t share its user data with other applications. Facebook told the European Commission, which approved the merger, that it had no way to match Facebook profiles with WhatsApp user IDs. Then, simply by matching phone numbers, it did just that. Pooling the data let Facebook recommend that WhatsApp users’ contacts become their Facebook friends. It also allowed it to monetize WhatsApp users by enabling advertisers to target them on Facebook. In 2017 the European Commission fined Facebook $122 million for its “misleading” statements about the takeover.
Acton has been less discreet than Koum about his feelings. Upon leaving Facebook, he donated $50 million to the Signal Foundation, which he now chairs. That organization supports Signal, a fully encrypted messaging app that competes with WhatsApp. Following the Cambridge Analytica revelations, he tweeted, “It is time. #deletefacebook.”
The growing consensus is that Facebook’s power needs checking. Fewer agree on what its greatest harms are—and still fewer on what to do about them. When Mark Zuckerberg was summoned by Congress in April, the toughest questioning came from House Republicans convinced that Facebook was censoring conservatives, in particular two African-American sisters in North Carolina who make pro-Trump videos under the name “Diamond and Silk.” Facebook’s policy team charged the two with promulgating content “unsafe to the community” and indicated that it would restrict it. Facebook subsequently said the complaint was sent in error but has never explained how that happened, or how it decides that some opinions are “unsafe.”
Democrats were naturally more incensed about the twin issues of Russian interference in the 2016 election and the abuse of Facebook data by Cambridge Analytica in its work for Trump’s presidential campaign.