Social media manipulation: “we have met the enemy and he is us”

[ For the German version, click here ]

[with apologies to Walt Kelly]

For Part 1, the power of GAFA, please click here


21 December 2017 (Paris, France) – Last month the U.S. Senate Intelligence Committee grilled the chief legal officers of Facebook, Google, and Twitter, an event that told us something we already knew: Russia manipulated the 2016 U.S. Presidential election. We may not have known the scale of the effort, or its low financial cost to the Russians (with new revelations emerging every day), and perhaps it was surprising that the Russians started as far back as 2010 (probably even earlier, based on new accounts this month) … but we pretty much knew the story.

The spectacle reaffirmed something else we knew, but had forgotten: industry self-regulation rarely works. From turn-of-the-century railroads, through energy markets in the 1990s, to the financial industry circa 2007, there are many examples that bear this out. The tech industry is only the latest case in point. But that horse has left the barn.

And, no, those lawyers sitting in the witness chair surely are not going to solve all those issues. Let’s park those thoughts of “ethical standards” at the curb. There is money to be made. The entire culture of these encrusted corporate in-house law departments, and of the law firms representing these companies, is that of a professional trust that employs its skills to make sure their clients get what they want, the undermining of democracy be damned.

But that is also what hit me. The folks who should have been sitting in those chairs were Mark Zuckerberg, Larry Page, Sergey Brin and Jack Dorsey – the founders of these companies, the guys behind the algorithms. The “enablers”. In fact, they were going to be there, but Sheryl Sandberg, Facebook’s chief operating officer, took on the role of negotiator (manipulator?) for the group and convinced the Senate committee that those guys were pretty busy and, besides, most of these questions are legal, so we’ll send the legal officers instead. Why her? Because she “walks the walk”. For six years she worked in Washington for Larry Summers, who was then serving as Secretary of the Treasury under Bill Clinton, walking the halls of Congress. She knew the drill, the highways and byways, who to talk to, how to schmooze them.

What? These guys talk among themselves for a common defense? Well, you might have forgotten that (once) secret Silicon Valley “no-poaching cartel” they formed, uncovered a few years back, in which they had agreed to stop poaching each other’s talent. Disclosed, a few fines paid, forgiven.

But as I said, they were busy. Poor Jack Dorsey could barely squeeze out four hours for a magazine photo shoot that week.

But the key, key take-away for me was how intimately involved these lawyers were with their companies’ business plans and operations, and with the development and structure of the algorithms and platforms. It is why I scoff at those “AI and bias” presentations at legal conferences when the speakers and presenters tell us “what we must do” and the “ethics that must be employed”, that the big platforms must “become more transparent” and that “clearer standards” must be in place concerning the flagging and removal of destructive content. Of course, nowhere on any of these panels will you find the lawyers for Facebook, Google, Twitter et al.

Granted, perhaps as a company attorney (or even a company data scientist) you may not have too much say in product decisions. But as a lawyer, for God’s sake, you should think, ask questions and raise issues. That is your function. Did any lawyer (or coder for that matter) ever stop and think about how bad actors could misuse their technology? Did any lawyer ever ask:

  • Can anybody buy ads?
  • Can anybody access the functions?
  • How could trolls use our service to harass vulnerable people?
  • How could an authoritarian government use our work for surveillance?
  • How could our work be used to spread harmful misinformation or propaganda?
  • What safeguards could we put into place to mitigate the above?

What we got from the companies sitting at that Senate event was contrition and apologies. It did not add up to any significant shift, either in business model or philosophy. Rather, just vague promises to “do better,” palliative remedies, and claims that they simply can’t track the complexity of their own algorithms. It underscored their contempt for the rest of us and their blatant indifference to what anybody thinks. As I noted in Part 1 of this series, it was an almost verbatim repeat of Sandberg’s exhortation:

According to Sheryl Sandberg in a statement in October, Facebook “never intended or anticipated this functionality being used this way – and that is on us.” It was a candid admission that reminded me of a moment in Mary Shelley’s “Frankenstein,” after the scientist Victor Frankenstein realizes that his cobbled-together creature has gone rogue: “I had been the author of unalterable evils,” he says, “and I lived in daily fear lest the monster whom I had created should perpetrate some new wickedness.”

The sophisticated critiques of Facebook are not about ideas and content that people don’t like, but rather the new structural forces that Facebook has created. News and information flow differently now than they did before Facebook; capturing the human attention that constitutes that flow is Facebook’s raison d’être (and value to advertisers). Taken as a group, these companies have destabilized traditional, fact-based journalism. They have vast wealth — a combined market value of $1.2 trillion — and the power that accompanies it.

The key point is this: the internet’s aristocracy arose with the commercialization of the web, starting 20 years ago. To the hectic leaders of Silicon Valley, two decades seems like geologic time – but in the evolving social and political cultures of our species, two decades is the blink of an eye. We have not come close to absorbing the implications of these potent, purposefully disruptive technologies.

And let me kill off a comment I hear at every conference: “Technology isn’t the problem. It’s our technology habits”. WRONG! If you do not “get” that these technologies are designed to instill those very technology habits, you don’t “get” the equation. Our vulnerabilities arise from commerce – specifically, the tech-enabled exploitation of personal data by businesses – a commerce we allowed to flourish. The algorithms of clickbait, echo chambers and adtech have their basis in B.F. Skinner’s rat-in-the-box experiments. Russia is just one example of how anybody can take advantage of these systems. The alt-right is another.

Let’s face it. We created this. We allowed these systems to develop in this way. This bullshit “move fast and break things” mantra that the tech lords love to quote worked totally in their favor.

One of my favorite conversations at Cannes Lions this year was with a public relations firm working for Facebook. Zuckerberg’s PR consultants … he has four PR companies on retainer … have always told him to “control the narrative”. So dear Mark talks about the “hacker creed” and adds in his “faux debate” with Elon Musk about AI ethics. Good job, Mark. Keep our eyes off the ball.

It is the American way: to deliver the same basic goods in faster, cheaper, more streamlined fashion. It is always the pursuit of money. For in America, money is the primary goal. If the conceit of our democratic social order gets savagely disfigured along the way, and any number of unfortunate outgroups get wiped out in the process, well, such are the wages of quantum lifestyle improvement in the American grain.

Let’s simplify it:

Companies selling digital ads rely on two sources of value: data on individuals and engagement. The more time we spend on Facebook, the more engaged we are, the more ads we see, and the more valuable those ads become. That’s the essence of Facebook’s business. As Sean Parker, Facebook’s first president, said in an interview last week: “The thought process that went into building these applications was all about: ‘How do we consume as much of your time and conscious attention as possible? How do we make money?’ And that means that we need to sort of give you a little dopamine hit every once in a while. Yes, we are exploiting a vulnerability in human psychology. We, the inventors, understood this consciously. But we did it anyway.”
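
To make the arithmetic behind that incentive concrete, here is a minimal sketch of the engagement-to-revenue loop – my own toy numbers and function names, not Facebook’s actual figures or code: more minutes on the platform means more ad impressions, and richer targeting data raises the price of each impression.

```python
# Toy model of the engagement -> ad revenue loop described above.
# All numbers and names are illustrative assumptions, not Facebook's.

def daily_ad_revenue(minutes_on_site: float,
                     ads_per_minute: float = 1.5,
                     base_cpm: float = 2.0,
                     targeting_multiplier: float = 1.0) -> float:
    """Per-user daily revenue: impressions times price per impression.

    targeting_multiplier > 1 stands in for the premium advertisers pay
    when more personal data makes an ad better targeted.
    """
    impressions = minutes_on_site * ads_per_minute
    price_per_impression = (base_cpm / 1000.0) * targeting_multiplier
    return impressions * price_per_impression

# Double the time spent and double the targeting quality, and
# per-user revenue quadruples -- the "dopamine hit" incentive in a nutshell.
print(daily_ad_revenue(minutes_on_site=25))                            # ~0.075
print(daily_ad_revenue(minutes_on_site=50, targeting_multiplier=2.0))  # ~0.30
```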
 

Yes, Facebook – in its essence an advertising company – became a media company, but never lost its structured indifference to the content on its site except insofar as it helps to target and sell advertisements. A version of Gresham’s law is at work, in which fake news, which gets more clicks and is free to produce, drives out real news, which often tells people things they don’t want to hear, and is expensive to produce. In addition, Facebook uses an extensive set of tricks to increase its traffic and the revenue it makes from targeting ads, at the expense of the news-making institutions whose content it hosts. Its news feed directs traffic at you based not on your interests, but on how to make the maximum amount of advertising revenue from you. As Zeynep Tufekci said in her recent brilliant TED Talk, “we’ve built a dystopia just to make people click on ads”.
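
That Gresham dynamic falls out of the ranking objective itself. Below is a hedged, generic sketch of an engagement-ranked feed – not Facebook’s actual News Feed code, just the standard logic of scoring items by predicted clicks and shares – showing how a cheap, fabricated item outranks an expensive, accurate one because accuracy never enters the objective.

```python
# Generic engagement-ranked feed; an illustrative sketch, not Facebook's ranker.
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    predicted_click_rate: float   # learned from past user behaviour
    predicted_share_rate: float
    is_accurate: bool             # known to us here, invisible to the ranker

def engagement_score(item: Item) -> float:
    # Illustrative weights; shares weigh more because each share
    # generates further impressions (and ad slots) downstream.
    return 0.6 * item.predicted_click_rate + 0.4 * item.predicted_share_rate

feed = [
    Item("Shocking claim about candidate X!", 0.12, 0.08, is_accurate=False),
    Item("Budget committee report, explained", 0.03, 0.01, is_accurate=True),
]

# Accuracy and production cost never appear in the sort key -- which is the point.
for item in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(item):.3f}  accurate={item.is_accurate}  {item.headline}")
# Output: the fabricated item ranks first, 0.104 vs 0.022.
```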


So Facebook and Zuckerberg did not fail. They achieved exactly what was intended. It’s not as if a Facebook engineer in Menlo Park personally greenlighted Russian propaganda; few of these issues stem from willful malice on the company’s part. As a Facebook engineer told me:

“we are a technology company, not an intelligence agency or an international diplomatic corps. Our engineers are in the business of building apps and selling advertising, not determining what constitutes hate speech in Myanmar. And with two billion users, including 1.3 billion who use it every day, moving ever greater amounts of their social and political activity onto Facebook, it’s possible that the company is simply too big to understand all of the harmful ways people might use its products.”

And as an ex-Facebook engineer told me: “This shit is baked in. It’s not only that these ad systems are governed by algorithms, but that the software is increasingly guided by artificial intelligence tools that automate systems in ways even we … the creators … do not fully comprehend.”

Facebook did not grow into a $500bn business by being “a force for good in democracy”. That wasn’t its job. Nor was it by “making the world more open and connected” (its first mission statement) and “bringing the world closer together” (its new mission statement). Facebook grew to its current size, influence and wealth by selling advertisements. And it sold those advertisements by convincing users to provide it with incredibly intimate information about our lives so that advertisers could in turn use that information to convince us to do things. Some advertisers want us to buy things. Some want us to attend events. And some want us to vote a certain way. Facebook makes all that advertising cheap and easy and astonishingly profitable by cutting out the sales staff who, in a different kind of company, at a different time, would have looked at an advertisement before it ran.

So listen: Facebook’s systems didn’t “fail” when they allowed shadowy actors connected to the Russian government to purchase ads targeting American voters with messages about divisive social and political issues. They worked. As Elliot Schrage, Facebook’s vice-president of policy and communications, wrote on his blog about the flood of Russian ads hitting Facebook:

There was nothing necessarily noteworthy at the time about a foreign actor running an ad involving a social issue. It was merely a customer.

But alas, an unintended monster has been unleashed.

I have thought quite a bit about this, this America in 2017, this Trumpian era of unfettered gun rights and blazing hostility. And it is global. We seem gripped by the geopolitics of the twenty-first century’s chaotic adolescence. Whether alt-Right or radical Islam, the values of liberal and open democracy increasingly appear to be losing ground around the world to those of narrow, xenophobic ethno-nationalisms and radical ideologies.

Look across America, Europe, Asia. Over the developed world. Nationalist and neo-fascist movements backed by unscrupulous corporate cartels are capitalizing on fear and uncertainty to slither their way into power and buy up the world’s executive systems.

And, yes, it is all related. As Jimmi Condrano, a media analyst at the European Journalism Centre, puts it:

When I travel to the Middle East, and even the Midwest in America, even when I was in Charlottesville, Virginia, I see the same thing. As young people become unmoored from traditions, they flail about in search of a social identity that gives personal significance and glory. Individuals radicalize to find firm identity in a flattened world. In this new reality, vertical lines of communication between generations are replaced by horizontal peer-to-peer attachments that can span the globe, albeit in vanishingly narrow channels of ideas and information. That’s why, despite all its vitriol against “globalists”, in America today’s alt-Right movement involves the same narrow-minded global weave of tweets, blogs and chatrooms linking physical groups across the world as does the jihadi movement. It is uncanny.

Scott Atran (who has twin roles as director of research in anthropology at the École Normale Supérieure in Paris and as senior research fellow at the University of Oxford) noted in a recent presentation:

Look back as far as 1941. Read Erich Fromm’s “Escape from Freedom”. What did he say? He argued that too much freedom caused many to seek the elimination of uncertainty in authoritarian systems. He said if you take democracy to its conclusion, you will breed distrust. To be free to do whatever you want can lead – might lead – to many problems and divisions and corruption in society.
 

Today, I see it as a “search for significance”, propelling both violent jihadists and militant supporters of populist ethno-nationalist movements worldwide … even your American alt-right.

And so this new world disorder has a way to voice itself: social media. Spreading noxious, violent memes. Funny, actually. Here we all thought that these Western technology creations and our relatively open markets would simply dominate the global social and political order … and bring stability.

In America, schoolyard bullies seem to be running the world – and millions are running after them. There is violence, as trolling has now become a recognized political strategy with a presidential seal of vengeful glee. Let’s externalize your existential crisis by blaming migrants and Muslims for your broken sense of self in a burning world. No, it will not actually solve anything or help you to acknowledge any of your problems. But it might make you feel better!

What I’d like to do is discuss the cognitive effect of all this, how this noxious and violent social environment has led to our youth suffering anxiety levels and disorders unheard of in my day. Nearly one in five Americans over the age of thirteen suffers from an anxiety disorder, with women twice as likely to be affected. As my oldest granddaughter put it: “You know what the trouble is today? It’s everything happens so much!!” The wisdom of youth. To be pondered in another post.

Suffice it to say that this age of anxiety did not begin with this American presidency, nor does it end at the U.S. border, but there is something fundamentally “American” about the culture of dogged, dead-eyed competition that produces it. It’s what happens when the American Dream becomes a nightmare you can’t wake from, and not just because you haven’t had a proper night’s sleep in years. It’s what happens when a society clings to a defining mythos that celebrates working until you drop, abhors poverty as evidence of moral failure, considers the provision of a basic safety net a pansy European affectation, and continues to call itself free.

COMING UP NEXT YEAR IN THIS SERIES …

Being a bit of a gonzo journalist I sometimes ask myself “What would Hunter Thompson do in this situation?” Well, I think he’d probably eat a cocktail of Valium and LSD and go on a bender down Sunset Boulevard to see how many cops he could antagonize before he got arrested.

Ok, maybe I shouldn’t ape the cocaine-and-testosterone-fueled gonzo journalist model of the 1970s. Silly idea. Hunter Thompson and his pals, after all, were dealing with a society whose dominant political affect was boredom, and we’re dealing with anxiety as the relational mode of the age.

Veteran readers of my blog know my style of writing. The model I try to follow is more like the British magazine tradition of a weekly diary – on the issue, but a little distant from it, personal as well as political, conversational more than formal.

Having skidded across Berlin, Chicago, D.C., Kiev, Moscow, Paris, Rome, and Zurich this past year to produce this series, here is what I will cover in the next two instalments:

  • I want to explain in detail how Facebook’s mechanics (and Twitter’s, too) have allowed (and still allow) Russian operatives (and the alt-right) to reach far beyond just those people who followed their fake pages and accounts, and how the Russians and Chinese have even infiltrated LinkedIn, online games, etc. It is all just one piece of this puzzle.
  • I want to address Russia. Whatever you might think of Russia’s recent antics on the world stage, you have to concede: they have brilliantly exploited information-age tools to confuse audiences about what is truth, what isn’t, and to set their own narrative. The returns have been massive … and out of all proportion to the modest investment. Nope. This isn’t quite the millennium we were promised. The data-bedazzled twenty-first century was to be a time of painlessly enhanced social justice and seamless market accommodation. Oh, that marvelous “arc of history” was surely bent unmistakably toward a bigger, shinier “Information Age”! The propaganda of progress!! The illuminism of new technologies!! Eh … no. Somebody clearly did not get the email. Russia’s intelligence services decided years ago to make information warfare and cyber warfare a national defense priority. We’ll take a deep dive.

MERRY XMAS

 
 
And coming next week, my last post of the year: 
 
 
52 Incredible Things I Learned At Technology Conferences This Year: 
A Weekly Waltz Through 2017
 
(with French, German and Italian versions)
