Facebook: it’s like a teenager with launch codes … frivolous. And deadly.

16 November 2018 (Berlin, Germany) – If you pull far enough back from the day to day debate over technology’s impact on society – far enough that Facebook’s destabilization of democracy, Amazon’s conquering of capitalism, and Google’s domination of our data flows start to blend into one broader, more cohesive picture – what does that picture communicate about the state of humanity today?

Technology forces us to recalculate what it means to be human – what is essentially us, and whether technology represents us, or some emerging otherness which alienates or even terrifies us. We have clothed ourselves in newly discovered data, we have yoked ourselves to new algorithmic harnesses, and we are waking to the human costs of this new practice. Who are we becoming?

Nearly four years ago many of the major technology pundits (John Battelle and Evgeny Morozov among others) predicted that the bloom would fade from the technology industry’s rose, and so far, so true. But as we begin to lose faith in the icons of our former narratives, a nagging and increasingly urgent question arises: just how in hell did we get here? I really began to focus on this back in 2016 when I viewed our public square: data, code and algorithms were driving our civil discourse. We no longer share one physical, common square, but instead struggle to comprehend a world comprised of a billion “Truman Shows”.

That is the focus of my book-in-progress, a (somewhat) short history of technology. And how we got here. Not all technology, of course. Not the lathe, the power loom, the motor car, the MRI scanner or even the fighter jet. I mean specifically the digital media platforms associated with Silicon Valley. Those social media, big data, mobile technology and artificial intelligence platforms that increasingly dominate our economic, political and social life.

But today’s post is not about my book but about a New York Times article about Facebook that is the talk of the internet this week. This story is devastating to any remaining illusions about Facebook. You need to read it a few times to grasp it all, but my guess is you’ll arrive at the same place as I did: babes in the political woods tried to play both sides and screwed it up. Bent on growth, Facebook ignored all the warning signs, all the internal alarm bells … and then sought to conceal them from public view. They went on the attack as one scandal after another … Russian meddling, data sharing, hate speech, ethnic cleansing … hit the corporate suites. Low motives, high comedy, zero integrity. The article is behind the New York Times paywall, so I posted it on my SlideShare page. You can read it by clicking here.

My big Sheryl take-away? Sandberg’s failure at Facebook exposes an emptiness at the heart of the argument that made her famous. Women, Sandberg argues in her 2013 mega-selling book “Lean In,” should take a seat at the table. That’s all well and good. But what should they do once they’re sitting there? Sandberg herself, consummate table-sitter, has offered an answer over her company’s year of horrors: keep everything exactly the same.

And as for Mark … it just reinforces the inanity of all this, and that dubious unstated premise that used to start all pieces on Facebook: that Facebook has made a valuable contribution to society or human welfare. It hasn’t. Zuckerberg … thanks to his own drive and brilliance and ability to operate with impunity … has become one of the most powerful unelected people in the world. If Zuckerberg is so brilliant, why doesn’t he fix Facebook? Because he has no intention of fixing something that makes him that much money.

Because that’s the key. For that system to work the way it was designed to, Facebook had to maintain a veneer of neutrality — i.e., non-complicity in the uses to which bad actors put Facebook’s engine — which is why you saw Zuckerberg recently trying to thread a needle on Holocaust denial. He wants to profit from such content’s popularity on his platform without feeling bad about it.

Facebook … like Amazon … became a surrogate for consumer and social action, delocalizing social and physical commerce in local markets for goods and services and offering consumers internet-mediated transactions at larger spatial scales. In the U.S., Amazon was worse. It short-circuited many local businesses, killing them and the town center model of urban life in America. It wiped out jobs that might have sustained a lot of people, just as U.S. corporations allowed manufacturing jobs to decline, hiding it behind “progressive automation and process efficiencies”. And “job migration”.

Ok, I said “worse” than Facebook. Reality check. As the New York Times article states (as have many other pundits), as Facebook grew, so did the hate speech, bullying and other toxic content on the platform. When researchers and activists in Myanmar (which experienced massive ethnic cleansing, directly attributed to Facebook postings), India, Germany and elsewhere warned that Facebook had become an instrument of government propaganda and ethnic cleansing, the company largely ignored them. Facebook had positioned itself as a platform, not a publisher. As internal emails would reveal, taking responsibility for what users posted, or acting to censor it, was “expensive and complicated”. And many Facebook executives worried that any such efforts would backfire.

Facebook could not have thrived, and cannot thrive, unless we also have delocalized social systems, as people move more for job opportunities and don’t participate as members of a sustainable local community structure. I know. A huge concept and I am giving it short shrift. More to come in my book.

The doomsday machines

Social media are doomsday machines. Especially Facebook. They distract, divide, and madden. As a result, our social, civic, and political ligaments are dissolving.

It’s simple, really. When intelligent people affiliate themselves with an ideology, their intellect ceases to guard against wishful thinking. Instead, the wishful thinking begins to fortify itself against the intellect. So they mastermind their own delusion. They, in effect, very cleverly become stupid. So you need to read and watch everything. Read everybody. Because as Ian Bremmer says, “if you’re not following some people you disagree with (or even dislike), you’re doing it wrong”.

So, it’s a struggle. As tough as it is given the demands on our time, you can’t be lazy. Reasoning and critical thinking enable people to distinguish truth from falsehood, no matter where they fall on the political spectrum.

But the platforms work against us. Default settings like push notifications, autoplaying videos, algorithmic news feeds – they all cater to humans’ inclination to consume things passively instead of actively, to be swept up by momentum rather than resist it. This isn’t baseless philosophizing; most folks just tend not to use social media to engage critically with whatever news, video, or sound bite is flying past.

The result? People are blinded by ideology and partisanship, and leverage their critical-thinking skills to ram the square pegs of bias and misinformation into the round holes of their particular ideologies. Fake news doesn’t so much evade critical thinking as weaponize it, preying on partiality to produce a feedback loop in which people become worse and worse at detecting misinformation.

So wherever we are in the world, all we see are people chatting, feeding from low-intellect sites, reaffirming their biases or deactivating their capacity to think. And on TV, what is there to find but mostly reality shows, bad movies, etc.? We have gone from a world of unconnected ignorants to a world of hyper-connected ones. Sounds harsh, but that’s basically it.

Granted: one of the most difficult things in any environment is to remain objective. But the gathering, mixing and matching of data and the way it is analyzed … its communication and exposure of personal information and finance … should be a concern to everyone. Boundaries are being tested in all critical environments. People are accepting the unseen through the eyes of past generations, totally ignorant of the magnitude of the changes taking place. Technology is being driven by sectors that test our religious convictions and ethical compass.

But the worst? Everywhere, people consult their screens to affirm what they already think and repeat what like-minded people have already said. They submit to surveillance and welcome algorithmic manipulation. Believing absurdities, they commit injustices. A few lose their minds altogether. We’ve done a number on ourselves. Everyone knows this: those platforms and algorithms, which many hoped would enlarge the best in humanity, have liberated the worst.

Digital capitalism

The technology is fascinating, but so is the lack of desire to question it: we blindly follow the path of data and its acquisition by businesses in different sectors, on a path to power that we have never seen. The core of that power is sometimes as untraceable as the networks it created.

The digital revolution that initially promised to be a socio-economic leveller has ironically become the harbinger of a new global socio-economic order of unprecedented exploitation and injustice. The Internet is a far cry from the plural, diverse and democratic communication architecture that the millennial generation optimistically aspired to. In the digital markets of today, mobile and cloud technologies have led to the proliferation of “walled gardens”: the carving up of the digital commons into proprietary platforms that seek to lock in users permanently. The initial impetus for such platformization came from the quest of digital corporations to maximise click-bait revenues by leveraging demand-side economies of scale and data-based profiling for targeted advertising to a captive user base. But the business model of these corporations has now moved beyond advertising to the deployment of their “platform advantage” – the creation of intelligence-based solutions, based on massive user data repositories built over time.

Just one example I noted in a previous post: Google sees its future in its Cloud AutoML suite of customizable machine learning products for businesses … developed, of course, from the existing behavioral data sets it owns and controls … rather than in its ad revenues from web search.

In the research for my book I realized that this “new” data (as raw material and digital intelligence solutions) has created a new factor of production. It has set off a complete overhaul of the marketplace that the antitrust laws have no chance of correcting.

Not only Big Tech. Look at how traditional corporate giants have reinvented their game. Take the case of the agricultural sector. Even as Alibaba is strengthening its AI-supported ET Agricultural Brain platform, the agro-chemicals corporation Bayer has taken over Monsanto, the seed behemoth. With the acquisition, Bayer has created a one-stop shop for seeds, crop chemicals and computer-aided services to farmers, with an eye on consolidating its hold over the farm supplies industry. And it all flew past the global antitrust regulators.

Ok. Just a brief word about antitrust and government regulation.

What sets the new digital superstars apart from other firms is not their market dominance; many traditional companies have reached similarly commanding market shares in the past. What’s new about these companies is that they themselves are markets:

  • Amazon operates a platform on which over $200 billion worth of goods are bought and sold each year.
  • Apple runs gigantic marketplaces for music, video, and software.
  • As the world’s largest music-streaming service, Spotify provides the biggest marketplace for songs.
  • The Chinese e-commerce behemoth Alibaba manages the world’s largest business-to-business marketplace.
  • Meanwhile, Google and Facebook are not just the dominant search engine and the dominant social media platform, respectively; they also represent two of the world’s largest marketplaces for advertising space.

Yes, markets have been around for millennia; they are not an invention of the data age. But digital superstar firms don’t operate traditional markets; theirs are rich with data.

This focus on data privacy has obscured the real dangers of these information monopolies – and the tech monopolies like it that way. Data issues are far easier to discuss and address. Of course, legal vendors love it, because those “data privacy solutions” are easy to flog and sell.

But data protection is futile. Cesar Ghali of the Computer Science Department at the University of California Irvine has done a study of the networking architecture of social media (and media in general) and the dissemination/flow of information, looking at the collection of personal data and targeted advertising, and news sites, among others. In a nutshell … and you have undoubtedly heard this before … he says:

In today’s digital world, we’re continuing to upload more of our personal information to all of these interwebs in exchange for convenience. As such, some degree of information governance should be necessary to protect us. Companies should be held accountable for misusing people’s data. But they will not be, to any serious degree. Why? Because the likes of Google, Facebook and Amazon are generally making your life easier and/or better, and that means the very algorithms that enable this need to be fueled by massive amounts of data. The more data they have, the more capable and sophisticated they become. And we’ll keep feeding them. Because they make life easy. It’s that simple.

Platformization has become the order of the day in every domain, not just Facebook/social media. Agriculture, retail trade, transportation, tourism, banking, finance, on-demand services: name it, and you can spot it. This shift has fostered monopolistic tendencies and market concentration across the economy. From the Politico study on global antitrust:

Today, companies with more than USD one billion in annual revenue account for nearly 60 percent of total global revenue and 65 percent of market capitalisation. Big Tech companies are on top of this heap, occupying five of the top 10 slots in the list of the largest companies in the world by market cap. They often abuse this enormous power they wield, through pervasive dataveillance of users and workers, and through cavalier violations of and a total disregard for national legislation in the markets in which they operate, including brazen violation of competition law.

Note to readers: over the last two years, the “brazen violation of competition law” by Big Tech has been the subject of numerous law review articles and other journals. My research unit is creating a readers’ list of the best/primary antitrust/competition pieces you should read. It will be distributed in January. 

To millions of people around the world Facebook is the Internet. We struggle (do we, really?) against “platform lock”. We struggle (do we?) to respond to this scenario of “all power to corporations”. And, man, it is complicated. How do you respond to the deeply inimical consequences of the intertwining of the digital revolution with neo-liberal capitalism? Too much for this post.

Because the primary challenge confronting us is no longer the ad-based revenue model for content services adopted by platform companies; it is a new business model … data extractivism … predicated on digital intelligence solutions — both prepackaged and business-specific — that Big Tech companies offer for wide application across industries. Digital corporations located in countries across the Western world rely upon unrestricted cross-border data flows to consolidate their hold over markets across the rest of the world.

And that recently passed General Data Protection Regulation (GDPR) in Europe? Don’t make me laugh. Big Tech (and its lawyers and lobbyists) played this perfectly. “Yes, let’s focus on protections designed to provide oversight and control over how data is collected and processed. Let’s talk about the ‘right to be forgotten’. Yes, it’s all about people’s privacy, identity, reputation, and autonomy”.

Big Tech made sure we kept our eyes off the real ball: protections aimed not at the inputs, but at the outputs of data processing. People should have been fighting for “the right to reasonable inferences”. European data protection law and jurisprudence currently fail in this regard. The GDPR … while laudable on many fronts … offers little protection against the novel risks of inferential analytics. And that is what Big Tech wanted to protect. And did. And THAT will require another post.

Jo Freeman warned us about today’s Facebook. Back in 1973.

I am in Berlin for a two-day workshop that pretty much covers everything I have just noted: Facebook’s destabilization of democracy, Amazon’s conquering of capitalism, and Google’s domination of our data flows, and our public square: data, code and algorithms driving our civil discourse.

We talked about Jo Freeman, someone I knew from my research. American feminist, political scientist, writer and attorney. You can read her bio on Wikipedia here. She is best known for one particular essay she wrote in 1973, The Tyranny of Structurelessness. As one writer noted about her, “she is one of those people whom Kipling would praise for keeping her head while those all around are losing theirs”.

She had been speaking and writing about the women’s liberation movement of the late 1960s which was rebuilding the world in a consciously different way: no designated leaders and no rules on what you could say and when you could say it. Yet Freeman wondered if getting rid of rules and leaders was actually making feminism more open and fair.

After a hard think, she concluded that, if anything, the lack of structure made the situation worse: elite women who went to the right schools and knew the right people held power and outsiders had no viable way of challenging them. She decided to write an essay summing up her thoughts:

“As long as the structure of the group is informal, the rules of how decisions are made are known only to a few and awareness of power is limited to those who know the rules. Those who do not know the rules and are not chosen for initiation must remain in confusion, or suffer from paranoid delusions that something is happening of which they are not quite aware.”

And now, more than 40 years later, that essay continues to reverberate, especially in Silicon Valley, where it is deployed by a wide range of critics to disprove widely held beliefs about the internet as a force of personal empowerment, whether in work, leisure, or politics. That a decentralized cryptocurrency like bitcoin gives ordinary folk control over high finance. That driving an Uber means you are an entrepreneur. That Facebook’s largely unsupervised advertising system is designed for small businesses because it allows them to reach a sea of customers inexpensively. Or that Silicon Valley companies with their informal college-campus vibe, including study nooks, snack bars, and a free-thinking debate culture, are immune to the need for corporate protection.

The reality, of course, is a bit different. Bitcoin is dominated by a small cadre of investors, and “mining” new coins is so expensive and electricity-draining that only large institutions can participate; Facebook’s advertising system is exploited by foreign governments and other malevolent political actors who have had free rein to spread disinformation and discord; and Google’s informal structure allows leaders to believe they can act in secret to dispense with credible accusations of harassment.

But in Freeman’s unstinting language, this rhetoric of openness “becomes a smokescreen for the strong or the lucky to establish unquestioned hegemony over others.” And because “Tyranny” explains how things work, as opposed to how people say things work, it has become a touchstone for social critics of all stripes.

NOTE: I gave up trying to track all the times her essay has been cited. But my research team has, and they noted that during the Occupy movement, Freeman’s essay was on the organizers’ minds when they sought to eliminate hierarchy without introducing a hidden one. The essay is cited in hundreds of academic papers and books to explain the history of the Vatican, or the women’s movement in Iceland, or the Walmart workforce.

She has been interviewed a lot over the past two years. She takes the long view about her argument, seeing it as part of a permanent push-pull between structure and structurelessness. As Noam Cohen noted in a recent Wired article and also at a Recode event:

There may be particular reasons why Silicon Valley leaders have an aversion to outside authority and rules, but mainly [Freeman] thinks they embody the excessive enthusiasm of any group who gains a foothold in a new field—whether in oil exploration or railroads or the internet—and decides they are uniquely fit to hold that powerful position. In the early days of the internet, she says, “it was highly inventive, it was highly spontaneous, but we’re past that. As long as you reject the idea that any organization is bad you are never going to have the discussion about the best organization for whatever it is you are trying to do.”

And while this rhetoric of personal empowerment has been great for Silicon Valley, for the rest of the world it has produced a deeply painful reality: greater disparity in wealth and power, fewer tools for reversing these conditions, and a false sense that we are personally to blame for our own difficult circumstances.

CONCLUDING NOTE

If there is one clear message from all the chatter these days, it is that the heady utopia of toxic consumerism propelled by Big Tech has been unequivocally rejected. So the question today is whether Silicon Valley companies will voluntarily impose some structure on their magical platforms and devices – or whether that structure will need to be imposed upon them by the public. Assuming there is any political will to do so.

Me? To be sure, populism, nationalism, religious conflict, information warfare, deceit, prejudice, bias … etc., etc., etc. …  existed long before the internet. The arc of history does not always bend toward what we think of as progress. Societies regress. The difference now is that all of this is being hosted almost entirely by a handful of corporations. Why is an American company like Facebook placing ads in newspapers in countries like India, Italy, Mexico, and Brazil, explaining to local internet users how to look out for abuse and misinformation? Because our lives, societies, and governments have been tied to invisible feedback loops, online and off. And there’s no clear way to untangle ourselves.
