Another example of data privacy being dead-on-arrival: video games

The toxic (but commercially utterly compelling) combination of surveillance capitalism and opportunistic machine learning.

It succeeded for social networks, so why not video games?

 

Players generate a wealth of revealing data – and companies are soaking it up

 

10 February 2022 (Athens, Greece) – When I first wrote about the death of data privacy (over 10 years ago), I began by explaining the infrastructural power of platform firms – the structural and instrumental power they had to prevail in regulatory battles. It stems from their position as mediators across a variety of actors. This power is shaped by pre-existing regulations and the firms’ strategic response, what I called “contentious compliance” – a double movement of adapting to existing regulations while continuing to challenge them. They created a new business model based on information technology and a lean workforce, taking advantage of loose regulations in the service industries (especially in the U.S.).

You need to understand this structural framework to understand the politics of platform firms, to understand this structural and instrumental power, all stemming from the specific position these firms occupy in the economy. These firms mediate between producers and consumers of goods, services, and information; in so doing, they create a data ecosystem that depends on them. In this way, platform firms become increasingly relevant in our economies, and our economies increasingly dependent on them. This is the source of their power.

It’s why I continue to have insane conversations with data privacy “experts” who think data privacy is actually possible. Data is in perpetual motion, swirling around us, only sometimes touching down long enough for us to make any sense of it. We use data, these numbers, as signs to navigate the world, relying on them to tell us mundane things like where traffic is worst or what things cost or what our friends are thinking or doing – and especially placing an Amazon order and letting Amazon “logistic” the delivery of the package. The list is endless.

And because we do this, data has become a crucial part of our infrastructure, enabling commercial and social interactions. Rather than just tell us about the world, data acts in the world. Because data is both representation and infrastructure, sign and system. As media theorist Wendy Chun puts it, data “puts in place the world it discovers”. Restrict it? Protect it? We live in a massively intermediated, platform-based environment, with endless network effects, commercial layers, and inference data points.

I do not begrudge these data privacy folks. It is merely commerce for them. It’s a job. Data privacy “experts” and “consultants” have one goal: to sell their services, their products, or both. Their mantra is that a regulation is afoot and you must comply with these data regulatory acts – and they will help you. What is happening out there in the real world is not material. Their pitch is “You don’t want to get a big fine, now, do you?” It’s why it is so amusing to watch data privacy panels at almost any legal conference or data privacy conference, stocked with Big Law attorneys selling the need for data privacy – while their other firm colleagues are actually doing the opposite in their representations of Big Tech. Scan any event agenda and look at the law firms and you’ll see what I mean.

Before I get into video games and their role in killing data privacy I want to recap 10 years of posts (!!) for my new readers to provide some perspective.

Starting places

The regulatory state fails time and time again because it fails to understand technology through the lens of the reconstruction of the political economy: the ongoing shift from an industrial mode of development to an informational one. Regulatory institutions continue to struggle in the era of informational capitalism because they simply cannot understand what is happening. They are befuddled by the regulatory issues and problems created by information markets and networked information and communications technologies. Nor do they understand our technological dependency. Most amazing were the U.S. Congressional members who not too long ago were interrogating our relationship with technology, yet were so quick to attribute our success in getting through COVID (if 1 million deaths can be considered a success) to Big Tech being “our salvation”. Clearly these guys must have ordered a lot of stuff on Amazon.

Yes, I know. It’s not the technology: it’s our implementation of it. Or so the story goes. But, alas, we stand guilty of so much original sin. As I have dug deeper and deeper into technology regulation and our culture, I have developed six “givens”:

1. Data privacy and data sovereignty did not die of natural causes. They were poisoned with a lethal cocktail of incompetence, arrogance, greed, short-sightedness and sociopathic delusion. Technology has become the person in the room with us at all times, introduced into every aspect of human existence in a preposterously short historical period of time. As Julia Hobsbawm says at her TEDx events, “Big Tech has thrown a metaphorical person into all our lives”.

2. When a technology is introduced into society, unimpeded, through private enterprise, and then quickly adapted and adopted, the game is over. We (they?) have built a world, a system, in which physical and social technologies co-evolve. How can we shape a process we don’t control?

3. And all those lawyers who say “we must do something about privacy!” are … pardon my French … full of merde. Well, most of them; the sincere ones are outliers not involved in the game. It’s their legal brethren and cohorts who are guilty of forming the logic of informational capitalism in comprehensive detail. Look at the legal foundations of platforms, privacy, data science, and the rest, which have focused on the scientific and technical accomplishments undergirding the new information economy. Institutions, platforms, “data refineries” – these are first and foremost legal and economic institutions. They exist as businesses; they are designed to “extract surplus”.

4. And when a technology is introduced into society, unimpeded, nobody can catch up. With the continuing development of computer vision and video analytics, facial recognition may ultimately turn out to be one of the more benign applications of camera surveillance. Put another way, facial recognition is just part of a much larger program of surveillance – equipped with greater scale and granularity – associated with cameras, microphones, and the multitude of sensors in modern smartphones. As these systems become more sophisticated and more ubiquitous, we are likely to experience (actually, are experiencing) what Big Tech calls a “phase transition” – the shift from collection-and-storage surveillance to mass automated real-time monitoring. At scale.

5. What sets the new digital superstars (the “4 Horsemen of the Digital Apocalypse”?) apart from other firms is not their market dominance; many traditional companies have reached similarly commanding market shares in the past. What’s new about these companies is that they themselves ARE the markets. They have introduced technology so very rapidly into every aspect of the human existence in such a preposterously short historical period of time, that the “regulators” cannot cope.

6. And blame the regulators themselves. Corporate lawyers have pushed themes of “innovative flexibility” and “privatized oversight”, playing on an anti-regulatory narrative that has gained increasing traction as the shift to informationalism has gathered speed. And regulators have caved in – as well documented in those massive U.S. Congressional reports on antitrust and privacy. They are now about two years old, and total 500+ pages. If you are really into this data privacy stuff, read them. They are chock full of great stuff and links. For the last several decades, advocacy emanating from Wall Street and Silicon Valley has pushed for deregulation and devolution of governance to the private sector, invoking asserted imperatives relating not only to “market liberty” but also, and more fundamentally, to “innovation and economic growth”. This has proved extraordinarily powerful in structuring public debate about what regulatory goals should be, as well as their methods and sanctions.

Those 4 Horsemen … Google, Amazon, Apple and Facebook … have left the barn. But now you best include TikTok, too. Examining our technological practices – practices that include Fitbit, Chill, and Twitter –  reveals how our nihilism and our technologies have become intertwined, creating a world of techno-hypnosis, data-driven activity, pleasure economics, herd networking, and orgies of clicking.

Video game surveillance

Hat tip to Ben Egliston and Mick Carpenter, experts on the video game industry, for their contributions. Plus “Blood, Sweat, and Pixels: The Triumphant, Turbulent Stories Behind How Video Games Are Made” by Jason Schreier, which provided my foundation for understanding the video game industry.

Tech conglomerate Tencent caused a stir last year with the announcement that it would comply with China’s directive to incorporate facial recognition technology into its games in the country. The move was in line with China’s strict gaming regulation policies, which impose limits on how much time minors can spend playing video games – an effort to curb addictive behavior, since gaming is labeled by the state as “spiritual opium”.

The state’s use of biometric data to police its population is, of course, invasive, and especially undermines the privacy of underage users – but Tencent is not the only video game company to track its players, nor is this recent case an altogether new phenomenon. All over the world, video games, one of the most widely adopted digital media forms, are installing networks of surveillance and control.

In basic terms, video games are systems that translate physical inputs – such as hand movement or gesture – into various electric or electronic machine-readable outputs. The user, by acting in ways that comply with the rules of the game and the specifications of the hardware, is parsed as data by the video game. Writing almost a decade ago, the sociologists Jennifer Whitson and Bart Simon argued that games are increasingly understood as systems that easily allow the reduction of human action into knowable and predictable formats.
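The reduction Whitson and Simon describe is easy to picture in code. Below is a minimal, purely hypothetical sketch (every class, field, and event name is invented for illustration) of how a game loop might parse player inputs into structured, analytics-ready records:

```python
import json
import time

class TelemetryLogger:
    """Hypothetical sketch: reduce player actions to machine-readable events."""

    def __init__(self):
        self.events = []

    def log(self, player_id, action, **details):
        # Every physical input, once it passes through the game's rules,
        # becomes a timestamped, structured record.
        self.events.append({
            "player": player_id,
            "action": action,
            "ts": time.time(),
            **details,
        })

    def export(self):
        # Serialized events are what downstream analytics pipelines ingest.
        return json.dumps(self.events)

log = TelemetryLogger()
log.log("p42", "jump", x=10.5, y=3.0)
log.log("p42", "purchase", item="extra_life", price_usd=0.99)
print(len(log.events))  # → 2
```

The point of the sketch is how little friction there is: the same code path that makes the game respond to you also makes you legible as data.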

Video games, then, are a natural medium for tracking, and researchers have long argued that large data sets about players’ in-game activities are a rich resource in understanding player psychology and cognition. One study from 2012 scraped player activity data logged on the World of Warcraft Armory website – essentially a database that records all the things a player’s character has done in the game (how many of a certain monster I’ve killed, how many times I’ve died, how many fish I’ve caught, and so on) as well as their geographical location and other personal information.

The researchers used this data to infer personality characteristics (in combination with data yielded through a separate survey, and data mixed in from other sources). The paper suggests, for example, that there is a correlation between the survey respondents classified as more conscientious in their game-playing approach and the tendency to spend more time doing repetitive and dull in-game tasks, such as fishing. Conversely, those whose characters more often fell to death from high places were less conscientious, according to their survey responses.
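The statistical machinery behind such a study is not exotic: at its simplest, it is a correlation coefficient computed between survey scores and logged gameplay metrics. A toy sketch with made-up numbers (not the study’s actual data):

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: survey conscientiousness scores vs. hours logged fishing in-game.
conscientiousness = [2.1, 3.4, 3.9, 4.5, 4.8]
hours_fishing     = [1.0, 4.0, 6.5, 9.0, 12.0]

r = pearson(conscientiousness, hours_fishing)
print(round(r, 2))  # → 0.97 for this toy data
```

A coefficient near 1 is what lets researchers claim that a trait “predicts” a play style – which is also exactly why, as the next section notes, such correlations deserve skepticism.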

Ben Egliston:

Correlation between personality and quantitative gameplay data is certainly not unproblematic. The relationship between personality and identity and video game activity is complex and idiosyncratic; for instance, research suggests that gamer identity intersects with gender, racial, and sexual identity. Additionally, there has been general pushback against claims of Big Data’s production of new knowledge rooted in correlation. Despite this, games companies increasingly realize the value of collecting big data sets to gain insight into what a player likes, how they play, what they play, what they’ll likely spend money on (in freemium games), how and when to offer the right content, and how to solicit the right kinds of player feelings.

While there are no numbers on how many video game companies are surveilling their players in-game (although, as a recent article suggests – one of the few I found that was not behind a paywall – large publishers and developers like Epic, EA, and Activision explicitly state in their license agreements that they capture user data), a new industry of firms selling middleware “data analytics” tools to game developers has sprung up.

These data analytics tools promise to make users more amenable to continued consumption through the use of data analysis at scale. Such analytics, once available only to the largest video game studios – which hire scores of data scientists to capture, clean, and analyze the data, and software engineers to develop in-house analytics tools – are now commonplace across the entire industry, pitched as “accessible” tools that provide a competitive edge in a crowded marketplace by companies like Unity, GameAnalytics, or Amazon Web Services. (Although, as a recent study shows, the extent to which these tools are truly “accessible” is questionable, requiring technical expertise and time to implement.) As demand for data-driven insight has grown, so have the range of different services – dozens of tools in the past several years alone, providing game developers with different forms of insight through massive data collection.

One tool – essentially Uber for playtesting – allows companies to outsource quality assurance testing, and provides data-driven insight into the results. Another supposedly uses AI to understand player value and maximize retention (and spending, with a focus on high-spenders).

Developers might use data from these middleware companies to further refine their game (players might be getting overly frustrated and dying at a particular point, indicating the game might be too difficult) or their monetization strategies (prompting in-app purchases—such as extra lives—at such a point of difficulty).
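A hedged sketch of what that refinement loop might look like in practice – counting deaths per checkpoint in hypothetical telemetry and flagging the spikes (checkpoint names, data, and threshold are all invented):

```python
from collections import Counter

# Hypothetical death telemetry: the level checkpoint at which each recorded death occurred.
deaths = ["1-1", "1-2", "1-2", "2-1", "2-1", "2-1", "2-1", "2-1", "2-2"]

def difficulty_spikes(death_log, threshold=3):
    """Flag checkpoints where the death count meets a tuning threshold.

    A designer might soften these spots -- or, as the monetization strategy
    above suggests, surface an 'extra lives' offer exactly there.
    """
    counts = Counter(death_log)
    return [cp for cp, n in counts.items() if n >= threshold]

print(difficulty_spikes(deaths))  # → ['2-1'] with this toy data
```

The same aggregation serves two masters: it can make a game fairer, or it can locate the precise moment of frustration at which a player is most likely to pay.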

But our data is not just valuable to video game companies in fine-tuning design. Increasingly, video game companies exploit this data to capitalize user attention through targeted advertisements.

As a 2019 eMarketer report suggests, the value of video games as a medium for advertising is not just in access to large-scale audience data (such as the Unity ad network’s claim to billions of users), but through ad formats such as playable and rewarded advertisements—that is, access to audiences more likely to pay attention to an ad.

These advertisements serve numerous ends, such as facilitating user acquisition (ads for other games or apps), and increasingly, brand advertising. Similar to the approach of digital advertising giants Google and Facebook, where the data generated by platform users (clicks, swipes, likes, dislikes, purchases, movements, behaviors, interests, and so on) supposedly facilitates the placement of advertisements in front of the “right” audiences (as Unity’s executives note in a transcript of a recent quarterly earnings call), video game companies are attempting to harness the billions of interactions that take place within their games to create new revenue streams. These companies sell the eyeballs (and perhaps fingers, with playable ads) of their users to advertisers and mobilize data to best match users with advertisers based on the specifications of the advertiser or the software working on the advertiser’s behalf.

The data-richness of video games has also had an impact beyond the video game industry’s attempts to shape player attention. The logic of games is used to gamify functions and derive information that might not have been otherwise volunteered. Indeed, the study of World of Warcraft player motivation frames the value of correlating sentiment or personality with user activity around the growth of gamification in society. To better understand how and why people play games in certain ways is, as the authors suggest, to better understand how to make gamelike interfaces beyond the context of gaming more compelling.

Mick Carpenter:

But this kind of data collection has been ramping up. For instance, the Go365 health insurance app solicited information from users – such as blood glucose levels, sleep cycle, diet, whether they drink or smoke, or wider family medical histories – using gamification logics of points and rewards to develop (more profitable) personalized insurance profiles. These profiles identified categories of risk that preclude some people from certain kinds of insurance or drive up their premiums.

More revealing, a 2017 article in The New York Times explained that Uber’s driver interface used gamification techniques like rewards and points to create a “perfectly efficient system” where driver supply can meet rider demand. Crucially, the article revealed that Uber – employing both social and data scientists – optimized these systems to compel the continued labor that sustains the platform. Gosh, data collection reveals so many uses.

Beyond gamification techniques optimized using data, we are beginning to see the use of gamification techniques to generate data about worker performance. Amazon’s warehouses are reportedly beginning to gamify labor to keep workers at “Amazon pace” (somewhere between walking and jogging) – a move that quite literally resembles the plot of an episode of the dystopian television show Black Mirror. As The Washington Post has reported, high performance in these (currently optional) games – with titles like MissionRacer, PicksInSpace, Dragon Duel, and CastleCrafter – can be exchanged for “Swag Bucks, a proprietary currency that can be used to buy Amazon logo stickers, apparel or other goods.” Under the guise of gamification, it is not a stretch to imagine how workers may be further disciplined through more invasive data-veillance in order to intensify their productivity at the expense of their welfare.

Because video games are systems that translate human inputs into machine-readable data, they have been afforded important status in driving so-called “Silicon Valley innovation”. One area has been the application of games in the development of AI. In a kind of one-upmanship of chess-playing algorithms, Alphabet’s AlphaStar AI and OpenAI’s OpenAI Five were trained to play the strategy games Starcraft 2 and Dota 2, respectively – famously, besting some of the world’s top players. To do so, these AI were trained using techniques like reinforcement learning, where essentially the AI played matches against itself – churning through thousands of years’ worth of gameplay (and learning from this data) within months.
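AlphaStar and OpenAI Five are vastly more complex systems, but the core idea – an agent generating its own training data by playing against itself – can be shown at miniature scale. The sketch below (all hyperparameters arbitrary, and Nim standing in for Starcraft) uses tabular self-play learning on the game of Nim, where players alternately take 1 or 2 stones and taking the last stone wins:

```python
import random
from collections import defaultdict

Q = defaultdict(float)   # Q[(stones_left, move)] -> estimated value of that move
EPS, LR = 0.2, 0.1       # exploration rate and learning rate (arbitrary choices)

def choose(stones, explore=True):
    """Pick a move: mostly greedy on Q, occasionally random to keep exploring."""
    moves = [m for m in (1, 2) if m <= stones]
    if explore and random.random() < EPS:
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(stones, m)])

def self_play_episode():
    """One full game of the agent against itself, then learn from the outcome."""
    stones, history, player = 10, [[], []], 0
    while stones > 0:
        m = choose(stones)
        history[player].append((stones, m))
        stones -= m
        if stones == 0:
            winner = player
        player = 1 - player
    # Monte Carlo update: the winner's moves are reinforced, the loser's penalized.
    for p in (0, 1):
        reward = 1.0 if p == winner else -1.0
        for s, m in history[p]:
            Q[(s, m)] += LR * (reward - Q[(s, m)])

random.seed(0)
for _ in range(20000):
    self_play_episode()

# With optimal play, taking 1 from 4 stones (leaving the opponent a losing
# position of 3) is the right move; the learned Q-table should reflect that.
print(choose(4, explore=False))
```

The striking part mirrors the article’s point: no human gameplay data is needed at all. The agent “churns through” thousands of its own games, and the resulting Q-table is knowledge extracted purely from self-generated play.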

For these companies, learning to play video games at a high level isn’t the end goal. For a company like OpenAI, training on Dota 2 has applications to physical robotics. Darpa – the Department of Defense’s research and development arm – has sponsored efforts to use games to develop AI for military application. Gamebreaker – a project that engages both academia and industry (including defense and arms contractors like Lockheed Martin and Northrop Grumman) – aims to use video-game-playing “AI to exploit engagement models … to enable intelligent systems that could in turn enhance military strategy.”

Training AI on complex games – games that take humans thousands of hours to master – also serves to drum up support for AI, selling it to investors, policymakers, and publics as something credible amid growing criticism about exaggerated (or outright fraudulent) claims of its efficacy and veracity. If we are told that AI can master Starcraft, then it might make us feel a bit better about the prospect of AI driving a car, assessing debt, and so on.

More speculatively, video games are aligning with the development of new forms of embodied computing interfaces, serving as a training ground for technologies such as brain-computer interfaces (BCI) – augmenting brain capabilities with computation. Valve Corporation founder Gabe Newell, whose company was an early adopter of BCI in the video game industry, suggests that BCI-enabled games, built into things like future VR headsets, could well track data points telling us whether people are happy, sad, surprised, or bored. In a post I wrote last month I detailed a series of patents granted to Meta that suggest that future augmented and virtual reality headsets (where the company sees gaming as one major application) may leverage biometric data (such as gaze, in one patent) for purposes such as advertising. In this sense, not only can games be used to make inferences about us from our choices; the value proposition of embodied computing interfaces, from VR to AR to BCIs, is to provide access to the physiological processes underpinning those choices. And there are absolutely no legal constraints on any of that.

Consternation about digital technologies harvesting our data is seen and heard every day. That harvesting will continue, unabated. Yes, video games are by no means free of critique (see, for instance, the constant concerns around inciting violent behavior, or the exposure of children to gambling-like practices), but they have been less afflicted by critiques regarding data and privacy. While the criticism of China’s use of games to collect biometric data represents an awareness of how video games might enact surveillance, it is only one of hundreds of examples. And knowing that regulators and politicians are still arguing about yesterday’s (yesteryear’s?) technology threat, video game makers know their mechanisms for data extraction and accumulation can continue for a long while before being addressed.

Not everybody is resigning themselves to the fact that many games vacuum up our data. Platform alternatives for game development such as Twine or Bitsy show that it’s possible to resist the data capture imperative of engines like Unity. Where Unity’s large user base serves to power the company’s ad network, software like Twine uses its network of users to create a collaborative community developing games that directly contravene accepted video game conventions.

But they are in a minority. To really renegotiate the terms on which we play video games would be a monumental task and nobody has the will for it.

Last month, Take-Two made history by buying Zynga for $12.7 billion, only to be dwarfed immediately by Microsoft, which purchased Activision Blizzard for a record $68.7 billion in an all-cash deal that used up about half of the tech company’s cash on hand. An announcement from Sony followed soon after, with the Japanese electronics and entertainment company acquiring “Halo” creator and “Destiny” developer Bungie for $3.6 billion. Now, market analysts and industry veterans say nearly any gaming studio could be subject to a similar deal.

The Microsoft-Activision Blizzard deal comes at an interesting time for antitrust regulation. I covered this deal in detail in this post and noted the acquisition was wickedly strategic and shows the conundrum of antitrust regulation.

Antitrust concerns arise over whether a company has monopolistic control over a given market, rather than the dollar amount of a deal. Still, of all the deals, antitrust lawyers say the record $68.7 billion Microsoft deal will draw the most federal scrutiny. Microsoft is one of the world’s largest companies, valued at $2.6 trillion, and Activision is one of the most prominent gaming firms in the world with a roughly $62 billion market capitalization. AT&T and Time Warner’s $85 billion merger faced immense legal challenges and ultimately closed in 2018 after a protracted two-year battle with the Justice Department, which sought to block the deal.

On the same day the two gaming goliaths announced the deal, the FTC and DOJ launched a joint public inquiry with the goal of better detecting and preventing anti-competitive deals. Shortly after, Bloomberg reported the FTC had assumed responsibility for reviewing the Microsoft and Activision Blizzard deal. The FTC declined to comment or confirm an existing investigation. The stock market appears to be reacting to the looming specter of this investigation. Activision is trading at $80/81, and the purchase price is at $95. Investors think there’s a pretty good chance that the deal gets blocked. Otherwise the stock would be trading at $92 or something like that. 

But there is a bigger issue afoot. Yes, the gaming industry is consolidating. But Big Tech’s newfound pivot (especially Microsoft’s) toward Netflix-like subscription models means all of these services will require content. For now, back catalogues make up the bulk of video game subscription service libraries. But as the arms race heats up, companies will need fresh content, thereby necessitating the purchase of studios and libraries and walled gardens and private 1st party data silos.

We are hurtling towards an advertising ecosystem that doesn’t rely on cookies – Big Tech and the big publishers are making huge plays in the data space (see the Spotify/Joe Rogan imbroglio, for instance) to ensure their 1st party data is as rich as possible. Hence Facebook’s launch of an e-commerce platform aimed at helping businesses build a digital presence – while at the same time gaining access to even more 1st party user purchase data.

Walled gardens such as Amazon, Apple, Facebook, and Google – and now Microsoft and TikTok – are leading the way when it comes to ensuring their advertising revenue continues to grow, especially when we eventually say goodbye to 3rd party cookies. Both Google and Facebook are positioned to benefit by further forcing advertisers to use their 1st party data and ad buying tools if they want to continue reaching their audience at scale.

Next week I’ll address the Zuckerberg crying game of “Oh-Apple-is-killing-our-advertising-business!” People are missing the shift. But oh, those “Apple-is-killing-Facebook” click baits are sure fun to ReTweet! What has happened is the collapse of the traditional marketing funnel that Google and Facebook’s advertising products were mapped onto, and Apple is just one part of it. There is a bigger picture to understand.

Giant duopolies aside, the next few years will be very important for the many publishers still selling their own advertising inventory, and Spotify has drawn a very clear line in the sand between itself and other podcast hosting platforms. The Joe Rogan deal has marked the beginning of an aggressive push toward serving its own adverts within these programmes, something it could not do before.

Podcast advertising is still in its infancy, relatively speaking. But it is a brilliant model that will be adopted by every platform. Advertisers can finally put data at the beginning, middle and end of their podcasting campaigns – allowing them to make informed decisions and set goals based on who clicked, who skipped, and who eventually bought the product, rather than relying on prehistoric metrics such as offer codes and affiliate payments. These improvements in back-end ad tech in podcasting can be applied across all media. And they allow platforms to scale their ad revenue incredibly quickly, with exclusive content (such as Spotify’s “The Joe Rogan Experience”) providing a very solid foundation to do so.

And so, the elephant in the room: how much might advertising and data retreat inside silos (Apple, Facebook, Google, Spotify, TikTok, etc.) where nothing is passed around or shared, and where most of these data privacy regulations simply will not apply?

 
