“Roe v Wade”: the culture wars are coming for tech companies. It’s only a matter of time.

The challenges will only grow as people give more information to more services, and devices collect more and more highly personal information that can be remotely accessed and controlled.

 

6 May 2022 – Yesterday I posted “America’s new abortion surveillance landscape: another brutal use of technology”, which explained how data is the big front in the war on abortion. As the line between our digital and physical selves fades, anti-abortion activists are making sophisticated updates to tried-and-true methods of stalking, harassment, and disinformation; and data brokers are having a field day.

My post struck a chord. Over 7,000 of you opened it (28% of my readers), and the read rate – how many of the people who open an email actually spend time reading it – was my second highest this year (my Ukraine War pieces still lead), with TTR (Total Time Reading) averaging 3.8 minutes. You liked it 🙂

The leaked Supreme Court draft opinion overturning Roe v Wade – which would end the right to privacy that allows women to get abortions – is also serving as a scary wake-up call to companies that hold vast stores of user data. Technology companies, including app makers and connected device companies, have a stunning array of information about people housed in their data centers. In many cases, that data is just a subpoena or warrant away from government hands.

In other cases, as I detailed in yesterday’s post, the data is actively harvested and sold to buyers ranging from governments to advertisers to academic researchers. Add connected devices into the mix – where remote access to and control of a device are possible – and companies might be forced not just to turn over data but to turn against their customers.

The problem is easy to analyse. We have no decisive regulations on what data should be considered private (please do not quote me the data protection laws of California or Connecticut or whatever state; those laws do nothing – you can drive a truck through the loopholes), no rules on how data can be accessed and by whom, and no clear rules about ownership and rights that ensure a buyer has physical control of a purchased device. So technology firms are creating new windows into their users’ lives and then letting the government peer in. In the case of connected devices, they may even let the government open the window.

If you think I’m overstating the potential for damage that could occur as technology firms and the state collide, consider some recent examples:

• After Russian troops stole $5 million worth of agricultural equipment from a John Deere dealership in Ukraine, the company apparently bricked the combines and tractors remotely, turning them into bright green and yellow agricultural sculptures. In this case, John Deere was simply remotely controlling inventory stolen from its dealership. But I can see a future where a government pressures John Deere to remotely deactivate customer equipment because it is owned by a wanted criminal, or because doing so would change the direction of an armed conflict. Meanwhile, Congress is debating, as part of the current U.S. infrastructure bill, whether new cars should have some kind of kill switch so that police could stop a car chase. If you aren’t following these debates and committee hearings (or don’t have a media team to do it for you), you are missing out – they have been fascinating.

• The Centers for Disease Control and Prevention (CDC) spent $420,000 to buy location data on millions of Americans from a private company (SafeGraph; see below). The CDC wanted to see whether people were following COVID curfews, but it also wanted to track neighbor-to-neighbor visits, visits to churches and pharmacies, and other related activities during the pandemic. Much of that data collection could be related to the agency’s auditing of lockdowns to better determine their effectiveness, but the CDC is also using location data for other, more nebulous programs. And because location data can be tied back to the addresses where a person lives or works, it isn’t anonymous, which means the CDC could theoretically identify the people breaking curfew or quarantine and get law enforcement involved.

• SafeGraph (the CDC’s new best friend) routinely sells data for very little money. The company – made famous this week by news about the reams and reams of location data it holds on people visiting abortion providers, as I detailed in yesterday’s post – said, in effect, “OK, you got us. We’ll stop selling that data.” Except that researchers say the data is still on the market, and not just through SafeGraph. As abortion becomes illegal in certain states, those states can now easily buy data to figure out exactly who may have accessed health care services – and punish them.

This underscores why so many have called for technology companies to hold less data, and to hold it for less time – and it highlights why location data, which is almost impossible to monitor, much less control, poses the biggest threat. Location data can identify an individual more easily than almost any other kind, offering layer upon layer of Personally Identifiable Information, or PII.

NOTE: PII is a broad term that is defined not by specific pieces of data, but by how data can be used to distinguish or trace an individual’s identity, either alone or when combined with other information that is linked or linkable to a specific individual. I covered a bit of that in yesterday’s post, and next week my team will publish a tutorial on why location data makes identification so easy.
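In the meantime, here is a minimal sketch of the basic idea, in Python. It assumes a hypothetical CSV of “anonymized” pings with made-up column names (device_id, timestamp, lat, lon); the file name, the rounding and the overnight window are all illustrative, not anything SafeGraph or the CDC actually uses. The point is simply that the single most common overnight location for a device is, very often, a home.

```python
# Minimal sketch: why "anonymous" location pings behave like PII.
# Assumes a hypothetical CSV of pings with columns: device_id, timestamp (ISO 8601), lat, lon.
import csv
from collections import Counter, defaultdict
from datetime import datetime

def likely_home_cells(path):
    """Map each device to its most common overnight grid cell (~100 m)."""
    overnight = defaultdict(Counter)  # device_id -> Counter of rounded (lat, lon) cells
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["timestamp"])
            if ts.hour >= 22 or ts.hour < 6:  # keep only pings recorded overnight
                cell = (round(float(row["lat"]), 3), round(float(row["lon"]), 3))
                overnight[row["device_id"]][cell] += 1
    # The most common overnight cell per device is, very often, where that person sleeps.
    return {device: cells.most_common(1)[0][0] for device, cells in overnight.items()}

if __name__ == "__main__":
    for device, cell in likely_home_cells("pings.csv").items():
        print(f"{device} most often spends the night near {cell}")
```

Cross-reference that cell against property records or a voter roll and the “anonymous” device has a name.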

Yes, there are several different regulatory frameworks for dealing with PII – laws that govern how companies use it – but they are organized by industry, so coverage varies. The fact is that location data escapes all of those PII frameworks, which leaves us relying on the tech companies that collect it to treat it as PII and protect it. They do not. Today it is treated as an asset, sold to companies seeking demographic data or other information.

NOTE: one suggestion that is always out there is that consumers can still enjoy location-based apps, but should pay for them so that tech companies do not need to subsidize them with data sales. Sure. That will fly.

Worse, law enforcement agencies that know they have no legal right to certain data – and therefore no power to compel or subpoena it – simply go around that restriction and buy the data they need from third parties. We either need to eliminate that option or create rules around how, exactly, law enforcement can use privately purchased data. But relax. Given the commitment and intensity regulators have shown in getting hold of these data protection issues to protect their citizens, no doubt we’ll soon see … oh, hold on.

On a serious note, these challenges will only grow as people give more and more information to more and more services, and as devices collect more and more highly personal information that can be remotely accessed and controlled. Without laws governing data use and collection, tech firms will become complicit in enforcing laws – prosecuting women for having abortions, outing LGBTQIA+ students to their families – that their employees and many of their users might find unfair. That’s not a fun (or profitable) position for them to be in, and it is becoming more and more uncontrollable.
