Apple’s fertility-tracking watch and the post-Roe world.

source link: https://slate.com/technology/2022/09/apple-privacy-fertility-tracking-roe.html

Apple Is Using Its Reputation for Protecting Privacy to Invade Your Privacy

Sept. 23, 2022, 8:00 AM
Photo illustration by Natalie Matthews-Ramo/Slate. Logo by Apple.

It checks whether you’re swimming the backstroke or with a kickboard. It checks if you’re buying spaghetti at the grocery store. It also checks for temperature changes to estimate when you’re ovulating. At least, that’s how Apple advertises its newly released Apple Watch Series 8—the latest in the company’s line of smart wearable devices.

Apple has built its brand on privacy, and there are many good reasons for this perception. Years ago, for example, it built end-to-end encryption into iMessage, which ensures that only the sender and receiver can see a message. It famously touted its security focus when it refused to help the FBI open the San Bernardino shooter’s iPhone (although it maintains a working relationship with the FBI). And unlike Google, Facebook, and some other tech giants, it doesn’t make most of its revenue from online advertising.

But having a pro-privacy brand doesn’t resolve real privacy questions. If a company is collecting deeply invasive data in the first place, then corporate goodwill is the only thing preventing abuse—and corporate goodwill, whether Apple’s or any other company’s, can only go so far.

This problem hardly began with smart watches and menstrual cycles. In April 2021, Apple released the AirTag, a portable location tracker that it said “helps keep track of and find the items that matter most with Apple’s Find My app.” The company put the product’s privacy and security features front and center. The second sentence of Apple’s press release stated that AirTags can “help locate a lost item, all while keeping location data private and anonymous with end-to-end encryption.” Kaiann Drance, Apple’s vice president of worldwide iPhone product marketing, emphasized the AirTag’s “built-in privacy and security features.” Protecting users’ information was portrayed as part and parcel of the company’s ethos.

Immediately after Apple released AirTags, however, data privacy experts warned that stalkers could weaponize the product. “AirTags are easy to hide in a target’s bag or car,” Albert Fox Cahn and Eva Galperin wrote, “giving an abuser an easy way to track their location.” Apple had created a mechanism to warn people with iPhones about unknown AirTags in their vicinity, the authors wrote, but if you don’t check your iPhone, or if you don’t have an iPhone (like half of America), you’re out of luck. The sound AirTags make when separated from their owner for 72 hours, Cahn and Galperin added, was only 60 decibels, “about the same volume as a dishwasher or casual conversation.” Around the same time, a Washington Post reporter tested all this by allowing a colleague to follow him—and found the audible alarm unnoticeable.

Apple responded in July 2021 by changing the beep interval, so AirTags would make a sound if separated from their owner’s device for somewhere between eight and 24 hours. (The beep would be timed randomly within that window.) It also said it would develop an Android application people could download to scan for unknown AirTags nearby (which it did; the app is called Tracker Detect). But the grave risks persisted: As Galperin pointed out to the Post, someone could simply use a person’s own AirTag to stalk them, which would not trigger the iPhone alerts. An abusive individual with access to a person’s phone could also simply turn off “item safety alerts” in that person’s iPhone settings. Defending its product, Apple kept leaning into its privacy narrative. In February 2022 it said “we design our products to provide a great experience, but also with safety and privacy in mind” and noted that it reminds people setting up an AirTag not to stalk others (literally).

Despite this stern warning, numerous women have reported being stalked with AirTags since the product’s launch. In many cases, the stalker was an ex or someone already familiar with their life, which also increases the likelihood that an abusive individual has access to the targeted person’s device. For all that Apple might cooperate with law enforcement on investigations into AirTag-related stalking, the product remains on store shelves, and serious privacy and physical safety threats persist.

Apple’s recent decision to embed fertility tracking in the Watch Series 8 has a similar flavor of leaning on a pro-privacy brand to sideline real privacy concerns. The watches, according to the company, will track people’s body temperatures through two sensors to predict ovulation. Users who already track their cycle through the iPhone’s Health app or the Apple Watch’s Cycle Tracking app can also get alerts about deviations in their menstrual cycle. Executives have pitched the monitoring technology as a pregnancy support system, too: “If you’re trying to conceive, knowing if and when you ovulated can inform your family planning with your health care provider,” Sumbul Desai, Apple’s vice president of health, said at its launch event.

The fertility and cycle tracking both come with some privacy and security protections, such as encrypting health data synced to iCloud both in transit and on Apple’s servers. Apple highlighted this fact when debuting its new devices, and that is certainly a good move. [Update, Sept. 24, 2022: After publication of this article, Apple said in an email that phones locked with a passcode encrypt all Health app fitness and health data (but not the user’s Medical ID), and that if you use Cycle Tracking with two-factor authentication enabled, health data synced to iCloud is end-to-end encrypted, meaning Apple does not hold the decryption key and cannot read the data.] But it doesn’t change the fact that a multinational technology company—with, by some estimates, more than 100 million watch users worldwide—is proposing to track period information for people with the capacity to get pregnant. It doesn’t change the fact that said company is doing so, and thought it was a good idea to do so, after the overturning of Roe v. Wade in the United States, amid a fast-growing surveillance threat to women and other people who can become pregnant. Some Apple customers might enjoy the ovulation-tracking feature, and they might choose to use it even knowing how sensitive the collected data is. At the same time, companies have struggled to respond to the Dobbs decision and to handle questions about potential law enforcement requests for pregnancy-related data. Despite this uncertainty, Apple is rolling out a tracking feature that might exacerbate these risks.
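
To make that distinction concrete, here is a minimal sketch, in Python with the widely used cryptography package, of what “end-to-end” means in practice: the key is generated and kept on the device, so the copy synced to the cloud is ciphertext the server cannot read. The record format and the basal_temp_c field are invented for illustration; this shows the general technique, not Apple’s actual implementation.

    # Minimal sketch of end-to-end-style (client-side) encryption:
    # the key never leaves the device, so a sync server stores only
    # ciphertext it cannot decrypt. Illustrative only; the record
    # format and field names are invented, not Apple's design.
    from cryptography.fernet import Fernet

    device_key = Fernet.generate_key()  # generated and kept on the device
    cipher = Fernet(device_key)

    # A hypothetical health record as it exists on the device.
    record = b'{"date": "2022-09-23", "basal_temp_c": 36.7}'

    # What the cloud actually stores: an opaque, authenticated token.
    uploaded = cipher.encrypt(record)
    assert record not in uploaded  # the plaintext never appears server-side

    # Only a device holding device_key can recover the data.
    assert cipher.decrypt(uploaded) == record

The practical consequence is that, under this model, a breach of the server or a legal demand served on the provider yields only ciphertext; the tradeoff is that if the device key is lost, the provider cannot recover the data either.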

Apple’s investments in privacy and cybersecurity protections are better for users than little investment at all, and the company has been more vocal than some peers about privacy issues. But these technical measures don’t eliminate the ever-present risk that a company is hacked—and that highly sensitive information it didn’t have to collect in the first place, like menstrual health data, is leaked and exploited. (While the Apple Watch appears to allow users to keep health data on their device, people with both a watch and an iPhone will almost certainly want to sync the data through iCloud, especially given the iPhone’s frequent prompts for users to make iCloud backups.) Privacy brand narratives and cybersecurity investments also don’t eliminate the risk that law enforcement agencies, such as those enforcing anti-abortion laws, approach a company with legal demands to access health data. Nor do they change the fact that highly sensitive information about a person’s body is collected in the first place—a step often glossed over when the public and policy conversation takes collecting data as a given and spends the rest of the discussion arguing over how to store, transfer, and use it.

Narratives of “it’s fine if we collect it, because we’re pro-privacy” may be grounded in real differences between how a company and its competitors handle users’ information. But they obscure the fact that harms can still occur, that companies don’t have to collect data just because doing so is financially lucrative, and that people are still relying on corporate goodwill to prevent data abuse. Decisions like tracking people’s menstrual information still generate serious risks—and a corporate privacy brand doesn’t change that fact.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.

