The 23andMe saga: A cautionary tale about the need for AI data privacy laws
Oct. 1, 2024 — There was a time not long ago when the ancestry tests offered by 23andMe were the happening thing. The Silicon Valley-based company sold test kits that allowed consumers to spit in a tube and send in the sample for a DNA test that would reveal their ancestral heritage.
The kits sold in the millions and were often given as birthday or holiday gifts. Find out if you’ve got some unknown ethnicity in your background: Super fun!
Only… when consumers volunteered their DNA, they were handing over their most personal data to a private corporation.
Even during 23andMe's heyday in the 2010s, some voices urged caution. "The problem is that when you send away a tube of your spit or a cheek swab, you are giving away your full genetic code," wrote Maggie Fox of NBC News.
“It’s the most valuable thing you own,” said Peter Pitts of the Center for Medicine in the Public Interest.
Your DNA as a digital asset for AI developers
Today, 23andMe stands on the verge of bankruptcy. Kristen V. Brown lays out the situation in The Atlantic:
23andMe is not doing well. Its stock is on the verge of being delisted. It shut down its in-house drug-development unit last month, only the latest in several rounds of layoffs. Last week, the entire board of directors quit, save for Anne Wojcicki, a co-founder and the company’s CEO. Amid this downward spiral, Wojcicki has said she’ll consider selling 23andMe—which means the DNA of 23andMe’s 15 million customers would be up for sale, too. 23andMe’s trove of genetic data might be its most valuable asset.
As Brown points out, 23andMe isn’t bound by HIPAA, the federal health privacy law. “The company’s privacy policies make clear that in the event of a merger or an acquisition,” she writes, “customer information is a salable asset.”
At this point it’s hard to imagine a scenario in which 23andMe’s trove of genetic data isn’t sold to a data broker or an artificial intelligence developer. AI companies, as has been well documented, are thirsty for fresh data to train their models. At 38 cents per share, 23andMe has a market capitalization of roughly $192 million. That’s pocket change for AI developers like Microsoft, OpenAI, and Meta. Microsoft alone has $75 billion in cash on hand.
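For scale, a quick back-of-the-envelope check of those figures (the share count here is simply what the stated price and market cap imply, not a number from the company): $192 million ÷ $0.38 per share ≈ 505 million shares outstanding. Put another way, buying the entire company at its current price would cost an acquirer roughly a quarter of one percent of Microsoft's reported cash.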
The risk of zero regulation
As with so much in AI, there’s potential for both amazing discovery and profound harm here. 23andMe’s genetic database, used as AI training data, could eventually produce life-saving medical discoveries. But it might also be used to deny health insurance to millions of people, reject their job applications, or inspire digital blackmail scams.
At the Transparency Coalition, we're working to create AI guardrails that foster positive innovation while protecting individuals and society against potential harm. That's why our education and advocacy work treats digital privacy as a critical element of AI transparency. Individuals shouldn't just have the legal right to remove their personal data from AI training datasets; we believe using personal data to train AI models should require affirmative opt-in consent from the individuals who own that data.
The 15 million consumers who mailed their samples to 23andMe did so with the understanding that they were making a one-time purchase: Their dollars in exchange for a DNA-based ancestry report. Years later, they are learning that their most personal, most private information, the very genetic code that makes them them, has been turned into a commodity to be sold on the open market to the highest bidder.
That’s not right. And that’s why we’re fighting for positive change.