US Healthcare Marketplaces Shared Citizenship + Race Data With Ad Tech Giants

Multiple U.S. health insurance marketplaces have been sharing citizenship status, race, and other sensitive demographic data with major ad tech companies including Google, Meta, and TikTok, according to a TechCrunch investigation published yesterday. The disclosed data flows, which involve state-run equivalents of healthcare.gov in California, New York, and several other states, represent one of the most significant healthcare-privacy disclosures of 2026 and likely violate both HIPAA-adjacent privacy expectations and state-level health data protection laws.
The investigation, conducted with cooperation from privacy researchers at Mozilla and EFF, found that healthcare marketplace web pages embedded standard ad-tracking pixels and JavaScript SDKs from major ad networks. These trackers received URL parameters and form-submission data that included sensitive demographic markers — citizenship status, ethnicity, ZIP code, age, and in some cases income range. The data sharing happened automatically as part of standard tracker installation, not through any explicit consent flow with users.
What was actually shared
Three categories of data flowed to ad tech companies. First, identity-correlated demographics: when a user submitted a marketplace eligibility form, the URL parameters (which appeared in standard tracker requests) contained submitted citizenship status, race/ethnicity selections, and in some cases medical-condition disclosures. These data points reached Google Analytics, Meta Pixel, and TikTok's tracking infrastructure. Second, insurance-shopping behavior: which plans a user viewed, which plan tiers they considered, and which medical-coverage categories they prioritized. Third, session-level data: time on page, navigation patterns, return visits — standard behavioral data that, when combined with demographic data, produces highly identifiable user profiles.
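The first leak path is mechanical rather than deliberate: a standard tracking pixel reports the full page URL with each event, so any form answer that ends up in a query parameter travels to the ad network by default. A minimal sketch of that mechanism follows; the domain and parameter names are hypothetical illustrations, not the actual marketplace fields.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical results URL after an eligibility form submission.
# Domain and parameter names are illustrative, not the real ones.
page_url = (
    "https://marketplace.example.gov/eligibility/results"
    "?citizenship=us_citizen&ethnicity=hispanic_latino&zip=90210&age=34"
)

# Tracking pixels typically attach the full page URL to every event,
# so each query parameter reaches the ad network automatically.
leaked = {k: v[0] for k, v in parse_qs(urlparse(page_url).query).items()}
print(leaked)
# → {'citizenship': 'us_citizen', 'ethnicity': 'hispanic_latino', 'zip': '90210', 'age': '34'}
```

No tracker has to "ask" for these fields; simply loading the pixel on a results page whose URL encodes form answers is enough to transmit them.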
The systems affected reportedly include healthcare.gov (operated by the federal government for states without their own marketplaces) and at least 11 state-operated marketplace sites including Covered California, NY State of Health, Pennie (Pennsylvania), and Maryland Health Connection. Based on aggregate marketplace traffic estimates, more than 30 million U.S. residents may have had sensitive data shared.
Legal and regulatory implications
The disclosure puts these marketplaces in direct conflict with multiple regulatory frameworks. HIPAA's Privacy Rule requires an explicit business associate agreement with any third party that receives protected health information on a covered entity's behalf; standard ad tracker integrations don't meet that requirement. State-level laws, particularly California's CCPA and CPRA plus several state health-privacy statutes modeled on HIPAA, create additional liability exposure. HHS's Office for Civil Rights has reportedly opened preliminary inquiries into the federal healthcare.gov implementation.
The political angle is especially sensitive given the citizenship-status data. U.S. citizenship status combined with healthcare marketplace usage produces immigration-relevant data profiles that could plausibly be used for enforcement actions if obtained by federal immigration authorities. While the immediate sharing was to commercial ad tech rather than immigration enforcement, the data flow creates structural vulnerability that civil-rights organizations are actively flagging.
My Take
This is a serious institutional failure that wasn't caused by malicious intent — it was caused by insufficient privacy review of standard web infrastructure. Healthcare marketplaces deployed the same Google Analytics, Meta Pixel, and TikTok tracking SDKs that every other website uses, without specifically auditing what data flows to those trackers via URL parameters and form-submission mechanics. The result is mass-scale unintentional disclosure of sensitive demographic data to commercial ad infrastructure.
The remediation is straightforward but politically uncomfortable: healthcare marketplaces should not embed third-party ad trackers, period. Standard analytics can be replaced with privacy-respecting alternatives (Plausible, Fathom, or self-hosted solutions); user-acquisition tracking can use server-side attribution rather than client-side pixels. Both alternatives produce slightly worse marketing analytics in exchange for substantially better privacy posture.
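Even where some analytics remain, the most sensitive leak path can be closed by scrubbing query parameters server-side before any URL is logged or forwarded. A minimal sketch, assuming a hypothetical denylist of field names (a real deployment would audit every form field rather than rely on a static list):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical denylist of sensitive field names (illustrative only).
SENSITIVE_PARAMS = {"citizenship", "ethnicity", "race", "income", "age", "zip"}

def scrub_url(url: str) -> str:
    """Drop sensitive query parameters before the URL is logged or
    forwarded to any analytics backend."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SENSITIVE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

url = ("https://marketplace.example.gov/eligibility/results"
       "?plan_tier=silver&citizenship=us_citizen&ethnicity=hispanic_latino")
print(scrub_url(url))
# → https://marketplace.example.gov/eligibility/results?plan_tier=silver
```

Denylists are inherently fragile, which is why the stronger recommendation above is to avoid embedding third-party trackers at all rather than to sanitize what they see.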
The deeper structural lesson is that privacy-by-default on government health platforms requires an explicit prohibition on third-party trackers, not just compliance review. Compliance review caught the obvious violations but missed the ad-tracker pattern entirely; future government health platform deployments should ban third-party trackers outright rather than allow them contingent on review.
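Such a prohibition can also be enforced technically, not just contractually. One sketch, assuming a first-party-only policy is acceptable: a Content-Security-Policy header restricting scripts, connections, and images to the site's own origin, which makes the browser itself refuse to load Google Analytics, Meta Pixel, or TikTok trackers even if a page accidentally embeds them.

```python
# A hypothetical "first-party only" Content-Security-Policy header value.
# Each directive limits a resource class to the site's own origin, so
# third-party tracker scripts, beacon requests, and pixel images are
# blocked by the browser regardless of what the page markup contains.
FIRST_PARTY_ONLY = "; ".join([
    "default-src 'self'",   # fallback for any resource type not listed
    "script-src 'self'",    # blocks third-party tracking SDKs
    "connect-src 'self'",   # blocks beacon/XHR calls to ad networks
    "img-src 'self'",       # blocks 1x1 tracking pixels
])
print(FIRST_PARTY_ONLY)
```

Served as `Content-Security-Policy: <value>`, this turns "no third-party trackers" from a policy document into a property the platform enforces on every page load.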
What this means for healthcare privacy and ad tech
Three implications. First, expect HHS Office for Civil Rights enforcement actions within 90 days against the most clearly affected marketplace operators. State attorneys general are likely to follow with parallel investigations. Second, expect major ad tech companies to face renewed scrutiny over their handling of healthcare-adjacent data — Meta has previously been sanctioned for similar issues, and Google's healthcare AI initiatives create additional liability surface. Third, expect government health platform vendors to face new procurement requirements excluding any third-party tracker infrastructure.
For affected individuals, the practical recommendation is to treat any data submitted to U.S. healthcare marketplaces over the past 24+ months as potentially shared with commercial ad infrastructure. Identity monitoring, ad-targeting opt-outs, and where applicable, formal complaints to OCR are reasonable responses. Civil rights organizations are likely to organize coordinated complaint campaigns over the next 30-60 days.
Frequently Asked Questions
What data was shared?
Citizenship status, race/ethnicity, ZIP code, age, income range (in some cases), insurance shopping behavior, and session-level data. Shared automatically via standard ad tracker integrations rather than through explicit user consent.
Which platforms are affected?
Healthcare.gov (federal) plus at least 11 state marketplaces including Covered California, NY State of Health, Pennie (Pennsylvania), and Maryland Health Connection. Estimated 30+ million U.S. residents potentially affected.
Is this illegal?
Likely. The integrations appear to violate HIPAA's business associate requirements, California's CCPA/CPRA, and several state health-privacy laws modeled on HIPAA. A federal regulatory inquiry is underway.
What should I do if I used these platforms?
Treat your submitted data as potentially shared with commercial ad infrastructure. Consider identity monitoring, ad-targeting opt-outs (Google's Ad Settings, Meta's Off-Facebook Activity, TikTok's ad preferences), and filing a formal complaint with the HHS Office for Civil Rights if you have specific privacy concerns.
The Bottom Line
U.S. healthcare marketplaces sharing citizenship and race data with ad tech companies is a serious institutional privacy failure affecting 30+ million Americans. HHS OCR enforcement and state AG investigations are likely over the next 90 days, with structural remediation requirements for government health platforms that should permanently exclude third-party tracker infrastructure.
Related Articles
- Instructure Canvas Data Breach
- OpenAI Staff Flag Violence-Reporting Failures
- Maryland Bans AI Grocery Surveillance Pricing