It's an open question how to get the masses to care…
Unfortunately, if other people don't protect their privacy, it affects those who do, because we're all connected (e.g. family members, friends). So the problem becomes: how do you get people who don't care to care?
I started the Rebel Tech Alliance nonprofit to try to help with this, but we’re still really struggling to convert people who have never thought about this.
(BTW you might need to refresh our website a few times to get it to load - no idea why… It does have an SSL cert!)
So I hope we can have a useful discussion here - privacy is a team sport, how do we get more people to play?
Great cause, and one that reaches to the heart of much of the governmental and societal disruption that's happening. It's a complex and nuanced issue that will likely take multiple prongs and a long time to resolve.
Let me start by again generally agreeing with the point. Privacy is necessary for reasons beyond the obvious ones. I'm preaching to the choir here in a privacy community, but I think it's worth listing the reasons, as I understand them, why Americans are generally dismissive of the need for privacy protections. I cheated here and used an LLM to help, but I think these points are indicative of what has to be overcome.
Convenience > confidentiality. Nearly half of U.S. adults (47%) say it's acceptable for retailers to track every purchase in exchange for loyalty-card discounts, illustrating a widespread "deal first, data later" mindset. Pew Research Center
“Nothing to hide.” A popular refrain equates privacy with secrecy; if you’re law-abiding, the thinking goes, surveillance is harmless. The slogan is so common that rights groups still publish rebuttals to it. Amnesty International
Resignation and powerlessness. About 73% feel they have little or no control over what companies do with their data, and 79% say the same about government use—attitudes that breed fatalism rather than action. Pew Research Center
Policy-fatigue & click-through consent. Because privacy policies are dense and technical, 56% of Americans routinely click "agree" without reading, while 69% treat the notice as a hurdle to get past, not a safeguard. Pew Research Center
The privacy paradox. Behavioral studies keep finding a gap between high stated concern and lax real-world practice, driven by cognitive biases and social desirability effects. SAGE Journals
Market ideology & the “free-service” bargain. The U.S. tech economy normalizes “free” platforms funded by targeted ads; many users see data sharing as the implicit cost of innovation and participation. LinkedIn
Security framing. Post-9/11 narratives cast surveillance as a safety tool; even today, 42% still approve of bulk data collection for anti-terrorism, muting opposition to broader privacy safeguards. Pew Research Center
Harms feel abstract. People worry about privacy in the abstract, yet most haven’t suffered visible damage, so the risk seems remote compared with daily conveniences. IAPP
Patchwork laws. With no single federal statute, Americans face a confusing mix of state and sector rules, making privacy protections feel inconsistent and easy to ignore. Practice Guides
Generational normalization. Digital natives are more comfortable with surveillance; a 2023 survey found that 29% of Gen Z would even accept in-home government cameras to curb crime. cato.org
Having listed the elements to overcome, it's easy to see why this feels like a sisyphean task in American society. (It is similar, though not identical, in other Global North societies. The US desperately needs change, as is evident with the current administration.) Getting to your question, though: I feel the most persuasive points to convey are not those above, but the concrete ways a lack of privacy harms individuals.
Political micro-targeting & democratic drift
Platforms mine psychographic data to serve bespoke campaign messages that exploit confirmation bias, social-proof heuristics, and loss-aversion—leaving voters receptive to turnout-suppression or “vote-against-self-interest” nudges. A 2025 study found personality-tailored ads stayed significantly more persuasive than generic ones even when users were warned they were being targeted. Nature
Surveillance pricing & impulsive consumption
Retailers and service-providers now run “surveillance pricing” engines that fine-tune what you see—and what it costs—based on location, device, credit profile, and browsing history. By pairing granular data with scarcity cues and anchoring, these systems push consumers toward higher-priced or unnecessary purchases while dulling price-comparison instincts. Federal Trade Commission
Dark-pattern commerce & hidden fees
Interface tricks (pre-ticked boxes, countdown timers, labyrinthine unsubscribe flows) leverage present-bias and choice overload, trapping users in subscriptions or coaxing them to reveal more data than intended. Federal Trade Commission
Youth mental-health spiral
Algorithmic feeds intensify social-comparison and negativity biases; among U.S. teen girls, 57% felt "persistently sad or hopeless" and nearly 1 in 3 considered suicide in 2021—a decade-high that public-health experts link in part to round-the-clock, data-driven social media exposure. CDC
Chilling effects on knowledge, speech, and creativity
After the Snowden leaks, measurable drops in searches and Wikipedia visits for sensitive topics illustrated how surveillance primes availability and fear biases, nudging citizens away from inquiry or dissent. Common Dreams
Algorithmic discrimination & structural inequity
Predictive-policing models recycle historically biased crime data (representativeness bias), steering patrols back to the same neighborhoods; credit-scoring and lending algorithms charge Black and Latinx borrowers higher interest (statistical discrimination), entrenching wealth gaps. American Bar Association; Robert F. Kennedy Human Rights
Personal-safety threats from data brokerage
Brokers sell address histories, phone numbers, and real-time location snapshots; abusers can buy dossiers on domestic-violence survivors within minutes, exploiting the “search costs” gap between seeker and subject. EPIC
Identity theft & downstream financial harm
With 1.35 billion breach notices issued in 2024 alone, stolen data fuels phishing, tax-refund fraud, bogus credit-card openings, and years of credit-score damage—costs that disproportionately hit low-information or low-income households. ITRC
Public-health manipulation & misinformation loops
Health conspiracies spread via engagement-optimized feeds that exploit negativity and emotional-salience biases; a 2023 analysis of Facebook found antivaccine content became more politically polarized and visible after the platform’s cleanup efforts, undercutting risk-perception and vaccination decisions. PMC
Erosion of autonomy through behavioral “nudging”
Recommendation engines continuously A/B-test content against your micro-profile, capitalizing on novelty-seeking and variable-reward loops (think endless scroll or autoplay). Over time, the platform—rather than the user—decides how hours and attention are spent, narrowing genuine choice. Nature
National-security & geopolitical leverage
Bulk personal and geolocation data flowing to data-hungry foreign adversaries opens doors to espionage, blackmail, and influence operations—risks so acute that the DOJ’s 2025 Data Security Program now restricts many cross-border “covered data transactions.” Department of Justice
Social trust & civic cohesion
When 77% of Americans say they lack faith in social-media CEOs to handle data responsibly, the result is widespread mistrust—not just of tech firms but of institutions and one another—fueling polarization and disengagement. Pew Research Center
And one last point: all of these harms stem from the way we as humans are built. Although we are capable of rational thought, we often do not make rational decisions. Those decisions are shaped by cognitive biases we all have, and are affected by context, environment, input, etc. It's possible to overcome this lapse in rational judgement through processes such as the scientific method. So we as citizens and humans can build institutions that help us account for our individual biases and overcome these biological challenges, while still enjoying the benefits and remaining human.