Data, Privacy, & Surveillance

How data becomes power (and what to do about it)

Privacy Audit

In pairs, pick one app you both use:

  • What data does it collect? (Check privacy policy or app permissions)
  • What's it used for? (Ads? Features? Sold to third parties?)
  • What would change if you opted out?
  • Is the trade-off worth it to you?

Prepare to share one surprising finding.

The Bargain We Made

  • Free services in exchange for your behavioral data
  • The pipeline: collection → aggregation → analysis → prediction → monetization
  • Your data alone isn't worth much. Combined with everyone else's, it is.
  • The product isn't the data itself; it's predictions about what you'll do next

"If you're not paying, you're the product" - but even when you pay, you're still the product
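The pipeline above can be sketched in a few lines. This is a toy illustration, not any platform's actual system: the event data, field names, and functions are all hypothetical. The point it demonstrates is that a single event is nearly worthless, while aggregation and prediction create the monetizable value.

```python
from collections import Counter

# Toy sketch of collection -> aggregation -> analysis -> prediction.
# All field names and events are hypothetical.

def aggregate(events):
    """Aggregation: fold raw events into a per-user profile."""
    return Counter(e["topic"] for e in events)

def predict(profile):
    """Prediction: the sellable product is the prediction itself."""
    topic, count = profile.most_common(1)[0]
    return {"likely_interest": topic, "confidence": count / sum(profile.values())}

clicks = [{"topic": "running"}, {"topic": "running"}, {"topic": "news"}]
prediction = predict(aggregate(clicks))
# prediction["likely_interest"] == "running"; confidence is 2/3
```

Monetization is then selling access to that prediction (e.g., ad targeting), not selling the raw clicks.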

How Data is Collected

Method        Example                          How aware you are
Explicit      Forms, account creation          High
Behavioral    Clicks, searches, time on page   Medium
Inferred      Predictions from patterns        Low
Third-party   Data brokers, tracking pixels    Very low

Data brokers have roughly 1,500 data points on every US adult
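The "Inferred" row is the least visible, so a toy sketch may help. Everything here is hypothetical (the thresholds, field names, and signals are invented for illustration); it shows how an inferred attribute is created from behavior rather than collected from a disclosure.

```python
# Toy sketch: "inferred" data is created, not collected.
# All thresholds and field names are hypothetical.

def infer(events):
    """Derive attributes the user never explicitly provided."""
    inferred = {}
    # Behavioral signal: repeated activity between 1am and 4am
    late_night = sum(1 for e in events if 1 <= e["hour"] <= 4)
    if late_night >= 3:
        inferred["night_owl"] = True
    # Behavioral signal: search terms suggesting a life event
    if any("apartment" in e.get("query", "") for e in events):
        inferred["possibly_moving"] = True  # a high-value advertising signal
    return inferred

events = [
    {"type": "view", "hour": 2},
    {"type": "view", "hour": 3},
    {"type": "view", "hour": 1},
    {"type": "search", "hour": 14, "query": "apartment listings"},
]
print(infer(events))  # prints {'night_owl': True, 'possibly_moving': True}
```

Nothing in the output was ever typed into a form, which is why no consent dialog ever covered it.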

What is Privacy?

  • Secrecy: hiding information ("I have nothing to hide")
  • Control: deciding who sees what, when, and why
  • Contextual integrity: information flowing where you'd expect it to
  • Power: the ability to shape your own narrative

"Arguing you don't care about privacy because you have nothing to hide is like arguing you don't care about free speech because you have nothing to say." - Snowden

A Brief History of Privacy

Era     Privacy Concern          Response
1890    Photography, tabloids    Warren & Brandeis: "The Right to Privacy"
1960s   Government databases     Fair Information Practices
1970s   Credit reporting         FCRA, Privacy Act
2000s   Internet tracking        COPPA, sectoral laws
2010s   Big data, social media   GDPR, CCPA
2020s   AI, biometrics           ?

Privacy gets redefined with each new technology

Privacy Frameworks Today

Framework           Model              Key Feature
GDPR (EU)           Opt-in             Must consent BEFORE collection
CCPA (California)   Opt-out            Can stop collection after it starts
VCDPA (Virginia)    Opt-out            Data minimization + consumer rights
US Sectoral         Context-specific   HIPAA (health), FERPA (education)

The problem: US law is a patchwork; EU law sets a floor. Neither fully covers AI inference.

State Surveillance

  • Reason: National security, law enforcement, public safety
  • Tools: Communications intercept, location tracking, biometrics, predictive policing
  • Limits: Legal (warrants, oversight) and technical (encryption)
  • Danger: Safety tools can become control tools

"Surveillance infrastructure rarely shrinks. Powers granted for one crisis get normalized."

Corporate Surveillance

  • Reason: Personalization, fraud prevention, service improvement
  • Tools: Behavioral tracking, device fingerprinting, data broker networks
  • Limits: Terms of service (unread), privacy laws (with loopholes)
  • Danger: Data collected for ads gets subpoenaed, hacked, or sold

Corporate and state surveillance are merging
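One of the tools above, device fingerprinting, can be illustrated with a toy sketch. The attribute set here is an assumption for illustration; real trackers combine far more signals (canvas rendering, audio stack, installed plugins, and so on).

```python
import hashlib

# Minimal fingerprinting sketch (illustrative only): attributes a browser
# reveals freely are hashed into a stable identifier, so no cookie -- and
# no consent prompt -- is needed to recognize the same device again.

def fingerprint(user_agent, screen, timezone, fonts):
    raw = "|".join([user_agent, screen, timezone, ",".join(sorted(fonts))])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

fp = fingerprint(
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
    "2560x1440",
    "America/New_York",
    ["Arial", "Helvetica", "Menlo"],
)
# The same device yields the same ID on every site running this check,
# enabling cross-site tracking with no stored state on the device at all.
```

Because nothing is written to the device, clearing cookies does not reset the identifier; only changing the underlying attributes does.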

Case: Clearview AI

  • Scraped 30+ billion photos from social media without consent
  • Built facial recognition database sold to law enforcement
  • No oversight, no accuracy requirements, no way to correct errors
  • Banned in Australia, UK, France, Italy. Still operating in the US.

Question: If you post a photo publicly, have you consented to permanent facial recognition surveillance?

Discussion: Where's the Line?

  • Should police access corporate data without a warrant?
  • Should companies collect data they might never use?
  • Should you have the right to be "forgotten" by databases you never joined?

The Surveillance Bargain

What You Get      What You Give Up
Convenience       Autonomy
Security          Privacy
Free services     Data about yourself
Personalization   Control over your narrative

Is this a bargain you would consciously make?

Chilling Effects

Surveillance changes behavior even if you've done nothing wrong:

  • Self-censorship: avoiding certain searches, conversations, associations
  • Conformity pressure: knowing you're watched encourages blending in
  • Power asymmetry: they know everything; you know nothing about them
  • Vulnerability: any data collected can be weaponized later

"The goal of surveillance isn't to catch everyone. It's to make everyone act as if they might be caught."

Building Privacy Resilience

What you can do:

  • Use privacy tools (VPNs, encrypted messaging, browser settings)
  • Exercise data rights (CCPA/GDPR requests)
  • Be intentional about what you share

What needs to change:

  • Support comprehensive privacy legislation
  • Demand transparency from platforms
  • Design systems with privacy as default, not afterthought

Key Takeaways

  1. Data collection creates power asymmetries - those who collect know more than those they collect from

  2. Privacy is about control and power, not secrecy - "nothing to hide" misses the point

  3. Surveillance infrastructure expands - powers granted for one purpose get repurposed

  4. Chilling effects are real harms - behavior changes even without direct enforcement

  5. Resilience requires systemic change - individual action alone is insufficient

Who should have the power to watch?

========================================================================
LECTURE GUIDE: Data, Privacy, & Surveillance
========================================================================

TOTAL TIME: 45-50 minutes (without exercise) | 55-60 minutes (with exercise)

PREPARATION CHECKLIST:
- [ ] Test the YouTube video link before class
- [ ] Review recent privacy news (data breaches, new regulations, AI privacy concerns)
- [ ] Have 2-3 backup cases ready (Cambridge Analytica, NSA/Snowden, recent breaches)
- [ ] Check whether any students have made GDPR/CCPA requests - invite them to share
- [ ] Prepare whiteboard for discussion if needed

SECTION BREAKDOWN:
1. Opening Video & Hook: 5 min
2. The Data Bargain: 6 min
3. Defining Privacy: 6 min
4. Privacy History & Frameworks: 6 min
5. State vs. Corporate Surveillance: 6 min
6. Case Study (Clearview AI): 3 min
7. Discussion - Where's the Line: 4 min
8. Chilling Effects & Resilience: 6 min
9. Key Takeaways: 3 min
10. Optional Mini-Exercise: 10 min

ADAPTATION FOR 30-MIN VERSION:
- Skip the history slide
- Use only one surveillance type (state OR corporate, not both)
- Reduce discussion to 2 min
- Skip the optional exercise

KEY MESSAGES TO REINFORCE THROUGHOUT:
1. Privacy is about power and control, not hiding wrongdoing
2. The surveillance infrastructure exists - the question is who controls it
3. Individual action is necessary but systemic change is essential
========================================================================

THE BARGAIN (3 min)
- This is the deal we implicitly accepted
- Most people don't realize the scope of what they traded
- The real value isn't individual data points - it's patterns and predictions

WHAT TO SAY:
"How many of you read the full terms of service before clicking 'I agree'? [pause for laughs] Right. None of us do. But those clicks added up to the most comprehensive surveillance system in human history - and we built it voluntarily."

CONCRETE EXAMPLE:
"Your single search for 'divorce lawyer' isn't worth much. But combined with your location, purchase history, social connections, and browsing patterns, it becomes a high-value signal. Someone will pay to reach you at that vulnerable moment."

OPTIONAL STATS:
- Google processes 8.5 billion searches per day
- Facebook has 2.9 billion monthly active users
- The data from these platforms is worth ~$200 per user per year to advertisers

TRANSITION: "So how exactly does this data collection work? Let's break it down..."

COLLECTION METHODS (3 min)
- The less visible the collection, the less control you have
- INFERRED data is particularly concerning: predicted income, political leaning, health status
- You never consented to inferences - they're derived from other data

WHAT TO SAY:
"Walk through the table from top to bottom. Notice how your awareness decreases as we go down. By the time we get to inferred data, you have no idea what's being collected - because it's being created, not collected."

CONCRETE EXAMPLES BY ROW:
- Explicit: When you sign up for Netflix and enter your birthday
- Behavioral: Netflix knows you binged a show at 3am on a Tuesday
- Inferred: Netflix predicts you might be depressed based on viewing patterns
- Third-party: A data broker buys that inference and sells it to your insurance company

KEY INSIGHT: Most people think of privacy as controlling what they share. But the real issue is what's inferred from what they share.

TRANSITION: "This raises a fundamental question: what is privacy, really?"

- Prompt: What data makes this possible?
- Prompt: Who benefits, who bears the risk?

DEFINING PRIVACY (3 min)
- Privacy is NOT just about hiding wrongdoing
- CONTROL: You should decide who accesses your information
- CONTEXTUAL INTEGRITY (Nissenbaum): What you tell your doctor shouldn't go to your employer
- POWER: Privacy asymmetries create power asymmetries

WHAT TO SAY:
"When someone says 'I have nothing to hide,' ask them for their phone, unlocked. They'll hesitate. That hesitation is privacy - not because they're hiding crimes, but because they want control over their own narrative."

FACILITATION TIP: If a student makes the "nothing to hide" argument, use the Snowden quote on screen, then ask: "Do you close the bathroom door? You're not doing anything illegal in there."

CONTEXTUAL INTEGRITY EXAMPLE:
"You tell your doctor about a health condition. You expect that stays with your doctor. But what if your employer found out? What if your insurance company did? The information is the same - the context changed. That violation of expected information flow is what Nissenbaum calls a breach of contextual integrity."

TRANSITION: "Privacy as a concept has evolved over time. Let's look at how..."

HISTORY (3 min)
- Privacy law is reactive - it responds to technological change
- Warren & Brandeis wrote because cameras let newspapers photograph people without consent
- The same pattern repeats: new tech → new harms → new rules (eventually)
- We're in another transition moment with AI

WHAT TO SAY:
"Look at the pattern in this table. Every row follows the same story: new technology enables new privacy violations, people get hurt, and eventually we make rules. We're always playing catch-up."

WARREN & BRANDEIS STORY:
"In 1890, portable cameras were new. Newspapers started publishing candid photos of society figures without permission. Two Boston lawyers were so outraged they wrote an article that invented privacy law. Sound familiar? Replace 'cameras' with 'facial recognition' and 'newspapers' with 'Clearview AI.'"

KEY INSIGHT: Privacy isn't a fixed concept - it evolves with technology and social norms.

DISCUSSION PROMPT (optional, 1 min): "What will the 2030s row say? What technology will force the next privacy redefinition?"

TRANSITION: "So where does privacy law stand today? Let's look at the current frameworks..."

FRAMEWORKS (3 min)
- Don't get lost in legal details - focus on the models
- OPT-IN vs. OPT-OUT is the key distinction
- GDPR has real teeth: Meta was fined €1.2B in 2023
- The US approach leaves massive gaps between sectors

WHAT TO SAY:
"The key distinction is opt-in versus opt-out. In Europe, companies must ask before collecting. In the US, they collect first and you have to figure out how to stop them - if you even know it's happening."

VCDPA (VIRGINIA, EFFECTIVE 2023) - RELEVANT TO STUDENTS:
- Applies to businesses handling data of 100K+ Virginia residents
- Consumer rights: access, correct, delete, opt out of sale/targeted ads
- Requires data minimization (collect only what's needed)
- No private right of action - only the VA Attorney General can enforce
- Influenced other state laws (Colorado, Connecticut, Utah, etc.)
- "You're Virginia Tech students - the VCDPA covers you! Have any of you exercised these rights?"

CONCRETE COMPARISON:
- EU: "Can we collect your data?" → You say yes → Collection begins
- US: Collection begins → You discover it → You opt out (maybe) → Collection continues elsewhere

GOVERNANCE QUESTION: Should the US adopt a comprehensive federal privacy law?

TRANSITION: "Privacy frameworks set the rules. But who's actually watching? Let's look at surveillance..."

STATE SURVEILLANCE (3 min)
- Surveillance is not inherently bad - legitimate security uses exist
- But the same tools that catch terrorists can suppress dissent

WHAT TO SAY:
"I want to be clear: surveillance isn't inherently evil. We want the FBI to catch terrorists. We want police to solve crimes. The question is: what limits exist? What oversight? What happens when those tools are pointed at ordinary citizens?"

THE RATCHET PROBLEM:
- The PATRIOT Act was "temporary" - we're still living with it
- Emergency powers become permanent
- The question: how do you build security infrastructure that can't be turned against citizens?

CONCRETE EXAMPLES:
- NSA mass metadata collection (revealed by Snowden, 2013)
- Stingray devices: fake cell towers that intercept all nearby phone data
- Predictive policing: algorithms flag people as threats before any crime

INTERNATIONAL CONTRAST:
- China's social credit system: explicit state surveillance for behavior control
- US approach: less explicit, more distributed, but still pervasive

TRANSITION: "The government isn't the only one watching. Let's look at corporate surveillance..."

CORPORATE SURVEILLANCE (3 min)
- Companies often collect more data than governments
- Google knows your searches, location, emails, calendar, contacts
- Data brokers know your financial status, health conditions, political leanings

WHAT TO SAY:
"Here's the uncomfortable truth: Google probably knows more about you than the NSA does. And they didn't need a warrant - you clicked 'I agree.'"

CONCRETE EXAMPLE - WHAT GOOGLE KNOWS:
- Every search you've ever made
- Everywhere you've been (location history)
- Every email you've sent or received
- Your calendar, contacts, photos
- Every YouTube video you've watched
- Every app you've downloaded

ACTIVITY: "Go to myactivity.google.com after class. You'll be shocked."

THE BRIDGE TO STATE SURVEILLANCE:
- Geofence warrants: police ask Google for everyone near a crime scene
- Third-party doctrine: the government can request data you "voluntarily" gave companies
- The surveillance infrastructure exists. The question is who gets access.

KEY INSIGHT: "The distinction between corporate and state surveillance is increasingly artificial. When police can simply subpoena your data from Google, what's the difference?"

TRANSITION: "Let's look at a case that shows where this is heading..."

CASE: CLEARVIEW AI (3 min)
- This illustrates the collision of data, privacy, and surveillance
- You could be identified from a single photo - no consent, no notification
- Errors disproportionately affect people of color
- Weaponizable by stalkers, abusive partners, authoritarian states

WHAT TO SAY:
"Clearview AI scraped every public photo they could find - Facebook, LinkedIn, news sites, everywhere. They built a facial recognition database and sold it to police. You never consented. You were never notified. You have no way to opt out."

THE CONTEXTUAL INTEGRITY VIOLATION:
"You posted a photo on Instagram for your friends to see. Did that mean you consented to being in a permanent police database? Of course not. The context changed without your knowledge or consent."

ACCURACY PROBLEM:
- Studies show facial recognition error rates are 10-100x higher for Black faces
- Wrongful arrests have already happened (Robert Williams, Detroit, 2020)
- No accuracy standards, no oversight, no accountability for errors

HUMAN SECURITY CONNECTION:
- Personal security: stalkers and abusive ex-partners can find you
- Political security: authoritarian states can identify protesters
- Community security: over-policing of marginalized communities

GOVERNANCE QUESTION: "If you post a photo publicly, have you consented to permanent facial recognition surveillance?"
- Push for nuance: there's a difference between "publicly accessible" and "consenting to any use"

TRANSITION: "This case raises fundamental questions about where we draw lines..."

DISCUSSION: WHERE'S THE LINE? (4 min)
Format: 1 min think, 3 min share

FACILITATION TIPS:
- Don't let students stay abstract - push for SPECIFIC positions
- Call on quiet students: "What do you think?"
- Play devil's advocate if consensus forms too quickly

QUESTION-BY-QUESTION GUIDANCE:

Q1: "Should police access corporate data without a warrant?"
- Current law often allows this (third-party doctrine)
- Push: "What if it's your location data during a protest?"
- Counter: "What if it helps solve a murder?"

Q2: "Should companies collect data they might never use?"
- Current practice: "Collect everything, figure out uses later"
- Push: "What's the harm if they never use it?" → "What if they get hacked?"
- Counter: "How do you know what you'll need?"

Q3: "Should you have the right to be 'forgotten' by databases you never joined?"
- The EU says yes (GDPR right to erasure). The US says no.
- Push: "What about data brokers who have your info without you ever interacting with them?"
- Counter: "How would this even be enforced?"

IF DISCUSSION STALLS: "Let's make it personal. Raise your hand if you'd be comfortable with police accessing your full Google search history without a warrant."

These questions don't have clean answers - that's the point.

TRANSITION: "These are hard questions. But before we move on, let's look at what happens when we don't ask them..."

THE SURVEILLANCE BARGAIN (2 min)
- Most people never consciously made this choice
- It happened incrementally, one "I agree" at a time
- The question isn't whether surveillance exists - it's who controls it and for what purpose

WHAT TO SAY:
"Look at this table. Walk through each row slowly. Ask yourself: if someone had presented this trade explicitly, would you have agreed? Most people wouldn't. But we did agree - we just didn't know what we were agreeing to."

REFLECTION PROMPT (optional): "Think about one service you use daily - Instagram, Google Maps, Spotify. What would you give up to keep using it? What wouldn't you give up?"

KEY INSIGHT: The bargain was never clearly presented. Informed consent is a fiction.

TRANSITION: "Even if you're fine with this bargain personally, there are broader effects we need to consider..."

CHILLING EFFECTS (3 min)
- You don't need to arrest everyone - just make people afraid
- Self-censorship is the goal; actual enforcement is expensive
- This affects journalists, activists, minorities, anyone outside the mainstream

WHAT TO SAY:
"The most effective surveillance doesn't punish everyone - it makes everyone police themselves. If you know you might be watched, you act differently. That change in behavior is the harm, even if no one ever looks at your data."

CONCRETE EXAMPLES - ASK STUDENTS: "Have you ever NOT searched for something because you were worried about how it would look?"
- Medical symptoms you were embarrassed about
- Political topics you didn't want associated with you
- Relationship problems you didn't want to explain

RESEARCH: Studies show people change search behavior when they know they're monitored
- Fewer searches for sensitive health topics (Pew Research, 2015)
- Fewer searches for controversial political views (Marthews & Tucker, 2016)
- Wikipedia traffic to terrorism-related articles dropped 30% after the Snowden revelations

WHO IS AFFECTED MOST:
- Journalists protecting sources
- Activists organizing protests
- LGBTQ+ individuals in hostile environments
- Immigrants worried about status
- Anyone outside the mainstream

KEY INSIGHT: "This is the harm - not just data collection, but behavior modification. You become a slightly smaller version of yourself."

TRANSITION: "So what can we actually do about this?"

BUILDING PRIVACY RESILIENCE (3 min)
- Individual action matters but isn't sufficient
- Systemic change requires collective action and policy

WHAT TO SAY:
"I'm going to give you individual tips, but I want to be honest: individual action has limits. You can't privacy-tool your way out of a system designed to surveil you. That's why systemic change matters."

INDIVIDUAL TOOLS - PRACTICAL TIPS:
- Signal for messaging (end-to-end encrypted)
- DuckDuckGo or Brave for browsing
- A VPN for network privacy (but choose carefully - some VPNs log)
- Review app permissions regularly
- Exercise your CCPA/VCDPA rights: request your data, request deletion

INDIVIDUAL LIMITS - BE HONEST:
- You can't opt out of facial recognition in public
- You can't control what others post about you
- You can't prevent inference from data you didn't share
- Even if you go dark, your network reveals you (shadow profiles)

SYSTEMIC CHANGE - WHAT ACTUALLY MOVES THE NEEDLE:
- Privacy by design: build protection in, don't bolt it on
- Data minimization: collect only what's needed
- Purpose limitation: use data only for stated purposes
- Accountability: real consequences for violations (not fines as a cost of doing business)

CALL TO ACTION: "As future technologists, policymakers, and citizens, you have agency here. The systems we build today will shape privacy for decades. What will you build?"

TRANSITION: "Let's bring this together..."

KEY TAKEAWAYS (2 min)
- Read each point slowly, pausing between them
- These are the ideas students should walk away with

WHAT TO SAY: Walk through each takeaway deliberately:
1. "Data collection creates power asymmetries - those who collect know more than those they collect from. This is fundamentally about power."
2. "Privacy is about control and power, not secrecy - every time someone says 'nothing to hide,' remember: it's not about hiding. It's about controlling your own narrative."
3. "Surveillance infrastructure expands - powers granted for one purpose get repurposed. The PATRIOT Act is still with us. COVID contact-tracing infrastructure is still with us. Once built, it doesn't go away."
4. "Chilling effects are real harms - you don't need to be arrested to be harmed. Changing your behavior because you might be watched is itself a harm to your autonomy."
5. "Resilience requires systemic change - individual action alone is insufficient. We need policy, we need design changes, we need collective action."

CLOSING THOUGHT: "The question isn't whether you have something to hide. The question is whether you have something you want to control - your health information, your location, your relationships, your thoughts. Privacy is the right to a self that isn't completely transparent to power."

CLOSING QUESTION (1 min)
- Leave it on screen as students pack up
- Don't answer it - let it linger

WHAT TO SAY:
"I'm leaving you with this question. The answer isn't 'no one' - some surveillance serves legitimate purposes. We want crimes solved. We want terrorists stopped. But the question is: who decides? What accountability exists? What limits? These are the questions you'll be answering as the next generation of technologists, policymakers, and citizens."

OPTIONAL FOLLOW-UP: "Before next class, check myactivity.google.com. See what Google knows about you. Come ready to discuss."

MINI-EXERCISE (10 min: 6 work + 4 share) [OPTIONAL]
- SKIP THIS if running short on time

SETUP (30 sec):
- Have students pair up
- Pick ONE app they both use (Instagram, Spotify, TikTok, etc.)
- Point to the four questions

SUGGESTED APPS:
- TikTok: extensive data collection, Chinese ownership concerns
- Instagram: tracks location, contacts, browsing, purchases
- Spotify: listening habits, location, voice data
- Venmo: payment history is public by default

WHERE TO FIND INFO:
- iOS: Settings → Privacy → App Privacy Report
- Android: Settings → Privacy → Permission Manager
- The app's privacy policy (search for "data we collect")

SHARE-OUTS (4 min):
- Ask 3-4 pairs: "What surprised you most?"
- Look for patterns: most students don't know what's collected

DEBRIEF: "Notice how hard it was to even find this information? That's by design. Informed consent requires information, and that information is deliberately obscured."

========================================================================
POST-LECTURE NOTES
========================================================================

COMMON STUDENT QUESTIONS:

"Is it even possible to have privacy anymore?"
- Yes, but it requires effort and tradeoffs
- Point to tools (Signal, VPNs, privacy-focused services)
- Acknowledge systemic limits - individual action isn't enough

"What about encryption? Doesn't that protect us?"
- Encryption protects data in transit and at rest
- But metadata (who, when, where) is often not encrypted
- And once decrypted at the endpoint, data can be collected

"Why don't companies just... not collect so much data?"
- Business model: data = revenue (advertising, selling to brokers)
- Competitive pressure: if you don't collect, competitors will
- This is why regulation matters - it levels the playing field

"Is [China/US/EU] better or worse on privacy?"
- Avoid simplistic comparisons
- China: explicit state surveillance, less corporate privacy
- US: corporate surveillance dominant, state accesses data via companies
- EU: strongest privacy law, but enforcement varies

CONNECTIONS TO OTHER COURSE CONTENT:
- AI/ML lecture: training data, bias, consent for AI training
- Cybersecurity lecture: data breaches, security vs. privacy tradeoffs
- Economic Security: data as economic asset, surveillance capitalism
- Political Security: surveillance of activists, election interference
- Human Security framework: how privacy cuts across all dimensions

ASSESSMENT CONNECTION:
- Case study assignments can analyze specific privacy violations
- Final projects can examine privacy implications of emerging tech
- Discussion posts can reflect on personal privacy practices

RESOURCES FOR DEEPER EXPLORATION:
- Book: "The Age of Surveillance Capitalism" by Shoshana Zuboff
- Book: "Privacy's Blueprint" by Woodrow Hartzog
- Documentary: "The Social Dilemma" (Netflix)
- Tool: EFF's Privacy Badger, Cover Your Tracks
- Academic: Helen Nissenbaum's work on contextual integrity
- Report: Pew Research Center privacy surveys
- Exercise: Download your data from Google, Facebook, Apple
========================================================================