Incomplete List of Information Privacy Properties
Control/consent, e.g. Cambridge Analytica exfiltrating FB users' data
Anonymity, e.g. Snowden leaking NSA docs without revealing his identity
Limits on data collection, e.g. laws restricting government surveillance
Limits on data use, e.g. US Genetic Information Nondiscrimination Act of 2008
Under GINA, health insurers may not use a client's (or their family's) genetic information to inform their policies. The military is exempt, however, as are other kinds of insurance such as long-term-care, life, and disability insurance. Employers (except the military) are prohibited from making decisions based on genetic information. As of 2020, Florida no longer exempts long-term-care, life, and disability insurers.
Right to be left alone, e.g. spam as a privacy problem
Avoid “creepy” tracking, e.g. Target’s inference of pregnancy from shopping records. Theories such as Contextual Integrity try to make the notion of creepiness more precise.
Example contextual informational norm: US residents [Data Subject, Sender] are required by law to file tax returns [Information Type] with the IRS [Recipient] under conditions of strict confidentiality [Transmission Principle]. The IRS shouldn’t provide Alice’s tax returns to a journalist, because that violates the transmission principle under which the data was obtained.
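The five-parameter norm above can be sketched as a simple membership check: a concrete information flow is acceptable only if it matches an established norm of its context in every parameter. This is a minimal illustration, not an implementation from Contextual Integrity literature; all class and function names are made up for this example.

```python
# Illustrative sketch of a contextual-integrity check.
# A norm (and a concrete flow) is a 5-tuple of Nissenbaum's parameters;
# a flow is flagged as a violation if no norm of the context permits it.
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    sender: str
    subject: str
    recipient: str
    info_type: str
    transmission_principle: str

def flow_permitted(flow: Flow, norms: list[Flow]) -> bool:
    """A flow is permitted only if some norm matches it in all five parameters."""
    return any(flow == norm for norm in norms)

# The tax-return norm from the example above.
norms = [Flow("US resident", "US resident", "IRS", "tax return", "confidentiality")]

# Alice filing her own return with the IRS matches the norm.
filing = Flow("US resident", "US resident", "IRS", "tax return", "confidentiality")

# The IRS forwarding Alice's return to a journalist changes the sender,
# recipient, and transmission principle, so no norm permits it.
leak = Flow("IRS", "US resident", "journalist", "tax return", "publication")
```

Real formalizations are richer (norms quantify over roles and contexts rather than exact strings), but the same parameter-matching idea underlies them.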
Real-life example: App Annie, an alternative-data provider, was charged with securities fraud by the SEC. App Annie promised the apps it analyzed that their data would be aggregated and anonymized before being fed to a statistical model that generated estimates of app performance. These estimates were sold to hedge funds, who used them to inform trading decisions. However, to improve poor estimates from the model, App Annie had engineers manually alter estimates to bring them closer to the apps' actual performance metrics. App Annie settled for $10m.
Because privacy is a social construct and means different things in different contexts, it’s hard to distill a set of universally applicable privacy properties (analogous to Confidentiality, Integrity and Availability for Security).
Limits of the CS Approach to Privacy
It is hard to formalize some privacy properties, e.g. limits on data use. It’s even harder to enforce them technically.
There’s a misalignment of incentives between the communicating parties.
So far, the focus has been on a narrow set of threat models, e.g. little/no research on privacy in domestic contexts.
The above limitations make legal approaches to protecting privacy very important.
Privacy in US Law
Constitutional law: 4th amendment (search & seizure)
Sectoral laws: HIPAA (1996), VPPA (1988), GLBA (1999)…
HIPAA empowers patients with rights to their health information and sets rules and limits on who can access it. VPPA makes “video tape service providers” liable for damages of at least $2,500 for disclosing rental information outside the ordinary course of business. GLBA allowed commercial banks, investment banks, and insurance companies to consolidate, but requires them to notify consumers of their information-sharing practices and to let consumers opt out of having their information shared with nonaffiliated third parties.
Laws applying to government actors: ECPA (1986), FISA (1978)
Regulations by federal agencies (FTC, FCC, …)
State laws: CalOPPA (2015), CCPA (2018), data breach laws…
CalOPPA requires commercial websites to post a privacy policy and stipulates additional requirements for handlers of personally identifiable information (PII) of California residents. CCPA requires businesses that collect consumers' personal data to let California residents access and delete that data, and to opt out of its sale.
Anti-hacking laws (CFAA (1986), …)
Breach of contract
There’s fair criticism that these laws are outdated, or provide loopholes that void privacy guarantees.
For a more detailed treatment, see WWS 586F / COS 586: Information Technology and Law.
[Oversimplified] Comparison of US and EU Approaches
The US has no comprehensive federal privacy statute, while the EU has the GDPR (adopted in 2016, in force since 2018), which replaced the 1995 Data Protection Directive.
The US has a market perspective, while the EU has a human rights perspective.
The US has relatively strong enforcement, while the EU’s enforcement is decentralized and somewhat inconsistent.
Discussion on Cambridge Analytica