Meta point: Managing the trade-off between privacy and safety, or between privacy and innovation.
The Wild West of Data Collection
The US government developed the Clipper Chip to provide backdoor access (access that would be granted under court order) to encrypted consumer electronics. Due to public criticism, the Clinton administration abandoned the project. In the wake of 9/11, the USA PATRIOT Act was passed, enabling sweeping programs to collect communications without judicial authorization. Snowden's 2013 leaks prompted some reforms, but by then the public had already been somewhat desensitized.
Although the government is responsible for security and operates under stricter data-handling laws, it faces more criticism than companies that collect the same data. This may be because companies provide direct, tangible benefits, e.g. access to information, connecting with friends, etc. Furthermore, while you can switch between companies, you can't easily switch your government.
Sometimes companies ship default privacy settings without considering the broader implications. For example, Google Buzz (2010) launched with users' contacts publicly visible by default. One user's abusive ex-husband was able to find comments she had posted to her boyfriend, revealing her current location and workplace.
A Digital Panopticon?
Bentham proposed the panopticon prison (a design that lets a watchman observe the prison's occupants without their knowing whether they are being watched) as a way to use fewer guards while increasing security through easy surveillance. In the 1970s, Foucault revisited the idea in the context of modern civilian surveillance, which amounts to social control.
The value of privacy is not to hide illicit behavior, but to have control over oneself and information about oneself, e.g. insurance companies not having our medical records, exercising freedoms like protesting without fear of reprisals, etc.
From the Panopticon to a Digital Blackout
E2E encryption in messaging apps forms a technical barrier, as opposed to a policy one, for law enforcement trying to read message contents. However, these same apps are also used by people with malicious intent!
Technology Alone Won’t Save Us
Differential privacy (DP) is a better alternative to PII-scrubbing: it enables beneficial aggregate analysis of data without compromising any individual's privacy. That said, information can still leak through repeated queries to a DP system, which is why real deployments track a cumulative privacy budget.
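The classic way to achieve DP for a counting query is the Laplace mechanism: add noise scaled to the query's sensitivity divided by epsilon. The sketch below is illustrative, not from the source; the function names and the toy records are my own.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon):
    """Epsilon-DP count of records matching `predicate`.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so Laplace noise with
    scale 1/epsilon suffices for epsilon-DP.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

This also illustrates the leakage caveat above: each answer consumes epsilon of privacy budget, and averaging many independent noisy answers to the same query would let an attacker recover the true count.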
We Can’t Count on the Market Either
Myth 1: The market can deliver a diversity of comparable products with different privacy settings.
Apple famously declined to install an iPhone backdoor for the FBI in the wake of the San Bernardino shooting. However, Apple is in the high-end product business, not the user-data monetization business, and can therefore afford to stand for privacy.
DuckDuckGo's privacy-sensitive search model has not won much of the search market. For example, before the EU's 2018 antitrust action, Google made it hard for Android users to avoid using Google Search.
A Privacy Paradox
Myth 2: People have informed preferences about the amount of privacy they want, and are able to act on these preferences.
Potential harms from giving up privacy are intangible and invisible to most people, and therefore weigh less on our minds.
Westin found that people consider themselves privacy fundamentalists but, in practice, behave as privacy pragmatists and/or privacy unconcerned, e.g. opting into frequent-buyer programs for discounts.
The lack of intrinsic opinions makes privacy preferences more context dependent, e.g. 9/11, Snowden, COVID-19.
Companies systematically capitalize on people’s amorphous privacy preferences, e.g. privacy-invasive defaults, dark patterns, incomprehensible ToS, etc.
Protecting Privacy for the Benefit of Society
Myth 3: We can achieve both our personal and societal goals via a set of decentralized decisions we each make about how much privacy we want.
Apple and Google built privacy features into their contact tracing APIs. However, the success of contact tracing ultimately rests on people’s willingness to enroll and update their health status.
We can't depend on malicious actors, e.g. terrorists, traffickers, etc., to voluntarily disclose information. We need rigorous legal and judicial processes to determine who may violate people's privacy, and with what legitimate authorization, when there is a credible threat to security.
Four Letters That Are Key to Your Privacy
GDPR (2016), partly buoyed by Snowden's leaks, differs from the US "Notice and Choice" approach. GDPR granted new digital rights to EU residents, e.g. companies must inform users of data collection, the purpose of the collection, and who has access; users can obtain, correct, erase, and transfer their data; users can stop companies from processing their data, etc. GDPR fines can be up to 4% of a company's global revenue.
California Consumer Privacy Act of 2018 (CCPA) gives CA consumers rights to know about data collected; to delete data (with some exceptions); to opt out of the sale of their data; to non-discrimination for exercising CCPA rights.
Desirable Outcomes of Privacy Laws That Serve Citizens
Treating privacy as a right, as opposed to a preference, to shift power from companies to citizens.
A commitment to more meaningful forms of consent, e.g. narrow and understandable ToSs, sensible defaults, privacy preferences that can be shared between platforms, etc.
A credible, legitimate government agency that has the power to preserve and protect the privacy rights defined by new legislation. The FTC is regarded as toothless in the Notice and Choice world.