Before fall quarter began and after finishing Boneh's crypto course on Coursera, I looked for other upcoming classes that might be interesting. I stumbled across one on Bitcoin and Cryptocurrencies, taught in part by Princeton professor Arvind Narayanan. I had to put the course aside for the quarter, but come break, I serially consumed the dozen or so lectures. Since I was looking for a topic to teach Spring 2016, and since I felt that this course could be readily adapted into an undergraduate seminar, I decided that cryptocurrencies would be a solid choice. Unfortunately, the online course made only the lectures and textbook publicly available, so after pruning the course down to the right size, I went to find the instructors' email addresses and ended up perusing their websites, especially Narayanan's, for quite some time.
An excellent introduction to Narayanan is the 2012 Wired article that dubbed him the "World's Most Wired Computer Scientist." I found him most notable because he exists at the locus of meaningful fields (privacy, security, and machine learning/data mining) and because of the approach he avows: solve significant, meaningful problems with a combination of technologically novel applications, widespread education, and public-policy advocacy. His article "What Happened to the Crypto Dream?" (Parts 1 and 2) is a welcome review of the cypherpunk community's failure to make a lasting impact and a call for cryptographers to return their focus (at least partially) to crypto-for-privacy, away from crypto-for-security (or, as Rogaway suggests, crypto-for-crypto). The front page of his website prominently features Princeton's Web Transparency and Accountability Project, which led me to watch his seminar overview of the project, titled "Ending the Online Panopticon."
Ending the Online Panopticon
For those unfamiliar with "panopticon," the word refers to a hypothetical surveillance facility proposed in the late 18th century by Jeremy Bentham, in which a prison warden can observe prisoners without the prisoners knowing when they are being observed; they know only that they can be observed at any time. As Wikipedia explains, "Although it is physically impossible for the single watchman to observe all cells at once, the fact that the inmates cannot know when they are being watched means that all inmates must act as though they are watched at all times, effectively controlling their own behaviour constantly." The word is frequently invoked in discussions of mass surveillance, as it is an apt analogy that rightfully evokes the disapproval humans share for the "reductive, mechanistic and inhumane approach to human lives." Wikipedia has a list of famous literary works that invoke the panopticon, almost all negatively; perhaps the most famous is Foucault's Discipline and Punish, which argues (in some sense) that the panopticon is a symbol of power and the suppression of dissent. (Fun fact: the telescreens in Orwell's 1984 were based on the panopticon.)
Narayanan's talk begins by telling the tale of how he sought an answer to the question of what he could do to bring transparency, and ultimately greater accountability, to web tracking. He terms this the "online panopticon," a label I disagree with, as his talk focuses entirely on private-industry web tracking for economic profit and ignores the deeply political nature of the panopticon. He discusses the success of previous attempts, such as Do Not Track and the Wall Street Journal's What They Know (investigative journalism that revealed to consumers how their usage of web services was being tracked). He argues that What They Know had the most impact on web services by motivating consumers to exert pressure. The presumption is that these consumers care enough to take the time to learn what their web services are doing with the information they collect and are savvy enough to form concrete, specific, realistic demands when protesting or negotiating, which I find doubtful. Just recently, Snapchat updated its user agreement, granting itself "a perpetual licence to view, host, publish, display, promote and adapt the content in any way the app chooses to do so." After asking my three roommates (prolific Snapchat users) whether they planned to change their behavior, they all said no. A paper that Narayanan references in the aforementioned crypto-dream article makes the point that "it is unrealistic to expect individual rationality in this context. Models of self-control problems and immediate gratification offer more realistic descriptions of the decision process and are more consistent with currently available data." Granted, having information on what data companies collect and how they use it is valuable, and Narayanan notes that non-automated attempts to gather that information (such as What They Know) will ultimately fall behind; this motivated the creation of WebTAP.
He goes on to cover some of the difficulties his team faced and invites others to collaborate on the open-source project, which you're welcome to explore on your own time. Suffice it to say, I'm quite impressed.