Yesterday Mark Zuckerberg of Facebook apologized to Facebook users after the uproar that has resulted from Facebook's latest feature, Beacon. The idea behind Beacon is to "help people share information with their friends about things they do on the web." That is, Beacon allows Facebook members to share information about their online activity, such as purchasing a book or posting a product review on a Facebook partner site, with others in their social network. Zuckerberg relates that this "simple idea" missed the "right balance" between not "get[ting] in people's way ... but also clear enough so people would be able to easily control what they shared." As a result, his apology notes that changes have been made, including Beacon's default behavior, which was switched from 'opt-out' to 'opt-in'.
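To make the opt-out versus opt-in distinction concrete, here is a minimal sketch of how that one default changes everything for a user who never touches their settings. The function and setting names are hypothetical, purely illustrative, and of course not Facebook's actual code:

```python
# Hypothetical sketch of opt-out vs. opt-in defaults;
# names are illustrative, not Facebook's actual code.

def should_broadcast(user_prefs, opt_in_default=True):
    """Decide whether to share a user's activity with their network.

    With an opt-in default, sharing happens only if the user has
    explicitly enabled it. With an opt-out default (Beacon v1),
    sharing happens unless the user has explicitly disabled it.
    """
    if opt_in_default:
        return user_prefs.get("share_activity", False)  # silence means private
    else:
        return user_prefs.get("share_activity", True)   # silence means shared

# A user who never touched the setting:
untouched = {}
assert should_broadcast(untouched, opt_in_default=True) is False
assert should_broadcast(untouched, opt_in_default=False) is True
```

The code is trivial, which is rather the point: the controversy wasn't about a complicated feature, but about which way a single default leaned for the vast majority of users who never open a settings page.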
While reading his post, in part because of a prompt from Sare, I realized I've heard this discussion before: it's the common view of 'security' vs. 'ease-of-use' that a lot of programmers hold. The similarity makes sense; after all, personal privacy, the what/where/when/how of sharing, is at its root an issue of security. Hence Zuckerberg's framing of the good/bad/ugly of Beacon version 1: that the two are diametrically opposed, that adding security complicates the user experience whereas removing security eases it, and that one needs to 'balance' the two at any given point in software development.
The thing is, I don't really buy that. Yes, it might always seem to be the case, but I think that has more to do with the fact that we programmers have painted ourselves into that corner by thinking of the two issues as polar opposites for some time now. Moreover, I think it becomes an issue of lazy programming, since we can say, "it's an either/or proposition, pick one and that's what we live with, since I can't/don't want to develop something different."
(For a more conventional, not to mention cynical, spin on Facebook's Beacon, check out Steven Levy's Do Real Friends Share Ads? article for Newsweek, in which Levy suggests that in the rush to maximize Microsoft's $240 million investment, Facebook didn't have its users' best interests in mind at all.)
An illustration of my point: a few weeks ago Bruce Schneier posted about a video showing how to circumvent a soda machine. The posting got me thinking about how, 'back in the day,' it was common knowledge in my high school that one could exploit the dollar bill readers by fooling them with one-sided, black-and-white photocopies. If I had to guess, based on the observed behavior, the readers simply cared about being given a piece of paper of a specific length and width that at some point matched the pattern for a one-dollar bill (some pattern that, I assume, was dissimilar enough from, say, a five-dollar bill). No color matching, no matching backside, etc. Today those same readers are more sophisticated, and that old 'trick' won't work. Yet the reader's 'user interface' is still the same: you orient the bill as pictured, slide the bill in, and at some point the reader grabs hold of it, either accepting or rejecting your offering. You, the user, don't have to do anything new, different, or complicated, yet the 'security' of the system is greatly enhanced. Sure, some readers can seem overly fussy and frustrating, but I've also seen readers that care little about the orientation of the bill, easing use without, I'm sure, exposing the machine to past vulnerabilities.
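My guess at the old readers' logic, inferred purely from that observed behavior, could be sketched like this. The checks and values are assumptions on my part (certainly not any vendor's actual firmware), but they show the point: the newer readers add checks behind the same user-facing step of feeding in a bill:

```python
# A guess at old vs. new dollar-bill validation, inferred from
# observed behavior; not any vendor's actual firmware.

def old_reader_accepts(bill):
    """Old reader: right size plus a one-sided pattern match.
    A one-sided black-and-white photocopy passes."""
    return (bill["length_mm"] == 156 and
            bill["width_mm"] == 66 and
            bill["front_pattern"] == "one_dollar")

def new_reader_accepts(bill):
    """New reader: same user interface (insert a bill), more checks
    behind it, so the old trick fails without any new burden on the user."""
    return (old_reader_accepts(bill) and
            bill["back_pattern"] == "one_dollar" and
            bill["has_color_features"])

photocopy = {"length_mm": 156, "width_mm": 66,
             "front_pattern": "one_dollar",
             "back_pattern": "blank",
             "has_color_features": False}

assert old_reader_accepts(photocopy)      # the old high-school trick
assert not new_reader_accepts(photocopy)  # same interface, stricter checks
```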
The soda machine issue also demonstrates another point about computer security, since the 'cracking' of the vending machine is an excellent example of how, ultimately, it is not about having the programming code in front of you so much as the behaviors, expected and unexpected, that the code details, which can cause a security issue. In the case of the soda machine video, someone discovered how to get the machine to 'fail' and then exploited that to their advantage. In the case of my high school reminiscence, it was about literally giving the machine what it expected. As Schneier notes, this is a simple enough exploit: no source code needed, just a little patience by an observer who determines what behaviors the machine expects by how it reacts.
A few months ago I tried to make the same observation about, ironically enough, a 'security breach' at Facebook, when some PHP source code got 'leaked' onto the Internet. It would seem the same can be said for Beacon: it's not the code itself that is the issue but, in this case, the expected behavior that becomes a possible security/privacy problem for the user.
The point, if there really is any here, is that on the surface computer security and personal privacy can look cut and dried: good or bad, usability or security, black or white. But, as with those old dollar bill readers, if you ignore the other side, read only in black and white, and look only for what you're expecting, you can get fooled fairly easily. Oh, and that teenagers have a logic all their own, since the thought of breaking federal counterfeiting law is worth the price of a 'free' soda.