Warning: liveblog, my impressions, not direct quotes, will link to actual presentation when available. Don’t shoot the messenger!

Our first session is the keynote address by the excellent James Grimmelmann, of New York Law School and more. He’s talking about Facebook as an example of social networking and privacy as an example of a controversial issue – but hoping to draw broader themes from that. The presentation draws upon his forthcoming Iowa Law Review article of the same name, with a draft available here. The sites we are interested in at this conference are ‘social’ network sites, and that is important – both for the positive and the negative aspects. To illustrate, he shows a video of ghost-riding the whip (hopping out of a car while it is still ‘driving’); it would be a category error to say that this is ‘caused’ by or attributable to the technology (the car), so we need to consider the social factors in order to discuss it. The same applies to social networking – right down to the Ghost Riding The Whip Association’s Facebook page (no, really) – and so we will look at the social assessment and harms in relation to privacy and Facebook.

(1) Why are these sites ‘social’ networks? These can be general, but also more specific (like Ravelry for knitting). The common theme (with assistance from boyd & Ellison) is profiles (about ‘I’ and identity), links (relationships – ‘you’) and the social graph (community – ‘them’). Grimmelmann shows various examples of all three (based on his own presence and that of others), drawing some good laughs with profiles of people in the audience and also the quirkier side of the social graph.

(2) It’s no surprise, then, that we use social skills/tools to assess the risks attached to social networking. Old contradictory adages like ‘you wouldn’t jump off a bridge if your friends did’ vs ‘safety in numbers’ remain present. You are alone, yet surrounded. What we believe (e.g. about whether we are being watched) may not match up well with the actual privacy risks. In a useful phrase, Grimmelmann describes these ‘cognitive shortcuts’ as heuristics rather than calculations.

(3) What are the harms, then? With assistance from Solove’s taxonomy, we are talking about
disclosure: information that you don’t want shared gets shared (photos, upsetting your employer, breaking up with your (un)beloved…and even law enforcement). Estimates suggest around five cases a week in which a court makes significant reference to something on a social networking site;
surveillance: the feeling of being watched, with the panopticon being a problem even if specific information is not ‘misused’; how does something like the Facebook News Feed relate to this?;
instability: things ‘change’, such as the opening up of certain information through Facebook public profiles, not to mention things like Beacon and new viruses;
disagreement: something like removing tags on photos is a possible disagreement that may not have occurred to you before. Another example would be the evolution of social norms around ‘defriending’ someone, or the ‘top 8’ feature on certain sites;
spillover: my decisions have consequences for your privacy. This becomes even more important as the number of users of Facebook (or other sites) increases – it is a different situation from when it’s a more limited group.
denigration: other users affect your self-presentation. Here we talk about Beacon too, where purchases show up in the news feeds of others. Your endorsement is given to a product, in many cases without your knowledge or genuine consent.

(4) Solutions? The important question is how proposed solutions match with social habits. If you argue for privacy policies, for example, the problem is that people don’t read them – or indeed end up less informed after reading the (lengthy) policy. Indeed, the policies tend to include very generous get-out clauses. Similarly, in the case of technical controls (e.g. Facebook privacy settings), only 1 in 5 users ever change their settings – and even for those who do, it’s hard to know in advance what a photo will consist of, so settings are by nature vague and crude, and subject to abuse by unreliable contacts. Data portability is of interest, based on market pressure (we’ll take our data and go) – but it can get you banned, and can violate the trust (and privacy) of others in any event.

(5) Three (disquieting) conclusions: it’s hard to partition good and bad behaviour, with the motivations, mistakes and harms deeply linked, all based on connection; privacy violations are often ‘peer-produced’ rather than the expected Government/corporate ‘watching us’ (that is there too, but it’s not the only thing); sites like Facebook are a ‘privacy virus’, in that we spread information and encourage others to act in a (possibly) risky way.

In Q&A, Miquel Peguera (and others) mention the recent A29 Working Party document on social networking (in PDF here); Grimmelmann gives a shout-out to Ian Brown and Lilian Edwards (not here, but now here in spirit!) for their proposals on default settings. In response to a ‘devil’s advocate’ point from Ismael (“privacy doesn’t matter”), Grimmelmann discusses the evolution of his own paper, moving from a position like this to an understanding of how users make choices and what the consequences of those choices are, or indeed how a choice does not deliver the expected or desired result. A questioner (didn’t catch the name) makes a thoughtful intervention on the psychological impact of information sharing – people acting in ways they wouldn’t have before.

My own question related to the surprisingly high-profile appointment of new lobbyists by Facebook and what approach it is likely to take with regard to possible legislation on relevant issues. Grimmelmann notes the high profile of privacy counsel Chris Kelly, observing that Facebook is ‘one of the most arrogant companies out there’ in terms of general statements about how its model is changing the world, but also very accommodating in terms of privacy – it admits mistakes, talks about enhancing user control, and takes the problem seriously. Turning to legislation, Facebook might exploit the situation and accept minimum legislative mandates for competitive reasons – or alternatively, oppose legislation and use (non-mandated) privacy controls as a competitive advantage.