I attended a session on ‘theory’ as I was one of the speakers! I don’t perceive my work as particularly theoretical, but it was a very interesting session and I’m glad that I ended up there. My paper, Expression 2.0: from known unknowns to unknown knowns, will be posted on the BILETA website and I’ll link it then; here are my slides (PDF). I spoke about the control of expression by social networking and web 2.0 hosts, and also by internet service providers, particularly through terms of use. In the second part of the paper, I looked at different approaches to freedom of expression, freedom of communication, and the relationship between human rights and private parties.

Next up was Hayley Hooper, who talked about new technologies and the enduring role of constitutional rights. Recent trends promote ‘liberal legalism’ and a particular, marketised approach to supranational constitutional development.

1. Model of legal constitutionalism; not a purely structural theory: outlined by Adam Tomkins – separate from politics, in the courtroom, control of government, etc. While these features sound reasonable, Hayley argued that they are normatively undesirable. There is a need to look at supranational entities, as they have more constitutional power and control over citizens. We haven’t moved beyond inherent bias in constitutions, despite what some may say. The bias that Hayley suggests in the EU context is market-based, e.g. the four (economic) freedoms. The judiciary are well suited to this model; dealing with socio-economic rights is difficult.

2. UK and the development of judicial review. Until Malone v Metropolitan Police Commissioner (telephone tapping), a human rights claim couldn’t be made (no legal or equitable claim); everything is permitted unless it is forbidden. But the judiciary became much more activist in ‘discovering’ constitutional rights in the common law … although the record isn’t consistent, and the incorporation of the ECHR hasn’t reinforced this trend. ProLife Alliance is an example, with due deference applied. So despite the upsurge, the judiciary have a particular mindset, and they won’t go into the more controversial areas.

3. Juristocracy. Recent developments in the UK show a trend towards this. Judicial, economic and political elites have converging/confluent interests, riding on the back of neo-liberal technological progress. What are the consequences? The US situation is interesting, especially the political debate re: Roe v Wade. The Government’s ideas reinforce the juristocracy trend; the 1997 election generated it and it’s been ongoing since then.

The discussion touched on Dworkin, Gearty, alternative approaches to judging and decision-making, the role of ‘governance’. And I enjoyed it a lot.

Finally, Martina Gillen (Oxford Brookes) spoke about developing a ‘Sociology of Law 2.0’. Our identity has always been ‘defined’ in a way by technology – even from the definition of homo sapiens. In Durkheim’s original classification, he focused on divisions of labour etc; but buried in it is a differentiation based on the use of technology – the elephant in the living room. It’s shaping what’s happening, but theoretically it’s poorly developed. Our tendency to want ‘legal certainty’ has given us a mechanical/scientific mindset. We (we as lawyers) view technology as something we should be ‘attracted’ to, and we want the analogy to apply to law.

We focus on ‘nodes’, but what are we missing? There’s also the proliferation of economic control and interests. “Just because it’s new, does that make it significant?” And Martina also mentioned the ‘cult of the shiny’, which is something I could rant about but won’t…

What would the new features be? The public sphere; multiple jurisdictions; intersections.

We then saw a very useful diagram about security models (I haven’t got the technological ability to reproduce it, but I’ll link to it when published), arguing that we should focus on what the user is actually doing! It seems trite, but it’s a key feature that has been ignored. We need to move beyond categories and consider normative change. Finally, Martina outlined a research method proposal, based on proxies, to study *what* people are doing on the Internet and *why*. The conclusion, then, is that we need to use the technology to get data.