
A Good American

Surveillance, Self-Censorship and Social Justice

This week I took part in a panel discussion and Q&A following a screening of the film A Good American, which tells the story of the individuals behind the ThinThread surveillance programme in the USA and how it was killed off by the NSA in favour of the more expensive, intrusive and ineffective Trailblazer programme. The film was incredibly interesting and educational, and I'd seriously recommend giving it a watch if you can. As someone relatively new to issues around mass surveillance, I found the film provided an engaging, easy-to-follow history of why and how mass surveillance functions, insight into the technology behind it, and a clear sense of the implications for people's privacy.

We were also honoured to be joined by Rebecca Vincent, UK Bureau Director for Reporters Without Borders, as well as Bill Binney, a former NSA Technical Director, and Kirk Wiebe, a former NSA Senior Analyst. Bill and Kirk featured heavily in the film itself and were two of the key individuals behind the ThinThread programme. Being able to ask them questions and hear their views on the UK and the implications of the Investigatory Powers Act was a real privilege, albeit in a very worrying context.

I was asked to talk about the implications of mass surveillance on freedom of expression and access to information, as well as the role of libraries and librarians in helping people protect their privacy. For once, I wrote a rough script! I’ve posted it below.


David McMenemy and I are currently working with Nik and Scottish PEN on a study of Scottish writers’ conceptions of surveillance and its potential impact on freedom of expression. This is a follow-up study to a survey conducted by American PEN and PEN International in other countries. PEN American Center (2013) says:

We know—historically, from writers and intellectuals in the Soviet Bloc, and contemporaneously from writers, thinkers, and artists in China, Iran, and elsewhere—that aggressive surveillance regimes limit discourse and distort the flow of information and ideas. But what about the new democratic surveillance states?

PEN’s original study gave participants the chance to discuss their concerns around surveillance, and the significant themes included writers’ self-censorship and fear that their communications would bring harm to themselves, their friends, and their sources.

They found that writers are self-censoring their work and their online activity due to their fears that commenting on, researching, or writing about certain issues will cause them harm. For example, writers reported self-censoring on subjects including military affairs, the Middle East and North Africa region, mass incarceration, drug policies, pornography, the Occupy movement, the study of certain languages, and criticism of the U.S. government.

The fear of surveillance, and doubt over the way in which the government intends to use the data it gathers, has prompted writers to change their behaviour in numerous ways that curtail their freedom of expression and restrict the free flow of information.

For example, significant numbers have:

  • Curtailed or avoided social media activities,
  • Deliberately avoided certain topics in phone or email conversations,
  • Avoided writing or speaking about a particular topic,
  • Refrained from conducting internet searches or visiting websites on topics that may be considered controversial or suspicious,
  • Taken extra steps to disguise or cover their digital footprints, or
  • Declined opportunities to meet or speak to people who might be deemed security threats by the government.

We have replicated this study in the Scottish context, and an initial look at the results shows very similar findings. We are seeing that writers are following news stories about government surveillance efforts within the UK, are worried about current levels of government surveillance of Britons, and have concerns about corporate and government surveillance.

The behaviour writers describe, in taking steps to protect themselves from becoming victims of the surveillance state, in many cases takes the form of self-censorship. They are simply not engaging with areas of intellectual and public life that they otherwise would.

Implications of self-censorship

One troubling aspect of self-censorship is that it is impossible to know what contributions to society are being lost because of it. PEN (2013) raises the important issue that “we will never know what books or articles may have been written that would have shaped the world’s thinking on a particular topic if they are not written because potential authors are afraid that their work would invite retribution”. We know that many writers, academics and members of society more widely, are hesitant to communicate their thoughts because of rational concerns around surveillance.

This has implications not only for culture, but for social justice and human rights.

Social justice and human rights

From a social justice perspective, surveillance creates a panoptical environment in which people's sense of being watched affects their everyday lives. People respond differently to these circumstances: some feel more secure and safe, others much less so. We simply do not know enough about the psychological impacts of living under highly surveilled circumstances (Tucker, Ellis and Harper 2016) to anticipate what effect they will have on people throughout the course of their lives. We do know that members of minority groups are more likely to be surveilled (Renderos 2016), adding to existing conditions of relative disadvantage and increased systemic violence and oppression. As Malkia Cyril states, "lawful democratic activism is being monitored illegally without a warrant", and encryption technologies offer vulnerable groups such as people of colour, immigrants, welfare recipients and political activists who challenge the status quo the ability to exercise their democratic rights more safely (Renderos 2016).


Avoiding mass surveillance is not a simple case of opting out of using certain resources. Even people using the most secure tools, which protect the content of communications (what is being said), cannot fully protect themselves from surveillance at the level of metadata (when, where and to whom it is said), which in itself reveals a great deal about what may have been said. Additionally, many people feel they cannot avoid the insecure means of communication that the majority of their networks and wider society use, if they want to avoid marginalisation. At the same time, many people simply do not comprehend the extent of the surveillance these technologies make possible, and so do not know the extent of the surveillance they are subject to. Whereas many of the participants in our self-censorship and surveillance survey described their awareness of surveillance and the steps they have taken to increase their security, writers are a relatively privileged group; members of society more widely do not have the knowledge and advantages that many of us do.
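The point about metadata is worth illustrating. The sketch below is entirely hypothetical (the records, field names and thresholds are all invented for illustration, not drawn from any real system), but it shows how even a tiny log of who contacted whom, and when, can be suggestive without a single word of message content:

```python
from collections import Counter

# Invented metadata records: no message content at all, only
# sender, recipient and the hour of contact -- the kind of data
# retained even when the content itself is encrypted.
metadata = [
    {"from": "writer", "to": "source_a", "hour": 23},
    {"from": "writer", "to": "source_a", "hour": 23},
    {"from": "writer", "to": "editor", "hour": 10},
    {"from": "writer", "to": "source_a", "hour": 2},
]

# Who does this person contact most often?
contacts = Counter(m["to"] for m in metadata)

# How many contacts happen late at night (10pm-5am)?
late_night = sum(1 for m in metadata if m["hour"] >= 22 or m["hour"] < 5)

print(contacts.most_common(1))  # repeated contact with one address
print(late_night)               # a pattern of late-night communication
```

Even from four records, an observer can see repeated late-night contact with a single address: a pattern that suggests a sensitive relationship, with no content needed at all.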

I think we therefore need to teach the public about surveillance: both to raise awareness of the fallacy that "if you have nothing to hide you have nothing to fear" (Coustick-Deal 2015) and to help people resist surveillance, by challenging policies and laws and by equipping themselves with the skills and resources to protect their privacy. There is increasing interest in this work from librarians who want to help their users protect their online security against both corporate and state surveillance. Scottish PEN has been working with the US-based Library Freedom Project to develop a toolkit for libraries so they can advise users on the software and practices they can employ to protect themselves. Libraries and groups like the Open Rights Group and Radical Librarians Collective have held cryptoparties to help people work through 'privacy checklists' covering encryption and other actions they can take.

We need to do more than this, however. As educators, librarians need to resist policies and interventions such as the Prevent initiative, which asks university and school staff to watch for the 'potential radicalisation' of students in their institutions. The Government has implemented training on how to spot 'radical ideologies' (including Islamic extremism and anti-capitalist agendas) and legally binds staff to report them to the authorities, who then have the right to question the student's friends and family, seize any and all of the student's academic work, and investigate other aspects of their public and private lives. For example, a student at Staffordshire University taking its Terrorism, Crime and Global Security course was questioned by university security after library staff reported him for reading a book about terrorism in the library. He subsequently withdrew from his course. This is one of many accounts of actions that Ali Milani (2016) describes as "creating and propagating a narrative of suspicion around an entire community".

With the rise of the surveillance state, these events are going to become more common, and have more of an impact on people’s rights to education, freedom of thought and freedom of expression. Even without the explicit removal of these rights, the oppressive systems of surveillance we are increasingly encountering will have extremely negative impacts on the universal rights of those who most need them.


Coustick-Deal, R. (2015). Responding to “Nothing to hide, Nothing to fear”. https://www.openrightsgroup.org/blog/2015/responding-to-nothing-to-hide-nothing-to-fear

Milani, A. (2016). Dear Owen Smith – Backing the Racist Prevent Strategy Won’t Win You This Election, It’ll Lose Labour Votes. Huffington Post Blog, 12th August 2016. http://www.huffingtonpost.co.uk/ali-milani/owen-smith-prevent-strategy_b_11468406.html

PEN American Center (2013). Chilling Effects: NSA Surveillance Drives U.S. Writers to Self-Censor.  https://pen.org/sites/default/files/Chilling%20Effects_PEN%20American.pdf

Renderos, S. (2016). To the next POTUS: For communities of colour, encryption is a civil right. TechCrunch, 6th May 2016. https://techcrunch.com/2016/05/06/to-the-next-potus-for-communities-of-color-encryption-is-a-civil-right/

Tucker, I., Ellis, D. and Harper, D. (2016). Experiencing the 'surveillance society'. The Psychologist, 29, pp. 682-685. https://thepsychologist.bps.org.uk/volume-29/september/experiencing-surveillance-society

[Image: Still from A Good American, Slingshot Films]

Content Filtering in Libraries

Happy New Year! Just a quick post on this poor neglected blog to signpost to some research done by some people involved in the Radical Librarians Collective on content filtering in public libraries.

The study sought to find out what filtering is in place within public libraries, because excessive filtering has the potential to act as a barrier to freedom of access to information. The team felt that although filtering is a very tricky topic and there are often good reasons for libraries to want to filter content, the methods used to do so may take a very broad-brush approach with the potential to do more harm than good. This builds on the MAIPLE study conducted by Loughborough University.

The research team used Freedom of Information requests to ask every local authority in the UK the following questions:

1. Do you employ the use of content filtering software on the PCs based in your libraries which are connected to the internet and intended for use by the users of your library?

If the answer to 1. is "yes", please:

2. Provide the name and annual cost of the content filtering software.

3. Provide a full list of the categories of websites blocked (e.g. "pornography, gambling, phishing etc."). If these differ according to the user profile accessing the PC (e.g. child, student, adult, staff etc.) please provide a full list of the categories of websites blocked for each user profile.

4. Confirm whether you also block specific URLs in addition to categories, and provide a complete list of these URLs.

5. Provide the relevant policy document or written documentation which outlines the procedure a user must follow in instances where they would like to gain access to a website that is blocked.

6. From January 2013 until the present day, please provide a list of the URLs to which users have requested access despite them being blocked by the content filtering software.

7. Of the list provided in 6., please detail which URLs access was granted for and which were denied.

Most local authorities provided information (although some refused). The data was collated and has now been published on figshare.
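For anyone curious about what analysing the collated responses might involve, here is a minimal sketch of the kind of aggregation the team could run. The record structure and field names below are invented for illustration; the actual figshare dataset is organised differently:

```python
from collections import Counter

# Hypothetical collated FOI responses, one record per local authority.
# "filters" answers question 1; "categories" answers question 3.
responses = [
    {"authority": "Authority A", "filters": True,
     "categories": ["pornography", "gambling", "phishing"]},
    {"authority": "Authority B", "filters": True,
     "categories": ["pornography", "phishing"]},
    {"authority": "Authority C", "filters": False,
     "categories": []},
]

def category_frequencies(records):
    """Count how many filtering authorities block each category."""
    counts = Counter()
    for record in records:
        if record["filters"]:
            counts.update(record["categories"])
    return counts

freqs = category_frequencies(responses)
print(freqs.most_common())
```

Tallies like this would show which blocking categories are most widespread across authorities, which is exactly the sort of trend analysis the team has in mind.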

The research team aims to analyse the key trends and write an article around them, as well as to present the work at the LILAC Conference in Dublin in March.

The data has been picked up by The Register and I was asked to talk to them about it for an article they published today.