Blog: Good news, bad news from the Citizenship Survey
Written by: Chris Game
You may have seen the headline in the LGC: ‘SURVEY SHOWS RECORD HIGH TRUST IN COUNCILS’ (lgcplus, September 23), and the explanatory opening sentence: ‘The last-ever government survey of public attitudes towards citizenship has found trust in local government at a record high.’
In which case, you may possibly have been reminded of the long-running Radio 4 programme, I’m Sorry I Haven’t a Clue, one of whose popular games takes the form of a potentially infinite extension of the classic ‘Good News, Bad News’ joke:
Good news: the Russians are putting a Briton into space;
Bad news: it's not Jeffrey Archer;
Good news: it’s Robert Maxwell;
Bad news: he's going to nobble the Sky satellite;
Good news: he'll succeed; and so on.
This week’s local government version might run along the lines of:
Good news: the latest Citizenship Survey shows that people’s trust in their local councils is the highest ever recorded;
Bad news: the Government is ending the Survey – it’s the last there’ll be;
Good news: yes, but the DCLG said in response to the consultation that key data can still be collected through other surveys;
Bad news: but the most obvious one, the Place Survey, they’re axing as well;
Good news: which means that in local government we can simply project the future trend from the last 10 years’ data, and no one can argue. A 12-percentage-point increase in trust between 2001 (52%) and 2011 (64%) means that, by 2041, 100% of people will trust their local councils;
Bad news: and roughly how many of these local councils do you reckon will still be around in 2041?
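The extrapolation in that penultimate line is, of course, nothing more sophisticated than a straight-line projection of the 2001–2011 trend. For anyone who wants to check the punchline's arithmetic, here is a minimal, purely illustrative sketch in Python (the function name and parameters are mine, not anything from the Survey):

```python
def project_trust(year, base_year=2001, base_pct=52, points_per_decade=12):
    """Project the trust-in-councils figure linearly, capped at 100%.

    Assumes the 12-percentage-point rise between 2001 (52%) and
    2011 (64%) continues indefinitely - the joke's premise, not a forecast.
    """
    decades = (year - base_year) / 10
    return min(100, base_pct + points_per_decade * decades)

print(project_trust(2011))  # 64.0 - matches the 2010-11 Survey figure
print(project_trust(2041))  # 100 - the punchline: universal trust by 2041
```

Needless to say, attitudinal trends do not extrapolate linearly, which is rather the point of the joke.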
OK, forget the apocalyptic last line. The Citizenship Survey – or Communities Study, as it is also known – really is going, and its passing is worth noting. It was launched in 2001 as the Home Office Citizenship Survey (HOCS) – an initially biennial social survey covering the areas of community cohesion, race and faith, volunteering and civil renewal, which at the time fell largely within the HO’s remit. In 2006, however, responsibility for these topics switched to DCLG and the Citizenship Survey (CS) followed, moving in 2007 to a continuous design, allowing the release of headline findings on a quarterly cycle. The national (England and Wales) sample is a large one: approximately 10,000 interviews (2,500 per quarter), plus 5,000 ‘boost’ interviews with ethnic minorities, and a separate 1,200 Muslim boost to ensure at least 3,000 Muslim respondents. The sampling, fieldwork, data processing and analysis are contracted out to Ipsos MORI and TNS-BMRB.
I’m no expert in these matters, but to me these specifications amount to what might in a research bid be termed ‘the gold standard’. Indeed, DCLG boasts as much in its methodological introduction to the 2010-11 CS, describing it as ‘a flagship survey for DCLG; used to measure performance as well as to inform and develop complex policy areas. The survey provides a wealth of information and data for a range of stakeholders across Government and the wider research community.’ What wasn’t mentioned here, but was in the Department’s response to its consultation on the CS’s future, was that the cost of providing this cornucopia of data to a substantially extra-departmental user community was an estimated £4 million per survey year – at a time when the Department was looking for a 40% cut in its administrative budget – http://www.communities.gov.uk/documents/communities/pdf/1866399.pdf.
It seems unlikely in the circumstances that the Secretary of State agonised unduly over this particular saving, and on this occasion at least one can see his point of view: all that stuff about quarterly monitoring of key indicators, and benefiting the wider research community, who, even in some of their consultation evidence, didn’t seem to ‘get’ what this Government is about. Arguing that a principal use of the survey data is to monitor the impacts of policies such as the Equalities Act or the well-being agenda is unlikely to cut much ice with a department that, as the civil servants’ response drily noted, ‘is moving away from costly top-down monitoring and measurement of policies and does not believe that the costs of the survey can be justified for these purposes’.
Other governments in other times might have been readier to consider moving to a silver or even bronze standard – in which case one question raised would have been about the added research value of the quarterly updating of statistics, many of which measure behaviours and attitudes that seem intrinsically unlikely to fluctuate greatly over such short time periods. In the final section of this blog, the main purpose of which is to use some of the 2010-11 headline findings to illustrate the CS’s major concerns, attention will also be drawn to this issue of stability or variability – http://www.communities.gov.uk/documents/statistics/pdf/1992885.pdf.
Community Action – influencing decisions
22% felt they could influence decisions affecting Britain – an increase on 2009-10 (20%), but ‘no clear trend across previous years’.
38% felt they could influence decisions in their local area – ‘unchanged [from] all previous years, apart from 2001 (44%)’.
44% said they would like to be more involved in Council decisions affecting their local area, 39% said they wouldn’t; and 18% said it would depend on the issue. The proportion wanting to be more involved was unchanged on 2009-10, but lower than in earlier years.
34% said they had engaged in civic participation at least once in the previous 12 months – contacting an elected representative, taking part in a public demonstration or protest, signing a petition. ‘This figure was unchanged on 2009-10 but lower than in any year before then (between 38% and 39%)’.
10% said they had engaged in civic activism in the previous 12 months – involved either in direct decision-making about services or issues, or in the actual provision of these services in a role such as councillor, school governor or magistrate. ‘This figure was unchanged on all previous years’.
Trust in institutions – see Figure below
36% trusted Parliament either ‘a lot’ or ‘a fair amount’; a rise since 2009-10 (29%), when the proportion dropped relative to all previous years (34%-38%).
64% trusted their local council, higher than in all previous years (52%-62%), in each of which the figure had risen (p.10).
Community spirit – cohesion
86% thought their community was cohesive – a place where people from different backgrounds got on well together. The figure had increased fractionally but consistently each survey year from 80% in 2003.
Satisfaction with local area
66% thought that their local area had not changed much over the past two years. 16% thought that the area had got better, and 18% thought it had got worse – the latter figure having declined steadily each year from 27% in 2007-08.
Attitudes to immigration
77% thought the number of immigrants coming to Britain should be reduced (a little/a lot); 3% thought the number should be increased. ‘There have been no clear patterns in trends over time’.
No clear patterns, that is, over the lifetime of the CS. But, as shown even in the small sample of findings above, the Survey has identified some significant and fairly unambiguous trends, in which, both as researchers and citizens, we can have a confidence that no collection of one-off surveys, however professionally conducted, can provide. It will unquestionably be missed. In a small tribute, then, we shall end as we began, with a good news trend – in fact, a couple of them:
64% said they were not very/not at all worried about becoming a victim of crime – a higher figure than in any previous survey year (since 2005). Over that period the proportion ‘not at all worried’ increased from 10% to 22%.
76% said they felt very/fairly safe walking alone in their neighbourhood after dark – again the highest figure of any survey year since 2001.
Read more of INLOGOV’s blog posts at http://inlogov.com