
To cite this article: Haubrich, Dirk and McLean, Iain (2006) 'Evaluating the Performance of Local Government', Policy Studies, 27(4), 271-293. DOI: 10.1080/01442870601009939



EVALUATING THE PERFORMANCE OF LOCAL GOVERNMENT
A comparison of the assessment regimes in England, Scotland and Wales
Dirk Haubrich and Iain McLean

Compared with most other industrialised nations, the UK government places the greatest weight on performance assessments of local authorities as a tool to ensure high levels of public service standards and efficiency of public spending in areas such as education, social services, housing, culture, and benefits administration. Annual comprehensive performance assessments (CPA), published by the Audit Commission for England, are now an integral part of the central-local government nexus. By contrast, Wales and Scotland have embarked on different routes in the post-devolution era and have developed assessment frameworks that are much less prescriptive, less intrusive, and more reliant on self-assessment. Drawing on 20 semi-structured elite interviews with auditors, auditees, and other stakeholders in the three nations, this article evaluates the lessons learned from the respective assessment regimes. It also critically assesses the plans to replace the English CPA system from 2008 onwards with a regime that emulates the Wales- and Scotland-style self-assessments carried out by auditees themselves.

Introduction
The UK government places ever-greater weight on performance assessment of public bodies as a tool to ensure high levels of public service standards and efficiency of public spending. Top-Down Performance Management has become one of the four main components of the government's model for public service reform (with market incentives, capability and capacity improvement, and users shaping services bottom-up being the other three). Assessment of the service performance of local government in England, in particular, has reached an unprecedented level of sophistication, complexity, formal structure, and prescription. Using hundreds of performance indicators and a multitude of inspection and audit judgements, the Audit Commission publishes comprehensive performance assessment (CPA) ratings for all local authorities in England (counties, districts and unitary authorities) that aim to describe how effectively central governmental funds have been used. Under the heading of 'earned autonomy', the assessments result in certain freedoms and flexibilities being granted to authorities, provided satisfactory rating levels have been achieved.

The UK government oversees local government in England, yet it does not do so in Scotland and Wales, where this is the responsibility of the devolved administrations. It comes as no surprise, then, that both countries have gone down different routes in the post-devolution era, and the potential for differentiation within Britain is accordingly high. Seven years of devolution, four successive annual CPA rounds in England, and assessment regimes in Wales and Scotland that are also firmly in place make a comparative evaluation of these assessment frameworks particularly timely. Have they achieved the improvements in service delivery for which they were initially conceived? Do the benefits derived from the schemes justify the costs of running them? After all, monitoring local government in England alone currently costs £2.5 billion, which includes an estimated £700 million for the existing service inspectorates, but excludes the costs accruing to local authorities themselves (Economist, 2006; Improvement and Development Agency, 2006, p. 14). Local government in England spends £120 billion per annum (Office of the Deputy Prime Minister [ODPM], 2005, p. 21), which means that the cost of auditing local government amounts to 2.1 per cent of the expenditure audited. Which lessons can the three nations learn from their respective neighbours' failures and successes? And how viable are recent proposals to replace CPA in England from 2008 onwards with a scheme that relies much more on authorities' self-assessment, similar to the regimes already in place in Scotland and Wales?

This article uses a mix of qualitative and quantitative research data to develop some answers to these questions. The quantitative data consist of the CPA scores attained by all upper-tier and single-tier authorities in England that were publicly available in late 2005 (comprising the first three CPA rounds, carried out in 2002, 2003, and 2004). The qualitative data, in turn, are based on transcripts produced from 20 semi-structured elite interviews held with auditors, auditees and other stakeholders throughout Britain. These took place in Wales, with the Wales Audit Office (WAO), the Welsh Assembly Government, the Welsh Local Government Association, and Chief Executives and/or Heads of Performance of four local authorities; in Scotland, with Audit Scotland, the Scottish Executive, the Scottish Improvement Service, the Convention of Scottish Local Authorities (COSLA), and Heads of Performance and/or Chief Executives of two local authorities; and in England, with the Audit Commission and Chief Executives/Heads of Performance of six local authorities.

The argument unfolds across four sections. The first three outline the assessment approaches adopted in England, Scotland, and Wales, respectively, based on publicly available documentation and the views expressed during our interviews. The final substantive section then draws out the comparative advantages and disadvantages of the three assessment approaches and, using the quantitative data from the English CPA results in 2002-2004, appraises recent proposals (discussed favourably by auditors and auditees alike) to substitute or alter the English CPA regime so as to allow local authorities, from 2008 onwards, to self-assess their performance. Before doing so, it is worth rehearsing briefly the structure of local government in Britain, in order to provide some context for the discussion that follows.
In the 1990s, the view of central government was that the two-tier model of local services provision that had existed for nearly two decades (whereby county and district councils assumed split responsibility for services) was inefficient and confusing, and that the county councils were too remote from those they served. For this and other reasons the government decided that county councils should be abolished and their functions transferred to district councils, with some of the smaller districts being merged. In Scotland and Wales this is exactly what was done. In England, however, there was a process of local consultation which led to the single-tier model being implemented in some places but rejected (quite frequently) in others. Where single-tier councils were implemented, they were called unitary authorities. Table 1 illustrates that, although the number of authorities was reduced from 520 to 441, the resultant structure of local government became anything but clear.

Unitary authorities are responsible for the complete range of services, from education and social welfare to environmental protection, housing, and leisure. In the two-tier system, in turn, the functions are split between the county councils, which provide education, welfare and environmental protection, and the non-metropolitan (shire) districts, which provide housing, waste collection and leisure. This structure was kept in place after the Labour party gained office in 1997, and has remained largely unchanged ever since (Stoker, 2005, p. 31).

Under the system of devolution that was formalised in Britain in 1999, two constituent countries within the United Kingdom, Scotland and Wales, voted for limited self-government, nominally subject to modifications introduced by the UK Parliament in Westminster. One consequence of this process has been that central government responsibility and financial support for local government falls in England to the Office of the Deputy Prime Minister, in Scotland to the Scottish Executive, and in Wales to the Welsh Assembly Local Government Group. These institutions, all of which we visited for interviews, are responsible for devising the assessment frameworks used to audit the authorities located in their respective jurisdictions.

The comparison made in this article, then, is between the English local authorities that became subject to the CPA regime in 2002 (marked as groups a, b, c, and f in Table 1), on the one hand, and their counterparts in Wales and Scotland (d and e, respectively, in Table 1), on the other. By contrast, group g (the English non-metropolitan district councils) was subjected to a different CPA exercise that was not implemented until 2005. These authorities are therefore not included in this analysis (for details, see Audit Commission, 2005).


TABLE 1 The structure of elected local government in Britain post-1997

Single-tier authorities
  (a) 46 English unitary councils
  (b) 36 English metropolitan districts
  (c) 32 London boroughs
  (d) 22 Welsh councils
  (e) 32 Scottish councils

Two-tier authorities (with split functions)
  (f) 34 English county councils
  (g) 238 English non-metropolitan districts

(Excludes the Isles of Scilly and the Corporation of London.)

England

England is divided into 388 local authorities, the sizes of which vary considerably: the most populated unitary authority area is Birmingham (a metropolitan district) with 980,000 inhabitants, and the least populated is Rutland with 35,000. These are outliers, however, and most unitary authorities have a population in the range 150,000-300,000.



FIGURE 1 The CPA performance management framework (Audit Commission, 2002). [Figure: inspection judgements, auditor judgements, performance indicators, and governmental assessments of councils' plans feed into 'current performance on services' (education, social care, housing, benefits, environment, libraries & leisure, use of resources); this is combined with 'ability to improve' (self-assessment and corporate assessment) to yield an overall assessment of excellent, good, fair, weak or poor, followed by public reporting and improvement planning.]

In 2000, the UK Parliament passed the Local Government Act 2000 to require councils to move to an executive-based system, either with the council leader and a cabinet acting as an executive authority, or with a directly elected mayor. The council leader cannot do the work of the council alone, and so is responsible for the appointment and oversight of officers, to whom most tasks are delegated. To lead these organisations, councils also need to appoint a Chief Executive Officer with overall responsibility for council employees, who operates in conjunction with department heads. Employing 2.1 million people, local authorities are among the largest employers in England and undertake an estimated 700 different functions.

These functions are assessed with a performance management system that was conceived in 2001 by the ODPM, which was in charge of local government at the time, and the Audit Commission. The Audit Commission had been founded in 1983 to perform the then unprecedented task of checking not only that public money is spent for authorised purposes, but also that it is spent effectively and efficiently. With its 2,500 employees, the Commission today is the regulatory arm of central government, charged with responsibilities that range well beyond traditional audits and encompass comprehensive reviews of the performance of local authorities, a duty confirmed most recently in the Local Government Act 1999 (HMSO, 1999). The Commission's formal governing board is made up of several Commissioners and a Chairman, who are appointed by the department responsible for local government (the ODPM, changed in 2006 to the Department for Communities and Local Government, DCLG), following consultation with key stakeholders.

In a 2001 government White Paper, an assessment framework to that effect was first formally proposed to the public, and consultations commenced with the aim of devising a final framework (Audit Commission, 2001). The stated objective of what has since been dubbed the CPA was, and continues to be, the targeting of support at those councils that need it most, as well as the granting of freedoms and flexibilities to the better-performing authorities. These incentives entailed: (a) the elimination of ring-fencing from most central governmental grants to the local authority, which allows the latter to spend money on those areas it (rather than central government) deems most appropriate; (b) a three-year exemption from subsequent audit inspections; (c) an exemption from having a cap imposed on the authority's planned expenditure and the level of council tax it is allowed to raise from its taxpayers; and (d) the freedom not to have to submit detailed service plans to central government for approval.

The resultant CPA framework was aimed at measuring the effectiveness of councils in terms of the way they provide services to local people and work in partnership with other local authorities, private corporations, and the voluntary sector, among others. In so doing, it focused on the leadership, the systems, and the processes, as well as the performance of those services. As Figure 1 depicts, the framework comprises two distinct elements: current performance and ability to improve in the future, with numerous transformations and categorisations applied in the process (for a more detailed exploration, see Haubrich and McLean, 2006, pp. 93-8).


Current Performance
Much of the evidence on service performance gathered in this exercise had already been publicly available; the CPA brought it together in one place for the first time. The service areas (or service blocks) assessed are the six core services of education, social care for children, social care for adults, housing, environment, and libraries and leisure facilities. Where available, performance is assessed through already existing judgements from inspectorates and auditors, such as those by Ofsted and the DfES for the education service block. Numerous categorisations and conversions are applied to summarise more than 1,000 performance indicators and auditor judgements, with the final result that localities obtain a score between one and four for each of the service blocks (with one being the lowest, four the highest).

The scores are then weighted so that the scores for education and social services count four times, and housing and environmental services count twice, with the remaining blocks counting only once. They are then added up so as to produce a performance score of between 15 and 60 points, or between 12 and 48 points for county councils, as they do not provide (and are therefore not assessed on) housing or benefits services. The performance scores are finally categorised so as to produce a performance rating of between one and four for each authority (see Table 2; a stylised version of the calculation is sketched after the table).



TABLE 2 The conversion of CPA assessment scores to category scores

Category score   Performance score (group f)   Performance score (groups a, b, c)
1                Less than 24 points           Less than 30 points
2                24 to 29 points               30 to 37 points
3                30 to 36 points               38 to 45 points
4                More than 36 points           More than 45 points
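To make the conversion concrete, the following Python sketch reproduces the weighting and categorisation just described. The per-block weights are our reconstruction from the prose (education and social services counting four times, housing and environment twice, the remaining blocks once), chosen so that block scores of one to four sum to the stated 15-60 point range; the Audit Commission's actual rules contain further refinements, so this is an illustration rather than the official algorithm.

```python
# A minimal sketch of the CPA "current performance" calculation described
# above. The per-block weights below are an assumption reconstructed from
# the prose (education and social services x4, housing and environment x2,
# remaining blocks x1) so that block scores of 1-4 sum to the stated
# 15-60 point range; the official rules are more detailed.

WEIGHTS = {
    "education": 4,
    "social_care": 4,
    "housing": 2,            # not provided by county councils (group f)
    "environment": 2,
    "libraries_leisure": 1,
    "benefits": 1,           # not provided by county councils (group f)
    "use_of_resources": 1,
}

def performance_score(block_scores, county=False):
    """Weight and sum the per-block scores of 1 (lowest) to 4 (highest)."""
    total = 0
    for block, weight in WEIGHTS.items():
        if county and block in ("housing", "benefits"):
            continue  # county councils are not assessed on these blocks
        total += weight * block_scores[block]
    return total  # 15-60 points (12-48 for county councils)

def category_score(points, county=False):
    """Convert a performance score into a 1-4 category per Table 2."""
    bands = [(23, 1), (29, 2), (36, 3)] if county else [(29, 1), (37, 2), (45, 3)]
    for upper, category in bands:
        if points <= upper:
            return category
    return 4

# Example: a unitary authority scoring 3 on every block obtains
# 3 * 15 = 45 points, which Table 2 places in category 3.
scores = {block: 3 for block in WEIGHTS}
print(performance_score(scores), category_score(performance_score(scores)))  # 45 3
```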


Ability to Improve
A second assessment concentrates on a council's plans to improve services in the future, and itself consists of two components: first, a self-assessment, in which authorities have to answer four questions as to what they were trying to achieve, how they had set about delivering their priorities, what they had achieved so far, and what they planned to do next; and second, an external corporate assessment carried out by a small team that includes an auditor, an inspector, and officers and members from peer councils, all of whom are tasked with checking the statements made by authorities in the self-assessment. A total of nine areas are assessed in this way, each of which obtains a score of between one and four (with one being the lowest, four the highest). These are then weighted in order to produce an overall corporate assessment score that ranges between 12 and 48 points, before it undergoes yet another conversion from continuous to categorical data, again ranging between one and four. The outcome of this second assessment is a report that highlights the council's strengths and weaknesses, and that produces a score and rating for its ability to improve on a four-point scale.

The two assessment ratings (on current performance and ability to improve) are then combined to produce the overall assessment on a five-point scale comprising the denominations excellent, good, fair, weak or poor (which in late 2005 were replaced by NHS-style 0-star to 4-star denominations). Final CPA ratings are not, however, a product of simple arithmetic: additional minimum thresholds are applied in the calculations to account for cases where one of a council's ratings deviates significantly from the other. Table 3 provides an overview of the resulting score matrix that determines a council's eventual CPA rating; a stylised version of the lookup is sketched after the table.



TABLE 3 Final CPA ratings

Council's performance           Council's ability to improve
score on core services      1          2          3           4
1                           Poor       Poor       Weak        N/A
2                           Poor       Weak       Fair        Good
3                           Weak       Fair       Good        Excellent
4                           N/A        Good       Excellent   Excellent
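A minimal sketch of the Table 3 lookup follows, assuming the row/column reading shown above (rows index the performance category, columns the ability-to-improve category); the 'N/A' cells correspond to combinations that the minimum-threshold rules prevent from occurring.

```python
# Stylised lookup of the final CPA rating (Table 3). Rows index the 1-4
# current-performance category, columns the 1-4 ability-to-improve
# category; "N/A" marks combinations ruled out by the minimum thresholds.
FINAL_RATING = [
    ["Poor", "Poor", "Weak",      "N/A"],
    ["Poor", "Weak", "Fair",      "Good"],
    ["Weak", "Fair", "Good",      "Excellent"],
    ["N/A",  "Good", "Excellent", "Excellent"],
]

def cpa_rating(performance, ability):
    """Combine two 1-4 category scores into the overall CPA rating."""
    return FINAL_RATING[performance - 1][ability - 1]

# Example: category 3 on core services and 4 on ability to improve
# yields an "Excellent" overall rating.
print(cpa_rating(3, 4))  # -> Excellent
```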

After three successive annual CPA rounds in 2002, 2003, and 2004, a mixed picture emerges as regards the scheme's achievements. CPA has helped to sort out some very poorly performing authorities through the setting of bottom-line minimum standards, in-depth guidance, additional monitoring boards for failing authorities, and peer counselling. Despite the increased attention and the influx of inspection teams (one poor authority we visited told of visits from seven different inspectorates and auditors within a six-month period), these measures were regarded as necessary steps to get the authority to a state it otherwise would not have reached.

The publication of assessment results and league tables has raised the stakes for councils and has increased the pressure of accountability. CPA has spurred the implementation of new performance management frameworks within authorities and the development of new strategic plans. It has also led to a refocusing of attention on organisational structure, systems, and processes (although apparently at the expense of a focus on outcomes and user satisfaction). Among our interview partners, these assessments were shared by both auditors and auditees. CPA has also introduced a new focus for the work done by authorities' members of staff. The impact has not necessarily been greatest among top managers; rather, an unexpected benefit of CPA has been that more junior and frontline members of staff have been motivated by it, as they identify with the CPA ratings their employer achieves.

Even so, our interview partners identified various areas where the assessment regime requires improvement. For one, the validity of some indicators used in CPA was questioned. For example, the indicator 'number of visits to the library' in the leisure services block was said to say little about an authority's performance if the library's stated mission is to provide as much information as possible via the Internet. Similarly, the indicator 'number of hostels for the homeless' in the housing block is of only limited value if it is the authority's stated mission to have no hostels at all (but instead to provide the homeless with proper temporary accommodation).

The issue of perverse incentives with regard to indicators was also raised. For example, in the education service block, headmasters are assessed inter alia against the attendance rate their school achieves. This produces the perverse incentive for them to close the school altogether on days of very bad weather, given the low attendance rates to be expected otherwise. Indicators, so the interviewees stated, should not be abolished altogether, but used in different ways. Whereas indicators themselves are seen as beneficial, the setting of targets for each of them is not, in line with a dictum coined by Charles Goodhart, the former chief economist of the Bank of England, that 'when a measure becomes a target, it ceases to be a good measure'.

A phenomenon related to the validity of indicators is a type of regulatory failure frequently referred to in the public policy literature. Interviewees in local authorities admitted to what could be called procedural compliance: going through the motions of producing the documents and data necessary to satisfy the procedural requirements for attaining a higher score on a specific indicator, without actually improving the quality of the underlying service. Power (1997) referred to this phenomenon as a decoupling of formal regulatory processes from substantive organisational processes. Under CPA, authorities are focused on improving the CPA rating on each single indicator, which provides them with an incentive to do as little as possible to move beyond the threshold of the next rating category on that indicator. During the interviews it became obvious that this approach is not necessarily available to all authorities: those that are clearly failing will not be able to get away with superficial changes only, because the discrepancies in service delivery tend to be so obvious (and the outside scrutiny so intense) that the regulatee can do nothing but ensure real service improvements. Yet it appears to be a frequently used option for authorities that are more or less settled (i.e. that want to move up from their present fair or good category ratings).

The interviewees in local authorities also criticised that the initial aim of earned autonomy, which central government had heralded at the outset as an incentive for authorities to perform well, has slipped off the agenda: when authorities succeed in improving their performance ratings, no greater autonomy or lighter-touch inspection has been granted in return. Some interviewees attributed this to the fact that the assessment results are influenced not only by an authority's real service performance, but also by the views that central government officials and auditors have formed over many years prior to (and irrespective of) any of the recent CPA exercises. These preconceptions are persistent and do not change merely on account of one year's improved CPA rating. For that to happen in the long run, so one interviewee stated, an authority's Chief Executive and Council Leader need to make sure they exploit every networking opportunity available that would allow them to portray their authority before officials as one 'with which one can do business'.

At times, CPA ratings and scores were regarded as not accurately reflecting an authority's performance, because it was felt that many good ideas could be learned from the badly performing authorities. One interview partner stated that a neighbouring authority enjoyed beacon status for its performance on race relations, but was categorised as weak overall. Another authority was leading the way on (and was attracting visiting parties from other authorities for) its work on business planning processes, but was rated fair overall. CPA ratings therefore do not appear to capture how progressive an authority is or to what extent it is a learning organisation, characteristics deemed crucial by many policy-makers.

Other concerns raised during the interviews referred to the fact that control over some service blocks, such as education, is increasingly taken out of authorities' hands yet attains increasing weight in the CPA exercise. The assessment exercise, so the criticism goes, should also be based more on self-regulation and be less bureaucratic. The suggestion was that the Audit Commission and other inspectorates should check the extent to which an authority regulates itself effectively, rather than regulating it directly, and that they should accept the fact that party politics may have a bearing on the priorities assigned to the various service blocks.


Scotland
Scotland is subject to the administration of both the UK Government in Westminster and the Scottish Executive in Edinburgh. The UK Government has responsibility for issues such as constitutional matters, foreign policy, and defence, whereas the remit of the Scottish Executive includes matters such as health, education, law and local government. As a result, the programmes of legislation enacted by the Scottish Parliament in this latter set of policy domains have diverged from those in the rest of the United Kingdom, with the costs of university education and of care services for the elderly, both free at the point of use, being the most frequently cited examples.

The 1994 Local Government (Scotland) Act led to the abolition of the previous structure of nine regions and 53 districts, although the three island councils remained. Since then, Scotland has been divided into 32 units that are unitary administrations with responsibility for all areas of local government (see Table 1). Akin to England, each local authority has a Chief Executive and is governed by a council of councillors, who are elected every four years. The City of Glasgow is the largest authority, with more than 600,000 inhabitants, and Orkney the smallest, with fewer than 20,000. Scottish councils cooperate through, and are represented collectively by, COSLA. Today, local government in Scotland employs some 270,000 full-time-equivalent staff.

Scotland's framework for assessing local authorities' performance, called the Best Value Audit, is distinct from England's in several respects. To start with the institutional set-up, assessments are, unlike in England, the responsibility of three different audit bodies: Audit Scotland, the Auditor General, and the Accounts Commission. The duty of the Accounts Commission, first, is to check that public money used by local authorities is spent properly, efficiently, and effectively. The Accounts Commission is an executive Non-Departmental Public Body, independent of the bodies audited and the government of the day. The Commission was established in September 1974 and, since the introduction of the Public Finance and Accountability (Scotland) Act 2000 and the transfer of its staff to Audit Scotland (see below), is now just a Board that is serviced by Audit Scotland. The independence of the Commission is emphasised by the fact that it does not receive governmental grants; instead, its activities are funded by contributions made by audited bodies under regulations made by the Secretary of State for Scotland. The Commission is not a Crown body and its employees are not civil servants. It works within a statutory framework, for which the Secretary of State is responsible. Members of the Commission serve in their personal capacity, and not as nominees or representatives of any interest group. By statute, its six to 12 lay members are appointed by Scottish Executive Ministers.

Collectively, the Commission has powers to report and make recommendations to the organisations it scrutinises, to hold hearings, to report and make recommendations to Scottish Executive Ministers, and to take action against councillors and council officials should their negligence or misconduct lead to money being lost or to the law being broken. With such tribunal powers, which in extremis could even lead to the abolition of a council, the Accounts Commission in Scotland has wider duties than the Audit Commission in England. Yet various parties during our interviews in Scotland expressed the view that these intervention powers are not clearly defined and thus fail to act as a decisive deterrent against performing badly and receiving bad audit reports as a result.
One interviewee recalled from the parliamentary debates that preceded and prepared the Local Government in Scotland Act 2003 that Parliament was clearly not interested in granting the Executive or Ministers too far-reaching intervention powers, a claim supported by the observation that, unlike in England, most Members of the Scottish Parliament have prior experience in local government. As a result, ministers have no control over the audits, and heavy-handed (albeit vaguely specified) intervention is an option only once all other steps are exhausted.

The Auditor General (short for Comptroller and Auditor General), in turn, is appointed by the Crown on the joint recommendation of the Prime Minister and the Chairman of the Committee of Public Accounts, who is, by convention, an Opposition MP. He is independent; accountable to the Scottish Parliament; an Officer of the House of Commons; can be removed from office only by the Queen on an address from both Houses of Parliament; and his salary is paid directly from the Consolidated Fund, without requiring the annual approval of the Executive or Parliament. He reports directly to Parliament, and his reports are considered by the Committee of Public Accounts, which uses them as a basis for examining the performance of senior civil servants. The main components and duties of audit and budgetary control are similar between the Accounts Commission and the Auditor General, although the former has responsibility for the 32 local authorities, while the latter has the additional remit of local NHS trusts and Health Boards, the departments of the Scottish Executive, and the colleges of further education (Audit Scotland, 2004). The essential difference lies in the lines of accountability and the extent to which the bodies are associated with, and report to, either the Parliament (in the case of the Auditor General) or the Executive (the Accounts Commission) (Audit Scotland, 2006a).

Finally, the role of Audit Scotland is to provide the two aforementioned bodies with the services they need to carry out their respective duties. It is an independent entity responsible for holding to account nearly 200 public bodies in Scotland. Two-thirds of the audits are carried out by Audit Scotland staff, with the remainder assigned to private auditors.1 Audit Scotland emerged from a predecessor with UK-wide jurisdiction, the National Audit Office.

The 10 local authority functions that are assessed by these three bodies deviate slightly from the seven service blocks audited in England. They cover social work, benefits administration, cultural and community services, education and children's services, corporate management, development services, housing, protective services, roads and lighting, and waste management. Together, they consume an annual local government budget worth £14.5 billion, which is financed primarily by central government grants of £10.1 billion (in the form of Revenue Support Grants of £5.3 billion, redistributed non-domestic rate income of £2 billion, and dedicated grants of £2.8 billion), council tax revenues worth £2 billion, housing rents of £0.9 billion, and income from sales, fees and charges of £1.5 billion (Audit Scotland, 2006b, p. 11). Unlike in England, central grants in Scotland are not ring-fenced, nor does the grant amount vary depending on the audit results (as is the case with the policy agreements in Wales; see below).

The greatest departure from England is found in the assessment regime that the three audit bodies have devised since the Local Government in Scotland Act 2003 came into effect. The framework is much less prescriptive than the English approach and is based predominantly on flexible self-assessments carried out by the authorities themselves.
Called the Best Value Audit, the assessment approach is based on criteria set out by the Scottish Executive (Scottish Executive, 2005). The choice of indicators (which tend to measure processes, not outcomes) appears to be a task reserved for the Scottish Executive, whereas the description of the methodology by which they are to be collected is the responsibility of Audit Scotland. This somewhat shared and overlapping responsibility has caused confusion at various junctures, and it is exacerbated by the fact that Audit Scotland does not have the capacity to validate how the data are eventually collected. No league tables or composite rankings are published, and comparisons are made on the basis of 46 Statutory Performance Indicators (SPIs) across the 10 service areas, rather than the 1,000 or so indicators collected in England.2 The focus, then, lies not so much on the indicators themselves, but on whether authorities are able to show (a) that policies are in place, (b) that policies are firmly rooted in what the citizens living in the authority want, (c) what outcomes to expect from the policies, (d) how to measure those outcomes, and (e) what the policies cost.

At the time of writing, in spring 2006, 12 of the 32 Scottish authorities had been subjected to this type of Best Value Audit, which is carried out on a three-year rolling cycle. So far, none of the audits has been challenged or contested by the authorities concerned. The view expressed by all interview partners, among both auditors and auditees, is that the small number of local authorities in Scotland allows the relationship between auditor, local government department, and local authority to be more intimate than in England: it is physically possible to gather all Chief Executives in a single room to discuss strategy and issues, an option unavailable to the Audit Commission with its 388 authorities in England. League tables are rejected because officers across the country tend to know one another and do not want to be put into a position of direct competition with each other. Instead, the assessment framework is marked by a partnership approach between auditors and auditees, with a commitment by the former not only to audit, but also to support, the latter.

The resultant lack of comparability between authorities' performances appears to be no significant issue in Scotland: unlike in Wales (see below), none of our interview partners wished to move to a more prescriptive and standardised approach. However, when asked about the support side of this partnership, authorities complained that many auditors (and inspectorates) continue to operate with what was referred to as an 'old-style' approach: the organisations were said to concentrate more on the assessment part of their function, at the expense of their support role, and to provide only very limited guidance on how authorities can actually improve their performance. Crucially, where improvement plans are formulated (they constitute section IV of each audit report), they are hardly ever followed up later and checked for successful implementation.

The Scottish approach grants councils substantial freedom in deciding what they are to be assessed on. The SPIs aside, councils are not obliged (but merely encouraged) to produce additional performance data, an approach that deviates markedly from the much more prescriptive and detailed scheme in England. Audit Scotland has repeatedly criticised auditees for using this flexibility in a one-sided manner: reporting exclusively on the SPIs, rather than on issues specific to the community concerned, and exaggerating their successes at the expense of the service functions that require improvement.
Local authorities, in turn, complain about the many overlapping inspections to which they are subjected throughout the year. This criticism is also voiced in England and Wales, where it seems to have led to more successful policy changes. In Scotland, up to a dozen inspectorates and auditors knock on authorities' doors with the aim of taking stock of councils' public service delivery. One of the authorities interviewed reported that its housing department had been audited four times within the preceding nine months by Communities Scotland, the Social Care Inspectorate, the Benefit Fraud Inspectorate, and the Best Value Inspection Team, leading to considerable exhaustion within that department. More often than not, these bodies require identical sets of data, yet hardly ever in the same format or on the same medium. In an attempt to save time and resources, the authority in question had developed a generic template for an assessment model that records performance data in one place, once a year, and in a way suitable for several inspectorates and audits. This model has since been offered to all visiting auditors and inspectorates, and has generated great interest among neighbouring authorities. Although it may take this particular authority a long time to convince the various inspectorates, auditors, councils, and members of parliament of the merits of the template, the bottom-up drive to standardise a centrally driven audit and inspection process stands in marked contrast to the prescriptive top-down approach imposed on English authorities over the past four years.

Some further drawbacks of the Scottish approach emerged during our interviews, some unique to Scotland, others akin to problems found in England and Wales. To start with, and as in England and Wales, some of the statutory performance indicators are regarded as superfluous or invalid, because they fail to measure the value provided to the public. An example given by various interview partners is the indicator that measures the number of swimmers per square metre of pool water in a local authority's leisure service function. To produce a positive outcome on this indicator, a local authority would only need to close all but one of its pools: this would increase the ratio, but would hardly be regarded as an indication of good public service. Alternatively, and less drastically, a department manager may decide to improve the indicator by filling available pool slots with swimming clubs (whose members will happily swim up and down the pool lanes in great numbers) rather than providing swimming lessons to the public (which would result in a relatively small number of participants occupying a large area of the pool). Clearly, the public would be better served by the latter policy, but the indicator pushes an authority towards the former. Indicators such as these have survived from the pre-devolution era, were (and still are) defined by the Scottish Executive and the respective governing bodies (in the case of swimming, the pool managers), and are in urgent need of revision.

Secondly, the great flexibility granted to local authorities means that they are not treated equally and consistently, a problem also reported by auditees in Wales. Coupled with the uncertainties mentioned above regarding the exact intervention powers available to the Accounts Commission, this has led to a situation where the audit reports produced by the Auditor General are regarded as too bland, too uniform, and not hard-hitting enough.
One interviewee highlighted the case of Inverclyde, which was known to be a badly performing authority and which has been the only council to see its Chief Executive replaced as a result of intervention powers applied by the Accounts Commission. While it was commonly acknowledged that this was the right step to take, the audit report for Inverclyde that had been produced just a year before did not differ significantly from other authorities' audit reports: the appraisal of Inverclyde's strengths and weaknesses read, by and large, much like the audit reports produced for other authorities that year. Only the subsequent commentary produced by the Accounts Commission was specific and critical of individuals and circumstances, and it led to the eventual displacement of the Chief Executive. Inverclyde proved the point that, if required, the intervention powers are anything but mere potential and can be significant and real. Yet the case also proved that the audit reports themselves are not explicit enough in highlighting problems, and that they depend on contextualisation and inside knowledge (in this case, on the part of the Accounts Commission) to add meaning to the text. That being so, the valid question to ask is: what is the point of producing an audit report in the first place?

Thirdly, despite the small size of the country and of its local government community, some stakeholders regretted the few networking opportunities that remained after COSLA had withdrawn from this area a few years before. Whereas Wales prides itself on forming bottom-up benchmarking clubs and groups of similarly structured authorities in order to share best practices and compare performances, these processes appear to be much less developed in Scotland. The Scottish Improvement Service was set up in 2004 to fill this void, by helping to improve the efficiency, quality, and accountability of public services through learning and the sharing of knowledge. The organisation is a partnership between the Scottish Executive, COSLA and the Society of Local Authority Chief Executives, and is financed by the Scottish Executive.

Finally, and only indirectly related to the assessment approach as such, authority boundaries are deemed contentious and are often devised for political reasons. A need was expressed during the interviews to merge some authorities as a means of improving human resources and systems capacities in the smaller localities.


Wales
The Welsh local government system was closely integrated with the English system until devolution in 1999, and thus contrasts with the Scottish system, which never was, its distinctness having been asserted in the Treaty and Acts of Union of 1707. Since 1999, Wales has been subject to the administration of both the UK Government in Westminster and the National Assembly for Wales in Cardiff. The UK Government retains responsibility for all primary legislation, but the National Assembly has powers to make secondary legislation in a range of policy areas such as health, education, industry, agriculture, local government, environment, and culture. The Assembly not only legislates but also funds, develops and implements policy for much of the public sector in Wales. Devolution is therefore creating an increasingly diverse and diverging public policy agenda across the United Kingdom.


Following the 1994 Local Government (Wales) Act, the eight counties and 37 districts of Wales were replaced by 22 unitary authorities with responsibility for all aspects of local government. They are responsible for £4 billion of annual public expenditure, one-third of the total Welsh budget. Eighty per cent of this budget is financed through central governmental grants, with the remainder coming from council tax revenues. Local government is also one of the largest employers in Wales, with some 164,000 employees.

Akin to Scotland, then, Wales contains far fewer local authorities than England, so the relationship between auditor, local government department, and authority can be more intimate than in England. League tables are avoided because the local authority community is small enough for all senior managers to know one another; putting local authorities in a position of direct competition with one another is therefore not desired. Instead, the audit reports for each authority contain a section (somewhat confusingly entitled 'Risk Assessment') that provides a contextualised account of the service areas in which the respective authority has to improve. In these sections, councils have to categorise themselves for each service area as low, medium or high risk, an assessment that appears to be a combination of performance quality and importance weighting. Although the Welsh Local Government Association has developed a Risk Assessment Template for going about this task, its uniform application is not guaranteed, and risk assessments are said to be done in 22 different ways across the nation.

Local authorities in Wales are assessed by the WAO, an organisation that was created on 1 April 2005 following the passing of the Public Audit (Wales) Act 2004. The Act brought together the former (pre-devolution) offices of the Audit Commission in Wales, which was responsible for auditing and inspecting local Welsh public services, and the National Audit Office in Wales, whose role was to audit the National Assembly and its sponsored and related public bodies, into one body headed by the Auditor General for Wales. Wales now has a single audit and inspection body, responsible annually for the audit of over £19 billion of public expenditure at all levels of administration, including local government.

The assessment regime overseen by the WAO, dubbed the Wales Programme for Improvement (WPI) (WAO, 2006), is, as in Scotland, much less prescriptive and much less elaborate than its English counterpart. Not only is it based on flexible self-assessments carried out by authorities, but authorities also have a say with regard to the type of inspection they would like to see for specific areas (brief reviews, in-depth audits, etc.). No league tables or composite rankings are published. Instead, comparisons on the basis of selected performance indicators can be made through the website maintained by the Local Government Data Unit, the custodian of all matters related to performance data. However, the detailed results (including the risk assessment categories mentioned above) in an authority's audit report remain confidential; no one other than the authority in question and the auditors will see them, and nothing is communicated through the media either.
The absence of rankings may be advantageous in the sense that it does not put individuals who know one another into a position of direct competition, but it is counterproductive in another: the fact that such rankings are available for English authorities across the border induces some media in Wales to produce their own league tables. Nearly all interview partners regarded it as a great disadvantage that journalists from Welsh newspapers such as the Western Mail publish a self-selected set of performance indicators, because an authority has neither influence over, nor an opportunity to explain, the composition of (or rationale behind) the chosen indicators. Also, keeping the reports away from the public does little to improve the democratic accountability of local government.

The WPI focuses on internal administrative processes rather than the quality of the services delivered. Efforts are presently directed at addressing this imbalance by developing ways to measure outputs or even outcomes. To that end, a survey-based approach called Local Voices was developed in 2000, with the aim of capturing users' satisfaction with an authority's performance. Replacing or supplementing conventional mathematical performance indicators with such satisfaction data has been on the wish list of the Audit Commission in England for quite some time, too, but the Welsh Assembly went furthest in devising a scheme accordingly. However, the initiative has so far not been implemented on an all-Wales scale, although some authorities that we visited reported that they have used and implemented scaled-down versions for their own local purposes. A revival may be in the making, however, given that a parliamentary inquiry led by Sir Jeremy Beecham has recently embarked on an examination of service delivery in Wales, a remit that includes a large-scale perception study to capture satisfaction with public services based on a sample of 1,000 individuals.

Performance indicators in Wales are set by the Welsh Assembly, not by a government department (the ODPM in England) or the government itself (the Executive in Scotland), and, as noted, they tend to focus on processes rather than outcomes. Also, only two or three of them make any statement about the costs incurred through service delivery. The relationship between the WAO and the Welsh Assembly is marked by a partnership approach with regard to the detailed formulation of, and guidance on, performance improvement. One interviewee pointed to the difference compared with Scotland, where he believed he had noticed tension between Audit Scotland and the Executive with regard to the choice and definition of indicators. However, this self-proclaimed partnership approach in Wales has in no way reduced the criticisms levelled at the indicators, as authorities continue to complain about the limited validity of some of them. For example, it is questioned why a high expenditure per pupil should be regarded as bad performance (in terms of efficiency), when the general public would consider it to be an advantage.

There is a strong sense of partnership not only between the Assembly and the Audit Office, but also between the local authorities and the Audit Office, a judgement shared equally by both parties. In fact, some authorities we interviewed tend to develop their yearly improvement plans jointly with the WAO. This represents a noticeable progression from the pre-devolution era, of which Welsh authorities are reminded each time an inspectorate with joint jurisdiction for England and Wales pays them a visit (such as the Benefit Fraud Inspectorate). The task of the auditor thus entails both assessment and support.

That said, during the interviews some authorities expressed their discontent with the lack of guidance received from either the Audit Office or the Welsh Assembly, a deficiency readily acknowledged by both these bodies. Also, some complained that in the communication triangle between the WAO, the Welsh Assembly and the authorities, there is no feedback loop built into the third side of the triangle: that between the Assembly Government, as the body that devises the WPI, and the authorities, as the entities having to live with it.

Some limited extra funds are made available to local authorities if they meet 16 specified indicators measuring national strategic objectives. These are called policy agreements and can amount to additional funds of between 0.5 and 1.5 per cent of an authority's budget. The 16 indicators are the only ones that are audited, predominantly with regard to the methods through which the data are used. The majority of indicators, some of which can be chosen by authorities themselves, are not checked for accuracy or standardised interpretation. Given that central grants in Wales are generally not ring-fenced, some interviewees criticised that these additional funds are simply added to the overall central grant that the authority receives, rather than being earmarked exclusively for the service areas that are assessed against those 16 indicators.

Unlike in England, staff are not incentivised by good performance categories or lesser ring-fencing attached to central grants. The main motivational drivers to do well appear to be the work ethos of members of staff, political pressure from members of the council, and the threat of more audits if a bad assessment report is produced. Conversely, however, well-performing authorities complain that they are not rewarded with the lighter-touch inspections and audits they were promised. As the most important reasons for improvements achieved in an authority's performance, the interviewees listed better work processes and work plans, better information technology systems, and staff motivation and leadership.

The coordination between the various auditors and inspectorates appears to be working particularly well: Relationship Managers, who report to the WAO and are tasked with coordinating auditors and inspectors, have been institutionalised. Unlike in England or Scotland, these positions have had a lasting effect on the assessment process: each authority obtains a regulatory plan that charts all of the visits scheduled for the following year. Not all of the inspectorates have yet signed up to this scheme. The issue that remains is the flurry of (presently 34) local plans that councils have to submit and that are in need of rationalisation.

With regard to intervention powers, the WAO (or its appointed auditors) can make a recommendation to the Welsh Assembly Government for (a) an intervention, or (b) a formal inspection that results in a peer advisory council being set up to oversee further improvements in the authority. Unlike in Scotland, no senior officers in Welsh local authorities have as yet had to be replaced, although in 2005 three of the 22 authorities were subjected to one of the two types of intervention. However, given the unclear definition of these powers and the lack of specification as to the body on which they should fall, they appear to be as ineffective a deterrent as those in Scotland. During our interviews, statements to that effect were made by auditors and auditees alike. Similar unanimity emerged on the question of the comparability (or lack thereof) of the assessment reports produced for each authority.
Most individuals we spoke to yearned for elements of the more prescriptive English system to be implemented, as it would allow explicit comparisons to be made between authorities' performances. For the much-lauded flexibility of the Welsh approach makes consistency, transparency and comparative analysis unattainable. This is so despite the existence of benchmarking clubs, which have been formed through bottom-up initiatives, are informal, and do not necessarily encompass all authorities in Wales, but only those that regard themselves as being of a similar type. As Sir Jeremy Beecham recently confirmed during an appearance before the Welsh Assembly, the flexibility that was granted to Welsh authorities to secure their buy-in has resulted in a situation where it is difficult to compare performance either across authorities or across time (National Assembly for Wales, 2006, pp. 27-33).

Local authorities in the English-Welsh border region were particularly interested in comparisons, not only with their Welsh peers but also with authorities in England, with whom they compete for tourism and inward investment. A similar desire was expressed by the metropolitan authorities in Wales, which would like to be able to compare themselves with their metropolitan peers elsewhere. Yet, given that there are only very few urban authorities in Wales, these comparisons would have to be made with authorities across the border, such as Plymouth, Exeter, or Bristol. The interview partners regretted that such comparisons are even less attainable than those within Wales, because devolution has meant that the assessment approaches in the two countries continue to drift apart.

The collaborative partnership approach in Wales between, first, auditees and auditors and, second, the auditors and the Welsh Assembly has produced another disadvantage that many of our interview partners picked up on: developing and implementing ideas that are agreeable to all takes an extraordinary amount of time in Wales. If decisions are taken at all, they take a long time to come about, let alone to implement, as there is no clear body that would take the lead and drive the necessary changes through the system. Initiatives are hampered by lengthy collaboration procedures, too much bureaucracy, excessively long decision-making procedures, and the introduction of too many new organisational layers. It should therefore come as no surprise that authorities were already complaining about the limited validity of some performance indicators in 2002 (when the indicators were described more pointedly as 'crap'; see Boyne et al., 2003, p. 7).


The Limits of Self-Assessment


The previous sections provided detailed accounts of the assessment regimes prevalent in the three countries. The set-up of local government assessment in Scotland and Wales grew out of the conviction that devolution does not stop in Edinburgh or Cardiff, but is more far-reaching, entailing also respect for local government. As a result, Scottish and Welsh authorities enjoy many freedoms and flexibilities when being assessed. Our interviews confirm that auditees tend to regard the process as a fair way of assessing local performance, because it is based on self-assessment rather than on an externally imposed, uniform, and often adversarial scheme. By comparison, the English regime, with its external inspections, its crude categorisations, its subjection of failing councils to punitive interventions, its concessions towards flexibility and freedoms made only if centralist conditions are met, and its playing of authorities off against each other, looks essentially like a centralist exercise.


The focus in Wales is on the identification of external and internal risks rather than on composite categorisations; (internal) organisational performance, such as management capacity, leadership, processes, systems and culture, is regarded as only one of many risk factors, with (external) strategic, environmental, national, and European risks enjoying equal importance. In that way, the Welsh regime allows regional priorities and circumstances to be reflected in the assessment, which the uniform CPA system in England does not.

Of the differences outlined above, then, the reliance on authorities' self-assessment is the most striking feature of the Welsh and Scottish regimes. Interestingly, it is also the one that has spawned the greatest interest among stakeholders in England. For one, the Audit Commission for England, whose task it is to produce a successor regime to CPA for the assessment rounds from 2008 onwards, is actively engaged in identifying the features that are worth adapting and incorporating into the English scheme. Also, the authors' visit to Wales to conduct interviews with Welsh auditors and auditees happened to coincide with members of the Welsh Assembly Government following an invitation to the ODPM in London to present their approach. Two months later, members of the Welsh Assembly repeated the exercise and presented at a conference on local government in England, which was sponsored by the English Local Government Association. Both host organisations were very receptive to the ideas presented and sought to use them as a basis to launch a new alternative approach to performance management that is 'owned and driven forward by local government staff, not by external agencies' (Improvement and Development Agency, 2006).

Yet, despite the analytically interesting cross-country fertilisation made possible through devolution, the question to be asked is how desirable such a convergence of the assessment approaches actually is. How beneficial would it be to introduce additional elements of self-assessment into the audit process in England and, to use the terminology introduced by Hood et al. (1998), move the regulatory system from an 'oversight' to a 'mutuality' approach? The lack of bite, the limited transparency, and the inadequate opportunities for comparative analysis in Wales and Scotland were mentioned above as the downsides of self-assessment. The greater buy-in from local authorities, the flexibility to tailor the assessment to local circumstances, the better suitability for small geographical areas, and the greater cost-effectiveness were stated as its advantages. But further drawbacks come to light once the analytical focus moves from the devolved nations to England, where elements of self-assessment are already part of the CPA exercise and, hence, available for analytical appraisal.

The reader will recall from the discussion in the first section that the final CPA scores of English authorities during the 2002–2004 CPA rounds were the result of a combination of two assessment elements: the performance score/rating, based on inspections and audits of services, and the ability-to-improve score, based on a corporate and self-assessment. While the first element was the focus of our comparative analysis in the second section and (to a lesser extent) the third section, it is the second element that comes to the forefront now.
The two graphs in Figure 2 depict a simple but very useful linear regression, plotting authorities' ability-to-improve scores against their actual performance at a later stage. To account for the time required for policy measures to have an impact on performance, a time lag of one year (t+1) and two years (t+2) is built into the model. Hence, the charts plot the self-assessed 2002 ability-to-improve scores against the absolute changes observed in CPA scores between 2002 and 2003 and between 2002 and 2004, respectively.

FIGURE 2 Ability-to-improve scores versus CPA performance scores

The visual distribution patterns reveal that there is no relationship between self-assessment and improvement in performance scores, an observation borne out by the fact that the regression lines actually tilt slightly downwards, indicating that the lower an authority's ability-to-improve score, the better its improvement in subsequent years. The coefficient of determination R², which expresses the extent to which the regression function fits the set of data observations, is 0.0008 for the one-year model and 0.002 for the two-year model.

Our analysis may be criticised on three grounds. First, it may be disapproved of for its disregard of ceiling effects in the model: authorities that attained the top performance category of 4 in the 2002 CPA round may be said to have been unable to improve their ratings any further in subsequent years, which may skew the resultant correlation coefficients. Yet the model was careful to avoid this problem, by using as the dependent variable not the four performance categories (one to four) but the performance scores. To recall, Table 2 showed that authorities' performance scores can range between a minimum of 15 and a maximum of 60 points (or 12 and 48 points for county councils, which do not provide, and are therefore not assessed for, services in housing and benefits), which are then converted into category scores of between one and four. In order to establish a level playing field between the two types of authorities, we then converted the performance scores into percentage scores. The top performance category of 4 was awarded to an authority if it achieved between 45 and 60 points (36 and 48 points for county councils). Of the 148 authorities assessed in the 2002 CPA exercise, 21 achieved the top rating. Of those, 14 achieved further score improvements in 2003 or 2004, despite having already achieved the top category rating in 2002, while the other seven either stagnated or dropped a category. Given such improvements, ceiling effects can have been only a limited constraint. Rather, it appears that the authorities that did improve were not identical to those that previous CPA rounds evaluated as being able to improve. In order to eradicate any remaining doubts, we re-ran the regression model, this time without any of the 21 authorities that achieved the top rating of 4 in 2002. As predicted, the resultant scatter plot (not shown) made hardly any difference to the regression line, although, barely visible to the eye, it tilted very slightly upwards. The coefficient of determination R² continued to produce a low value of 0.003 for both scatter plots.
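As an illustration of the method, the short sketch below reproduces the two steps just described: converting raw performance scores into percentage scores so that county councils and other authorities sit on a level playing field, and regressing the lagged change in performance on the self-assessed ability-to-improve score to obtain R². It is a minimal sketch only: the data are randomly generated placeholders rather than the actual CPA results, the to_percentage helper is our own naming, and the score-divided-by-maximum formula is one plausible reading of the conversion, which the article does not spell out exactly (it does, at least, make the category-4 thresholds of 45/60 and 36/48 coincide at 75 per cent).

```python
# Illustrative only: hypothetical data standing in for the 148 authorities
# assessed in the 2002 CPA round; the real dataset is not reproduced here.
import numpy as np

def to_percentage(score: float, is_county: bool) -> float:
    """Convert a raw CPA performance score into a percentage so that county
    councils (scored out of 48) and other authorities (scored out of 60)
    become comparable. The score/maximum formula is an assumption."""
    maximum = 48.0 if is_county else 60.0
    return 100.0 * score / maximum

rng = np.random.default_rng(seed=1)
n = 148

# Hypothetical self-assessed ability-to-improve scores for 2002 (1-4 scale)
ability_2002 = rng.integers(1, 5, size=n).astype(float)

# Hypothetical raw performance scores for 2002 and 2003, then percentages
is_county = rng.random(n) < 0.25
raw_2002 = np.where(is_county, rng.uniform(12, 48, n), rng.uniform(15, 60, n))
raw_2003 = np.where(is_county, rng.uniform(12, 48, n), rng.uniform(15, 60, n))
pct_2002 = np.array([to_percentage(s, c) for s, c in zip(raw_2002, is_county)])
pct_2003 = np.array([to_percentage(s, c) for s, c in zip(raw_2003, is_county)])

# Dependent variable: absolute change in percentage score after one year (t+1)
delta = pct_2003 - pct_2002

# Ordinary least squares fit; R^2 = 1 - SS_res / SS_tot (residuals have zero
# mean under OLS with an intercept, so variances give the same ratio)
slope, intercept = np.polyfit(ability_2002, delta, deg=1)
residuals = delta - (slope * ability_2002 + intercept)
r_squared = 1.0 - residuals.var() / delta.var()
print(f"slope = {slope:.4f}, R^2 = {r_squared:.4f}")

# Robustness check from the text: re-run without the top-rated authorities,
# i.e. those at or above the category-4 threshold of 75 per cent
mask = pct_2002 < 75.0
slope2, intercept2 = np.polyfit(ability_2002[mask], delta[mask], deg=1)
print(f"slope without top performers = {slope2:.4f}")
```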

The second objection that may be raised against our analysis is the claim that the lack of improvement is due to the 'moving of the goal posts' by the Audit Commission in the years following the initial 2002 round. This is the argument we heard from some of our interview partners, who commented that local authorities may have found it harder to obtain good ratings in 2003 and 2004 because the Commission hardened the requirements for authorities to improve. Yet we found that, of the 148 authorities analysed:
- the number of poor and weak councils dropped from 34 in 2002 to 16 in 2004;
- the number of weak councils dropped to 15 in 2004, down from 21 in 2002; and
- at the top end of the spectrum, 101 councils achieved a rating of excellent or good in 2004, up from 76 in 2002.


Given these figures, it is difficult to corroborate the claim that it became harder for authorities to improve their performance. Rather, it appears (again) that the authorities that did improve were not the ones that previous CPA rounds had deemed able to improve.

A third and final protest may be made by readers familiar with the more recent changes introduced to the CPA regime. They may want to point out that the issue raised here is no longer an issue, because in the most recent CPA round, in 2005, ability-to-improve scores were dropped as a component of the assessment scheme. Yet, while it is true that the modifications introduced by the Audit Commission over the past years have turned CPA into a moving target that is not easily gauged by political analysts, the reliance on self-assessment is an element that has not changed. For in 2005, ability-to-improve assessments were merely replaced by 'direction of travel' statements, and these continue to be based on authorities' self-assessment, although they are no longer used to calculate a performance rating.

Quite clearly, then, the components of the English assessment regime that do rely on self-assessment produce scores and ratings that are poor indicators of the attributes the regime purports to measure. As a result, the methodology by which these self-assessed scores are derived appears seriously undermined, and so, by implication, is the validity of the assessment scheme. The benefits of organisational self-assessment in the public sector are well documented (Hakes & Reed, 1997), but if what the future holds for England includes a greater reliance on self-assessments of this kind, then the aspiration to produce valid and reliable (and hence meaningful) performance measures has to be dropped.

Conclusion
This article has provided a comparative account of the performance frameworks against which local authorities in England, Wales and Scotland are assessed. Based on 20 semi-structured elite interviews held with auditors, auditees, and other stakeholders in the three nations, the respective pros and cons of each scheme were drawn out and a rationale provided as to why the devolved nations decided to introduce less prescriptive and intrusive variants that rely on authorities' self-assessment of their performance. Both Wales and Scotland contain fewer local authorities than England, so the relationship between auditor, local government department, and authorities can be more intimate than in England. The production of league tables is therefore rejected in both countries.

It came as some surprise that the interview partners in Wales expressed great interest in introducing, at least to some extent, more prescriptive English-style elements into the Welsh regime, as these would allow more explicit comparisons to be made between authorities' performance, a feature that remains unattainable with the flexible Welsh approach. By contrast, we found no stakeholders in Scotland who voiced a similar desire, an observation attributable to the fact that the Welsh local government system was closely integrated with the English one until devolution in 1999, whereas the Scottish local government system never has been, its distinctness having been asserted in the Treaty and Acts of Union of 1707.

The cross-fertilisation between Wales and England proceeds in the opposite direction as well. In an effort to capture the advantages of self-assessment with regard to assessment costs and buy-in from local authorities, the ODPM, the Audit Commission, and local authorities alike appear to be leaning towards replacing CPA from 2008 onwards with a regime that relies much more heavily on self-assessment. Yet our research findings also showed that, under the current methodology, self-assessed ratings and scores lose their validity as indicators of the underlying authority attributes of performance and/or ability to improve. Policy-makers need to be aware of the validity and reliability they would be trading away in the scheme that will replace the current CPA.

NOTES
1. For a detailed list of appointed auditors, see http://www.audit-scotland.gov.uk/audit/appointments.htm
2. See Audit Scotland (2006b), although the number of indicators can increase to more than 100 once non-statutory indicators (i.e. indicators determined by authorities) are included.

REFERENCES
AUDIT COMMISSION (2001) Strong Local Leadership: Quality Public Services, London: Audit Commission.
AUDIT COMMISSION (2004) Holding to Account and Helping to Improve: A Strategic Statement for Public Audit in Scotland 2004–06, Edinburgh: Audit Scotland.
AUDIT COMMISSION (2005) The Framework for Comprehensive Performance Assessment for District Councils: A Consultation Document, London: Audit Commission.
AUDIT SCOTLAND (2006a) Evidence Given to Finance Committee on Governance and Accountability, Edinburgh: Audit Scotland.
AUDIT SCOTLAND (2006b) Overview of the Local Authority Audits 2005, Edinburgh: Audit Scotland.
BOYNE, G.A., MIDWINTER, A.F. & WALKER, R.M. (2003) Devolution and Regulation: Political Control of Public Agencies in Scotland and Wales, Full Report of Award No. L327253035, Swindon: Economic and Social Research Council.
ECONOMIST (2006) 'A funny thing happened on the way to the council', The Economist, 25 February, p. 33.
HAKES, C. & REED, D. (1997) Organisational Self Assessment for Public Sector Excellence, Bristol: Bristol Quality Centre.

HAUBRICH, D. & MCLEAN, I. (2006) 'Assessing public service performance in local authorities through CPA: a research note on deprivation', National Institute Economic Review, vol. 197, no. 1, pp. 93–105.
HMSO (1999) Local Government Act 1999, London: Her Majesty's Stationery Office.
HOOD, C., SCOTT, C., JAMES, O., JONES, G. & TRAVERS, T. (1998) Regulation Inside Government, Oxford: Oxford University Press.
IMPROVEMENT AND DEVELOPMENT AGENCY (2006) Driving Improvement: A New Performance Framework for Localities, London: IDeA.
NATIONAL ASSEMBLY FOR WALES (2006) Local Government and Public Services Committee Meeting Minutes, 9 February 2006, Cardiff: National Assembly for Wales.
OFFICE OF THE DEPUTY PRIME MINISTER (2005) Local Government Finance Statistics England No. 16, London: ODPM.
POWER, M. (1997) The Audit Society, Oxford: Oxford University Press.
SCOTTISH EXECUTIVE (2005) The Local Government in Scotland Act 2003: Best Value Guidance, Edinburgh: Scottish Executive, [online] Available at: www.scotland.gov.uk/Publications/2005/01/20531/50061 (accessed April 2006).
STOKER, G. (2005) Transforming Local Governance, Houndmills: Palgrave Macmillan.
WALES AUDIT OFFICE (2006) Wales Programme for Improvement: Annual Report 2004/5, Cardiff: Auditor General for Wales.


Dr Dirk Haubrich (corresponding author), Research Officer, Department of Politics and International Relations, University of Oxford, Manor Road, Oxford OX1 3UQ, UK. Tel: +44 1865 285977; E-mail: Dirk.Haubrich@politics.ox.ac.uk

Professor Iain McLean, Professor of Politics, Department of Politics and International Relations, Nuffield College, University of Oxford, New Road, Oxford OX1 1NF, UK. Tel: +44 1865 278646; E-mail: Iain.McLean@nuffield.ox.ac.uk
