Arts Council England re-label & re-tender
Liz Hill, Quality Metrics to go ahead under new name, Arts Professional, 5 December 2017
Although a supplier had been chosen to deliver Arts Council England’s £2.7m quality evaluation programme, the procurement process is now being re-run after falling foul of EU rules.
Arts Council England (ACE) is set to re-launch the tender to deliver its controversial Quality Metrics framework under the new name of ‘Consumer Insight Toolkit’, which according to CEO Darren Henley will “better reflect what [the system] will do”.
The original tender, for a framework capable of measuring and benchmarking the quality and impact of arts and cultural activities, was won by supplier Counting What Counts Ltd, but the contract fell at the final hurdle due to breaches of EU procurement rules.
Following the collapse of the original tender process, ACE has now issued outline details in advance of a new invitation to tender for a similar system, which will be published early next year. The contract will be advertised at £2.3m and will run for four years from June 2018.
The original invitation to tender, the largest ever procurement exercise run by ACE, was worth £2.7m. Deeply unpopular among many artists and arts organisations, it became especially controversial when the system ACE specified in the tender was based on ‘Culture Counts’, the quality measurement framework used by the company Counting What Counts Ltd.
The company had been funded by ACE over the previous four years to develop and pilot the system among National Portfolio Organisations (NPOs). Under EU rules, contracting authorities, such as ACE, can consult with potential suppliers before a procurement process begins, but not if “the design of the procurement is made with the intention of unduly favouring or disadvantaging certain economic operators”.
The tender was subsequently won by Counting What Counts Ltd but, despite the decision being endorsed by ACE’s National Council at its meeting in May 2017, it was never publicised and the contract was never awarded.
ACE remains tight-lipped about the issue, initially refusing to confirm or deny that Counting What Counts Ltd had won the tender. Henley subsequently told AP that the details of the ‘successful bidder’ had been uploaded to ACE’s website in error and “should have been redacted from the minutes of the National Council meeting.” The minutes were then taken down from the website.
ACE has also rejected a request under Freedom of Information for related documents and correspondence with the European Commission, the OJEU and Counting What Counts Ltd.
The new Consumer Insight Toolkit will carry all the controversial hallmarks of the system that was specified under the original tender process. The requirement is still for a standardised evaluation system, designed around a set of pre-determined ‘Quality Metrics’.
Initially more than 250 of ACE’s larger NPOs will be required to adopt the Toolkit, which will be used to measure and benchmark their artistic output. More NPOs will be encouraged to use the system in its second year.
The data they collect will be collated to enable them to compare their opinion of their own work with the views of peers and the public, and will also be made available to ACE at both aggregate and organisational level.
Metrics in question
Coinciding with the announcement of the new tender, ACE has published its delayed report on the Accessibility of Language of Quality Metrics and Participatory Metrics.
Following the Quality Metrics pilot, an evaluation of the system found that the language and terminology being used to elicit public responses was inaccessible for some respondents, especially children and young people; people with disabilities; those whose primary language is not English; and people from lower socioeconomic groups.
The new report, by consultants Shared Intelligence, The Mighty Creatives and Sarah Pickthall, recommends that eight of the original nine Quality Metrics be reworded to make them accessible to all, and suggests adaptations for a further fourteen of the proposed Participatory Metrics. The authors also call for a “set of explainers” to be available to respondents at the point of data collection, to ensure the system can “generate consistent and comparable data”.
The Scottish approach
The creators of Culture Counts have global ambitions for their evaluation framework, inviting the sector to “imagine a scenario where cultural organisations around the world are measuring the quality of their work using a standardised set of metrics that they themselves have shaped.”
Their system, from which the ‘Quality Metrics’ system evolved, has been trialled not only in England, but also with arts and cultural organisations around the world. It has not, however, been widely adopted.
Creative Scotland used Culture Counts to evaluate the Glasgow Festival 2014, part of the Commonwealth Games cultural programme. It has since established a new quality Review Framework for use by its Regularly Funded Organisations (RFOs), with a view to promoting “open dialogue with [them] about the quality of their artistic and creative work”. But it has not chosen to adopt Culture Counts or any other quantitative approach to this work.
Philip Deverell, Director of Strategy for Creative Scotland, told AP that their new Framework aims to help Creative Scotland and the sectors it works with “to adopt more of a common language around artistic and creative quality, in order to support a culture of continuous improvement.” It is a system that combines self-review by organisations, independent expert peer review, and review by a Creative Scotland Lead Officer – but no survey of public opinion.
The system is currently being evaluated, and Creative Scotland is investigating the development of a ‘toolkit’ version which will share the principles of the Framework with the wider creative and cultural sector.
Abi Gilmore, Senior Lecturer in Arts Management and Cultural Policy at the University of Manchester, was involved as a research partner in the development stage of Quality Metrics, and more recently was a co-contributor to research exploring the UK and Australian experience of Culture Counts. This involved a cross-national comparison of the utility of the metrics.
Gilmore is circumspect about the role that metrics can play, believing that they can work to “reinforce art forms which are already prioritised by funding.” She told AP: “Our research exposes the politics of evaluation in any policy context. We found that using metrics shores up institutional tastes and values in a way that excludes the potential creation of public value through richer understanding of arts experience.”
Others appear to be equally sceptical. Alexis Andrew, Director of the Research, Evaluation and Performance Measurement Section at the Canada Council, told AP they have decided not to implement any tools that attempt to systematically measure qualitative impact, and are “not proceeding with a pilot of a qualitative assessment tool at this time for a number of operational and contextual reasons.” Instead, they have engaged a US research consultancy to work with them “in researching and designing a framework for qualitative impact.”
Although Culture Counts has become a formal part of the artistic assessment process in the state of Western Australia, where it was initially developed with support from the Department of Culture and the Arts, the Australia Council for the Arts is not trialling it and “has no plans to at this stage.” A major trial of the system was evaluated by the Department of Management at Deakin University for Creative Victoria in the state of Victoria. Sources have told AP that they are not rolling out Culture Counts to all funded organisations, though no one at Creative Victoria was available to confirm this.