
Public Sector Executive virtual event: Using data to drive efficiencies


by Pamela Cook

08 Nov 2021

[Image: headshots of the panel members for the PSE event]

Recently, I spoke on the panel at the PSE event – Digital Transformation: Using data to drive efficiencies. I was joined by Richard Walker, Partner for Data and Insights at Agilisys, Ritchie Somerville, Head of Strategy at the Data-Driven Innovation Initiative at The University of Edinburgh, and Bill South, Head of SRS Data & Governance at the Office for National Statistics.

The event shone a spotlight on the public sector's improved understanding of the importance of data and the ways it can be used. The recent acknowledgement of shortfalls in data, insights, and knowledge is positive, as it leads to an appreciation that fixing those data problems can help organisations make more empowered, strategic decisions that better support their communities.


Covid-19 pandemic: exposing the good and the bad

The need to respond quickly and flexibly to the pandemic has accelerated digital transformation in some areas of the sector and, as Richard Walker noted, brought the value of data to the forefront of public awareness. There have been some shining examples of what can be achieved through rapid innovation, adaptation, and collaboration. However, we’re still a long way from widespread adoption of best practice when it comes to data use across the industry, and getting the basics right is the vital first step.

Three common barriers to successful use of data were highlighted by the pandemic and raised by Richard at the event: lack of access to data, inability to share it quickly with those who needed it, and a lack of consistent data standards. To unlock the value that data can bring, and start using it to drive efficiencies, the public sector needs to work towards overcoming these barriers in a more systematic way. (Read our joint article with Agilisys for practical tips on overcoming data-sharing barriers.)


Public trust is the lifeblood of successful data sharing

When data sharing is possible, it unlocks a new level of value that can be leveraged to drive efficiencies and better support communities across the UK. Bill South highlighted how sharing data from different sources allowed the ONS to build a much richer understanding of the health and social impacts of the pandemic on citizens.

However, a central tenet of data sharing advocated by everyone on the panel was the importance of public trust, which can make or break the public sector’s ability to get the most value from its data. So how can public trust, and in turn citizen engagement, be secured?

  1. Data should be used and shared responsibly – data should be used only for the purposes for which it was collected, with access granted ethically and sensitivities respected.
  2. Data should be used and shared for the public good – anonymous and publicly available data should be formatted to make it more accessible, and all data sharing initiatives should have the ultimate end goal of improving the lives of citizens.
  3. We should be communicating with citizens – there needs to be a systematic approach to communicating with communities about how and why their data is used, to improve transparency and engage with them about their concerns.
  4. We should be demonstrating the value to citizens – citizens should have an understanding about how they and their communities will benefit from sharing their data.


Building the right foundation to drive cost efficiencies

So how can local governments drive cost efficiencies at a time when budgets are tighter than ever? Richard Walker proposed one of the most effective strategies: predicting problems ahead of time and creating proactive solutions. He gave examples of public sector organisations starting to think in this way, for instance by predicting on an individual level when citizens’ care needs will escalate, and thus identifying the point at which the cost of servicing those needs will rise.

This type of approach requires analysts, machine learning, and AI technologies to analyse the data and make accurate predictions. As several panellists pointed out, the success of these initiatives therefore relies entirely on the quality of the data feeding into them. Creating a strong data foundation is fundamental to any public sector organisation’s ability to drive the greatest cost efficiencies.

An example I gave was a local authority we have worked with at Infoshare that wanted to analyse vulnerability across their community. Before they could analyse their data, they worked with local health organisations, police forces and three other bordering authorities to pool and accurately tag data at an individual level with over 140 vulnerability markers. This solid data foundation enabled analysts to accurately predict vulnerability hotspots and arm decision-makers with the information they needed to target interventions and budget more appropriately.


Can local authorities use their data to generate revenue?

Another point of discussion was how the public sector could potentially use its data to generate revenue. Richard Walker suggested one area of opportunity: leveraging private investment by engaging in joint ventures with private sector organisations that want access to public sector data. This would help the taxpayer see a return on the cost of managing and improving the data held by local governments, and it will surely be a growing area of interest for local government over the coming years.


The webinar ended with our fantastic host Helen Fospero asking Ritchie Somerville for his advice for local governments. His answer focused on the importance of creating a strong and accurate data foundation to enable innovation to have impact and meaning:

“The thing you need to start – this afternoon – is having a robust data catalogue. Catalogue what data you have and what’s the quality of it, because you’re probably making really good decisions based on imperfect data at the moment. You could make even better decisions if the quality of the data was improved. If you’re going to move to machine learning and AI and all these things, you’re going to have to get that stuff right.”


Watch the full video here