Measuring Economic Social Value in Adult Social Care: KPIs, Dashboards and Evidence

Economic social value in adult social care is increasingly judged not by the ambition of the commitment but by the quality of the evidence behind it. Commissioners want providers to show that local spend, supplier diversity and community-facing procurement commitments can be measured, reviewed and improved over time. Stronger responses therefore set these arrangements within a practical framework for economic social value and local spend reporting, while also aligning them with broader social value policy and national priorities around accountability, responsible procurement and local economic resilience. In adult social care, this matters because social value claims often weaken once commissioners ask what data will actually be available during mobilisation, contract review and provider assurance meetings.

For many providers, the challenge is not lack of activity but lack of a reporting framework. They may already use local suppliers, support smaller enterprises or keep spend within the area, yet still struggle to evidence this clearly. Without consistent data, local spend can become anecdotal. Without governance, performance may drift. And without explanation, even strong figures may not persuade commissioners if they are not linked to outcomes, risk management and operational relevance. That is why economic social value needs measurement systems that are simple enough to use but strong enough to withstand scrutiny.

Why measurement matters in economic social value

Commissioners increasingly understand that percentages on their own tell only part of the story. A high local spend figure may sound impressive, but if it is concentrated in a low-impact category or unsupported by assurance processes, it may mean little in practice. Similarly, a lower but well-governed figure may be more credible if the provider can explain how it is built, what categories are included and what improvements are planned.

Measurement matters because it helps providers move from intention to evidence. It also improves internal leadership grip. When local procurement, supplier performance and economic value data are visible to managers, decisions become more deliberate. This makes it easier to protect commitments during operational pressure and easier to explain performance honestly during contract meetings.

Commissioner Expectation: providers should evidence economic value through clear metrics

Commissioner expectation: Providers should demonstrate how economic social value commitments will be measured, reviewed and reported in a way that is specific to the contract.

Commissioners often want to know what indicators will be tracked, how frequently reports will be produced and whether the provider can distinguish between headline claims and contract-relevant delivery. They are usually more persuaded by a small number of clear, repeatable indicators than by long lists of vague aspirations. Good evidence frameworks also show how performance will be interpreted, not simply collected.

Regulator Expectation: governance information should support well-led decision-making

Regulator expectation (CQC): Providers should use governance information to identify risk, maintain oversight and support safe, effective service delivery.

CQC does not typically require an “economic social value dashboard”, but governance quality is highly relevant. If procurement decisions affect continuity, quality or resource availability, leaders should know. A provider that can show local procurement data alongside assurance measures is often better placed to demonstrate that contracts are being delivered responsibly and with appropriate oversight.

What KPIs commissioners often find useful

The most practical economic social value KPIs usually combine local spend with assurance and delivery context. Typical examples include:

- percentage of eligible non-pay spend with local suppliers
- number or proportion of active local suppliers by category
- spend with SMEs or VCSE partners
- response time performance where local suppliers affect service continuity
- proportion of contract-linked procurement reviewed against social value criteria

These figures become more useful when accompanied by narrative context. Providers should explain how “local” is defined, which spend categories are included, which remain out of scope and why. This helps avoid confusion and makes reporting much more credible.
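As a minimal sketch of how the headline KPI might be computed, the function below calculates the percentage of eligible non-pay spend placed with local suppliers. The postcode prefixes, excluded categories and record fields are illustrative assumptions for one hypothetical provider, not a prescribed standard; the point is that "local" and "in scope" are defined explicitly in the code rather than left implicit.

```python
# Illustrative sketch: "% of eligible non-pay spend with local suppliers".
# All definitions below are assumptions for this example.

LOCAL_POSTCODE_AREAS = {"LS", "BD", "WF"}          # assumed definition of "local"
OUT_OF_SCOPE_CATEGORIES = {"utilities", "regulated_pharmacy"}  # excluded, with reasons documented elsewhere


def postcode_area(postcode):
    """Extract the postcode area (letters before the digits), e.g. 'LS1 4AB' -> 'LS'."""
    return postcode.split()[0].rstrip("0123456789")


def local_spend_percentage(records):
    """records: iterable of dicts with 'category', 'postcode', 'amount'.

    Returns the local share of eligible (in-scope) spend, rounded to 1 d.p.
    """
    eligible = [r for r in records if r["category"] not in OUT_OF_SCOPE_CATEGORIES]
    total = sum(r["amount"] for r in eligible)
    if total == 0:
        return 0.0
    local = sum(
        r["amount"] for r in eligible
        if postcode_area(r["postcode"]) in LOCAL_POSTCODE_AREAS
    )
    return round(100 * local / total, 1)
```

Keeping the definitions as named constants makes the narrative context in the next paragraph easier to produce: the report can state exactly which categories were excluded and why.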

Operational example: building a local spend dashboard

A supported living provider found that it could not answer simple commissioner questions about local spend because procurement data sat across finance, services and legacy supplier records. The organisation introduced a quarterly dashboard covering non-pay spend categories, supplier location, category type and whether the supplier was local, regional or national.

The approach was deliberately practical. Finance coded suppliers by postcode area, procurement leads reviewed exceptions and service managers discussed local opportunities where categories had drifted away from plan. Day to day, this created much better visibility. Effectiveness was evidenced through more accurate reporting, a modest increase in local supplier use in suitable categories and better leadership discussions about where local value could be expanded safely.
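The classification and aggregation step in this example can be sketched as a short routine: classify each supplier as local, regional or national from its postcode area, then total spend by category and locality for the quarter. The postcode groupings and field names are illustrative assumptions, not the provider's actual coding scheme.

```python
# Hypothetical sketch of the quarterly dashboard's classification step.
# Postcode-area groupings are assumptions for illustration.

LOCAL_AREAS = {"LS", "WF"}
REGIONAL_AREAS = {"BD", "HX", "HD"}


def postcode_area(postcode):
    """'BD4 7XY' -> 'BD': the letters of the outward code."""
    return postcode.split()[0].rstrip("0123456789")


def classify_supplier(postcode):
    area = postcode_area(postcode)
    if area in LOCAL_AREAS:
        return "local"
    if area in REGIONAL_AREAS:
        return "regional"
    return "national"


def quarterly_dashboard(spend_lines):
    """Aggregate one quarter's spend lines by (category, locality).

    spend_lines: iterable of dicts with 'category', 'postcode', 'amount'.
    """
    totals = {}
    for line in spend_lines:
        key = (line["category"], classify_supplier(line["postcode"]))
        totals[key] = totals.get(key, 0) + line["amount"]
    return totals
```

A table of (category, locality) totals is usually enough for the exception review described above: procurement leads can scan for categories where the local share has drifted from plan.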

Operational example: combining local spend and resilience indicators

A residential care provider realised that reporting local spend alone did not tell commissioners whether those procurement choices were helping or harming service performance. It therefore added resilience indicators to its dashboard, including delivery timeliness, supply disruption incidents and repeat service issues by category.

This meant the provider could show not only that some spend had shifted locally, but that local arrangements were also performing well in operational terms. In one category, local sourcing improved response times significantly. In another, a supplier needed to be replaced after repeated stock issues. The dashboard made those decisions visible and defensible, strengthening both contract assurance and social value reporting.
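Pairing the two data sets might look something like the sketch below: per category, compute the local spend share and flag repeated supply disruption for review. The threshold, field names and structure are assumptions for illustration; a real dashboard would define these with procurement and service leads.

```python
# Illustrative sketch: combining local-spend share with a resilience flag
# per category. Threshold and fields are assumptions, not a standard.

def category_review(rows, disruption_threshold=2):
    """rows: dicts with 'category', 'locality', 'amount', 'disruptions'.

    Returns, per category, the local spend share and whether repeated
    supply disruption suggests the arrangement needs review.
    """
    totals = {}
    for r in rows:
        c = totals.setdefault(r["category"], {"total": 0, "local": 0, "disruptions": 0})
        c["total"] += r["amount"]
        if r["locality"] == "local":
            c["local"] += r["amount"]
        c["disruptions"] += r["disruptions"]
    return {
        cat: {
            "local_share_pct": round(100 * v["local"] / v["total"], 1) if v["total"] else 0.0,
            "review_needed": v["disruptions"] >= disruption_threshold,
        }
        for cat, v in totals.items()
    }
```

The design choice here mirrors the example above: the same table that evidences local spend also surfaces the categories where a local arrangement is underperforming, so replacement decisions are visible rather than buried in operational notes.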

Operational example: measuring VCSE engagement in community-based care

A homecare provider working with local VCSE organisations wanted to evidence economic social value beyond straightforward spend figures. Some community partnerships involved modest financial value but significant local impact. The provider therefore tracked the number of commissioned VCSE relationships, the categories of support provided, referral outcomes and continuity of engagement over time.

This broader evidence helped show that the provider’s economic social value model was not only about procurement volume, but about how local contract value circulated through community organisations that supported prevention and inclusion. Commissioners responded positively because the evidence linked purchasing decisions to real local service benefit.
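The continuity measure in this example can be sketched as simple date arithmetic over a register of engagements: count active relationships, list the categories they cover and average the length of engagement in months. The field names and the choice of months as the unit are illustrative assumptions.

```python
# Hypothetical sketch of VCSE engagement measures: active relationships,
# categories covered and average continuity in months. Fields are assumed.

from datetime import date


def vcse_summary(engagements, as_of):
    """engagements: dicts with 'partner', 'category', 'start', 'end' (None if ongoing)."""
    active = [e for e in engagements if e["end"] is None]

    def months(e):
        end = e["end"] or as_of  # treat ongoing engagements as running to the report date
        return (end.year - e["start"].year) * 12 + (end.month - e["start"].month)

    return {
        "active_relationships": len(active),
        "categories": sorted({e["category"] for e in active}),
        "avg_continuity_months": round(sum(months(e) for e in engagements) / len(engagements), 1)
        if engagements else 0.0,
    }
```

Continuity is worth tracking separately from spend because, as the example notes, a low-value but long-running partnership can carry more local impact than a large one-off purchase.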

How to build a useful evidence framework

A good evidence framework usually starts with definition. Providers need to be clear about what counts as local spend, which supplier categories are in scope and what reporting period will be used. They then need a manageable set of indicators that can be updated regularly without excessive burden. Dashboards should support decision-making, not exist only for presentation.
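One way to keep these definitions explicit and stable between reporting periods is to hold them in a single configuration that every report draws from. Every value below is an illustrative assumption; the useful property is that scope, exclusions and cadence are written down once rather than re-decided each quarter.

```python
# Illustrative configuration for an evidence framework. All values are
# assumptions for one hypothetical provider, not a recommended scope.

EVIDENCE_FRAMEWORK = {
    "local_definition": "supplier billing postcode in areas LS or WF",
    "in_scope_categories": ["catering", "maintenance", "transport"],
    "out_of_scope": {"utilities": "national regulated supply"},  # exclusion with stated reason
    "reporting_period": "quarterly",
    "indicators": [
        "local_spend_pct",
        "active_local_suppliers",
        "sme_vcse_spend",
    ],
}
```

Versioning this configuration alongside the reports it governs makes year-on-year comparisons defensible: if the definition of "local" changes, the change is visible rather than silent.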

Review matters as much as measurement. Stronger providers discuss economic social value in procurement meetings, contract performance reviews and leadership forums, not just in annual reports. This makes it easier to spot drift early and adjust plans before commitments weaken.

Governance, transparency and honest reporting

Economic social value reporting becomes much stronger when providers are transparent about limitations. Some spend categories may remain national because of regulation, specialist supply requirements or continuity risk. Honest reporting of this does not weaken the provider’s position. It usually strengthens it because commissioners can see that the organisation understands its own supply profile and is not overstating what is possible.

Dashboards should therefore support explanation as well as numbers. A provider that can show where value is being created, where gaps remain and what next steps are planned often appears more credible than one offering a single headline figure with no narrative behind it.

Why this strengthens tenders and contract reviews

When providers can measure economic social value clearly, their tender responses usually improve because commitments sound more operationally believable. After award, the same framework supports mobilisation, contract review and social value reporting. This continuity is important. It helps prevent the familiar problem of strong tender promises followed by weak contract evidence.

Ultimately, KPIs, dashboards and evidence frameworks make economic social value real. In adult social care, providers who can measure local spend and supply chain value clearly are better placed to show commissioners that procurement choices are supporting local resilience, good governance and safe, sustainable service delivery over time.