Data, Outcomes and Evidence Commissioners Actually Trust in Older People’s Services
In older people’s services, data is everywhere, but trusted evidence is not. Commissioners regularly see dashboards full of figures that lack context, explanation or any link to frontline delivery. Inspectors encounter a similar problem: services can state their outcomes but struggle to show how those outcomes are achieved, or how leaders know they are sustained. Two useful internal reference points are the Working With Commissioners, ICBs & System Partners tag and the Social Care Mini-Series — Tendering, Safeguarding & Person-Centred Practice. This article sets out how to present data and outcomes in ways commissioners and inspectors actually trust.
Why commissioners are sceptical of provider data
Commissioners are rarely sceptical because data is poor; they are sceptical because it is disconnected. Common issues include:
- KPIs presented without explanation of variance or action.
- Outcome claims that are not linked to daily support activity.
- Audit results reported without evidence of practice change.
- Positive trends shown without acknowledgement of risk or limitation.
Trusted evidence shows balance. It demonstrates improvement while openly acknowledging pressure, risk and what is still being worked on.
What commissioners usually mean by “good evidence”
In older people’s services, commissioners typically look for three things:
- Line of sight: how frontline activity leads to outcomes.
- Grip: how leaders monitor, challenge and verify performance.
- Learning: how incidents and deterioration lead to improvement.
This means evidence must combine quantitative data with qualitative explanation and verification.
Building an evidence framework that holds up
1) Use trends, not snapshots
Single-month figures rarely persuade. Commissioners prefer trend data over time, with commentary that explains movement. For example, a rise in falls should be accompanied by analysis of location, time, staffing mix and actions taken, rather than reassurance alone.
2) Link KPIs to operational controls
Every headline measure should connect to a control mechanism. Falls link to mobility plans and handover checks; medication errors link to MAR audits and competency sign-off; safeguarding concerns link to escalation logs and supervision focus.
3) Show verification, not intention
Plans and policies are not outcomes. Commissioners and inspectors want to see how leaders verify that changes actually happened, through spot checks, observation and re-audit.
Operational examples: data that earns trust
Example 1: Falls reduction linked to practice change
Context: A service reports a 25% reduction in falls over three months, following an increase earlier in the year.
Support approach: The provider presents falls data alongside a narrative explaining why falls increased and what was done.
Day-to-day delivery detail: Analysis showed falls clustered during evening routines. The