Using Data to Reduce Health Inequalities in NHS Community Services: From Dashboards to Action
NHS community services are increasingly expected to evidence inequalities work with more than narrative. The challenge is not a lack of data, but a lack of operational translation: teams can report activity and demographics while still failing to identify who is excluded, where pathways break down, and what changes improve outcomes. Reducing inequalities requires a data-to-action discipline that is embedded into pathway governance and day-to-day management. This article supports Health Inequalities, Access & Inclusion and aligns with NHS Community Service Models and Pathways, because meaningful insight must be tied to pathway steps: referral, triage, waiting, delivery, discharge and re-entry.
What “good” looks like: moving from reporting to operational intelligence
In inequalities work, the most useful questions are practical: Who is not reaching the service? Who drops out and why? Who waits longer? Which cohorts have poorer outcomes or higher risk escalation? Good practice links these questions to specific pathway decisions, then tests changes and measures the impact over time.
Operational intelligence typically combines: (1) access metrics (referrals, acceptance, waiting times, first contact); (2) engagement metrics (non-attendance, loss of contact, discharge reasons); (3) outcomes (functional change, crisis escalation, admission/readmission, safeguarding events); and (4) experience and complaints. Importantly, “inequalities” cannot be understood only through ethnicity coding. Deprivation, housing status, disability, language need, and carer status often predict access problems more strongly, so services should use the best available proxies while improving data quality over time.
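The four metric domains and the cohort dimensions above can be sketched as a simple measurement set. The metric and field names below are illustrative assumptions, not a national data standard; the point is that every metric is paired with every cohort dimension, so no domain is reported un-segmented.

```python
# Illustrative inequalities measurement set, grouped into the four domains
# described above. Names are assumptions for the sketch, not an NHS standard.
MEASUREMENT_SET = {
    "access": ["referrals_received", "referrals_accepted",
               "days_referral_to_first_contact"],
    "engagement": ["dna_rate", "lost_to_follow_up", "discharge_reason"],
    "outcomes": ["functional_change_score", "crisis_escalations",
                 "unplanned_admissions", "safeguarding_events"],
    "experience": ["complaints", "patient_reported_experience"],
}

# Cohort dimensions used to segment every metric; ethnicity coding alone
# is not enough, as the text notes.
COHORT_DIMENSIONS = ["deprivation_quintile", "housing_status", "disability",
                     "language_need", "carer_status", "ethnicity"]

def segmented_metrics():
    """Yield every (domain, metric, cohort_dimension) combination the
    dashboard should cover, so gaps in segmentation are visible."""
    for domain, metrics in MEASUREMENT_SET.items():
        for metric in metrics:
            for dim in COHORT_DIMENSIONS:
                yield domain, metric, dim
```

Enumerating the full grid like this makes "we do not segment outcomes by housing status" an explicit, auditable gap rather than a silent omission.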
Operational example 1: Identifying and fixing hidden exclusion through referral-to-first-contact analysis
Context: A community frailty pathway reported rising referral numbers and stable caseloads. However, place-based partners raised concerns that people in deprived neighbourhoods were still reaching services late and deteriorating before support began.
Support approach: The pathway introduced a “referral-to-first-contact” inequality lens, segmenting by deprivation quintile and by referral source (GP, discharge, self-referral where applicable).
Day-to-day delivery detail: Weekly performance huddles reviewed a simple run chart: time from referral received to first clinical contact, by cohort. The team found that referrals from high-deprivation postcodes were more likely to be incomplete and therefore “paused” in the admin queue. The service implemented a same-day referrer call-back process for incomplete referrals and introduced a short minimum dataset so referrals could be accepted provisionally while missing details were gathered.
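The weekly run-chart value reviewed in the huddle can be computed in a few lines. The records, dates and quintile coding below are illustrative assumptions (quintile 1 taken as most deprived); a real pathway would draw these from its clinical system.

```python
from datetime import date
from collections import defaultdict
from statistics import median

# Hypothetical referral records:
# (deprivation_quintile, date referral received, date of first clinical contact).
referrals = [
    (1, date(2024, 3, 1), date(2024, 3, 19)),
    (1, date(2024, 3, 4), date(2024, 3, 25)),
    (5, date(2024, 3, 2), date(2024, 3, 9)),
    (5, date(2024, 3, 6), date(2024, 3, 12)),
]

def wait_by_quintile(records):
    """Median referral-to-first-contact wait in days, per deprivation
    quintile: the value plotted on the weekly run chart."""
    waits = defaultdict(list)
    for quintile, received, first_contact in records:
        waits[quintile].append((first_contact - received).days)
    return {q: median(days) for q, days in sorted(waits.items())}
```

Plotting this value week by week, rather than a single service-wide average, is what made the longer waits for high-deprivation postcodes visible in the first place.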
How effectiveness or change is evidenced: Within eight weeks, the pathway reduced average referral-to-first-contact times for deprived cohorts and reduced the proportion of referrals paused for missing information. The service also tracked downstream indicators (avoidable A&E use within 30 days and unplanned admissions) to evidence that earlier contact correlated with reduced escalation.
Operational example 2: Using non-attendance and discharge data to reduce avoidable pathway churn
Context: A community MSK pathway had high “did not attend” (DNA) rates and routinely discharged after two missed appointments. Commissioners questioned whether this practice disproportionately excluded people with caring responsibilities, low income, or mental health needs.
Support approach: The service built a DNA inequality dashboard and replaced automatic discharge with a graded response model, supported by targeted appointment formats.
Day-to-day delivery detail: The team reviewed DNA rates weekly by time of day, location, deprivation, and known access barriers (where recorded). They introduced evening group sessions, telephone/video options for follow-up, and a “quick rebook” slot reserved for people who missed due to practical barriers. Clinicians recorded a standard reason for DNA after follow-up contact (transport, childcare, work, anxiety, language, other) to improve learning without blaming individuals.
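The graded response model can be sketched as a small decision rule. The reason codes mirror those recorded at follow-up contact; the thresholds and action names are illustrative assumptions, not the service's actual policy.

```python
# Reason codes recorded after follow-up contact, as described above.
PRACTICAL_BARRIERS = {"transport", "childcare", "work", "language"}

def dna_response(missed_count, reason):
    """Return the next action after a missed appointment, replacing
    automatic discharge after two DNAs with a graded response."""
    if reason in PRACTICAL_BARRIERS:
        return "quick_rebook"           # reserved slot, flexible format offered
    if reason == "anxiety":
        return "clinician_phone_call"   # supportive contact before rebooking
    if missed_count >= 3:
        return "mdt_review"             # a human decision, never auto-discharge
    return "standard_rebook"
```

Note that discharge never appears as an automatic outcome: even repeated misses route to a multidisciplinary review rather than a rule-driven exit.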
How effectiveness or change is evidenced: The pathway monitored changes in DNA rate, rebooking success within 14 days, and the proportion of people discharged due to missed appointments. It also reviewed complaints and referrals back into the pathway within six months as a marker of avoidable churn. Improvements were evidenced through reduced DNA-linked discharge and fewer repeat referrals for the same condition.
Operational example 3: Combining outcomes and safeguarding intelligence to identify unequal risk exposure
Context: A community nursing and long-term conditions service observed that some people experienced repeated deterioration episodes and safeguarding concerns, including self-neglect and exploitation, but this was not visible in standard performance reports.
Support approach: The service introduced a “risk and inequality” review that combines clinical deterioration markers with safeguarding intelligence, segmented by deprivation and housing instability where recorded.
Day-to-day delivery detail: Monthly multidisciplinary panels reviewed a short list of cases flagged by criteria such as repeated missed visits, frequent urgent contacts, non-healing wounds, medication non-adherence linked to understanding barriers, or recurrent safeguarding referrals. The panel’s purpose was operational: identify what in the pathway is failing (communication, access, escalation thresholds, coordination with primary care/VCSE) and agree specific actions (more frequent planned contact, joint visits, revised care plans, escalation to MDT, safeguarding strategy meetings). Actions were assigned owners and reviewed at the next panel.
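The flagging criteria that build the panel's short list can be expressed as simple rules over the record. The thresholds and field names below are assumptions for illustration; real criteria would be clinically agreed and reviewed.

```python
def flag_for_panel(case):
    """Return the reasons a case should go to the monthly
    risk-and-inequality panel (empty list = not flagged).
    'case' is a dict of counts drawn from the clinical record;
    thresholds here are illustrative, not clinically validated."""
    reasons = []
    if case.get("missed_visits_90d", 0) >= 3:
        reasons.append("repeated missed visits")
    if case.get("urgent_contacts_90d", 0) >= 4:
        reasons.append("frequent urgent contacts")
    if case.get("wound_non_healing_days", 0) > 42:
        reasons.append("non-healing wound")
    if case.get("safeguarding_referrals_12m", 0) >= 2:
        reasons.append("recurrent safeguarding referrals")
    return reasons
```

Returning the reasons, not just a yes/no flag, matters operationally: the panel's job is to ask what in the pathway produced each reason, so the triggers themselves must be visible.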
How effectiveness or change is evidenced: The service tracked repeat safeguarding concerns, urgent care contacts, and clinical deterioration events for flagged cohorts over time. It also audited whether agreed actions were completed and whether plans were understood, using record sampling and service-user feedback where feasible.
Commissioner expectation: Transparent inequality metrics linked to improvement actions
Commissioner expectation: Commissioners want more than dashboards; they want evidence of learning and change. They expect services to show which inequality signals they monitor (access, waiting times, engagement, outcomes), how data quality is improving, and what pathway changes have been implemented in response to identified disparities. They also expect governance: named leads, review cadence, documented decisions, and re-measurement to confirm whether changes worked.
In practice, commissioner scrutiny often focuses on whether variation is explained away (“hard-to-reach”) or addressed through specific operational changes, with clear audit trails.
Regulator / Inspector expectation: Data is used to ensure safe, effective, equitable care
Regulator / Inspector expectation (CQC): CQC expects providers to understand their population and to use information to improve outcomes and experiences, including for people who may be disadvantaged. Inspectors look for evidence that services can identify unmet need, respond to unequal outcomes, and manage risks linked to exclusion (missed contact, misunderstanding, delayed care, safeguarding exposure). They also look for assurance that action is taken and sustained, rather than isolated improvement projects.
Governance routines that make data-to-action defensible
To be credible, data must be governed like any other quality domain. Strong practice includes:
- A defined inequalities measurement set with operational owners (not “owned by analytics”).
- Regular performance huddles that focus on pathway steps, not only volume.
- Documented improvement cycles (problem, change, measure, review, embed).
- Data quality plans for protected characteristics, communication needs and access barriers.
- Integration of inequality insight into safeguarding governance, incident review and complaints learning.
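The documented improvement cycle in the list above (problem, change, measure, review, embed) can be sketched as a minimal audit-trail record. The field names are illustrative assumptions; the point is that each cycle carries a named owner, a baseline, and a re-measurement.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImprovementCycle:
    """One documented improvement cycle. Field names are illustrative,
    not a mandated schema."""
    problem: str                  # the data signal (e.g. longer waits for a cohort)
    change: str                   # the operational change tested
    owner: str                    # named operational lead, not "analytics"
    baseline: float               # metric value before the change
    remeasured: Optional[float] = None  # metric value after the change
    embedded: bool = False        # confirmed as business-as-usual

    def evidenced(self):
        """A cycle is defensible only once it has been re-measured."""
        return self.remeasured is not None
```

A register of records like this is one way of holding the signal-to-change-to-impact trail in a form that commissioners and inspectors can audit.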
Most importantly, services should be able to show the “line of sight” from data signal to operational change to measured impact. That auditability is what gives inequalities work long-term value for commissioners, tender teams and regulators.