Measuring and Evidencing Outcomes in SME and VCSE Partnerships for Commissioners
SME and VCSE partnerships can add clear value in adult social care: wider community reach, specialist capability, and practical support that complements regulated provision. However, commissioners increasingly expect providers to evidence outcomes rather than describe activity. Statements such as “we run sessions” or “we signpost people” are rarely sufficient; the provider must show what changed, for whom, and how the change is sustained and governed.
This article supports the SME, VCSE & Social Enterprise Engagement approach and aligns with social value expectations that benefits are demonstrable, auditable and linked to delivery reality.
Why outcome evidence is harder in partnership models
Partnership outcomes often sit across organisational boundaries. The partner may deliver activity, but the provider holds the care plan, the regulated environment and the safeguarding framework. Without agreed measurement methods, outcomes become anecdotal or inconsistent, which creates risk in commissioning review and inspection contexts.
Typical challenges include:
- Partners collecting data in different formats or not at all
- Outcomes defined too broadly to be meaningful
- Activity measures mistaken for impact
- Weak links between partnership delivery and care planning
Defining outcomes that make sense to commissioners and services
Outcome definition should start with what commissioners and operational leaders actually need to know. Outcomes must be specific enough to evidence change but practical enough for frontline teams and partners to record consistently.
In adult social care partnership contexts, outcomes often fall into three categories:
- Individual outcomes: changes for a person (skills, wellbeing, safety, independence)
- Service outcomes: changes for the service (reduced incidents, improved engagement, fewer breakdowns)
- System outcomes: changes across pathways (reduced escalation, improved access, better transitions)
Providers should avoid relying on a single metric. A small set of measures, linked to delivery reality, is usually stronger than complex dashboards that partners cannot maintain.
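To make these categories concrete, the sketch below shows one way a small indicator set could be structured so that individual, service and system measures share a common, recordable format. The field names and example indicators are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from enum import Enum

class OutcomeLevel(Enum):
    INDIVIDUAL = "individual"  # changes for a person
    SERVICE = "service"        # changes for the service
    SYSTEM = "system"          # changes across pathways

@dataclass
class OutcomeIndicator:
    """One indicator in a deliberately small, maintainable set."""
    level: OutcomeLevel
    name: str             # what is being measured
    evidence_source: str  # a record partners can realistically keep
    review_cycle: str     # e.g. "start, 6-12 weeks, quarterly"

# Illustrative set: a few measures tied to delivery, not a dashboard.
indicators = [
    OutcomeIndicator(OutcomeLevel.INDIVIDUAL, "independence steps achieved",
                     "keywork session notes", "start, 6 weeks, quarterly"),
    OutcomeIndicator(OutcomeLevel.SERVICE, "anxiety-related incidents",
                     "incident log", "monthly trend review"),
    OutcomeIndicator(OutcomeLevel.SYSTEM, "transitions without escalation",
                     "pathway review notes", "quarterly"),
]
```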
Building a proportionate outcome framework
Effective outcome frameworks typically include:
- A clear baseline and review cycle (e.g. at start, 6–12 weeks, quarterly)
- Simple outcome indicators tied to the purpose of the partnership
- Qualitative evidence (case examples) that shows the mechanism of change
- Clear governance: who checks quality, who reviews themes, who acts on learning
Where partnership delivery intersects with care planning, outcome evidence should be visible in review notes, risk assessments and support plans. This is often what inspectors look for: whether the work sits within the service’s governance framework, not in a separate “partner report” that frontline staff never use.
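As a minimal sketch of the baseline-and-review pattern, the snippet below compares each review point against the baseline so that change, or the absence of it, is visible at every cycle. The 1–5 engagement score and the review points are assumptions for illustration; the real scale and cycle would be agreed with commissioners and partners.

```python
# Minimal sketch: comparing review scores against a baseline.
# The 1-5 scale and review points are illustrative assumptions,
# not a prescribed measurement tool.

def change_from_baseline(reviews: dict[str, int]) -> dict[str, int]:
    """Return the change at each review point relative to 'baseline'."""
    baseline = reviews["baseline"]
    return {point: score - baseline
            for point, score in reviews.items()
            if point != "baseline"}

# One person's engagement score at the agreed review points.
engagement = {"baseline": 2, "week_6": 3, "week_12": 4, "quarter_2": 4}

print(change_from_baseline(engagement))
# {'week_6': 1, 'week_12': 2, 'quarter_2': 2}
```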
Operational example 1: VCSE employment support partnership
Context: A VCSE delivered employment readiness support to adults with learning disabilities supported in regulated services.
Support approach: Outcomes were defined as measurable steps: attendance, skills gained, work trials initiated, sustained engagement, and reduction in anxiety-related incidents linked to routine.
Day-to-day delivery detail: Staff supported people to attend sessions, practised travel training, and reinforced employability routines in the home. The VCSE provided short progress updates aligned to the provider’s review cycle.
How effectiveness or change is evidenced: Evidence included review documentation linking engagement to wellbeing improvements, reduced incident logs during weekdays, and case examples showing specific progress and barriers addressed.
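As an illustration of how the weekday pattern might be drawn from an incident log, the hypothetical sketch below groups logged incidents into weekday and weekend counts; the dates are invented, and any real analysis would use the service's own incident records.

```python
from collections import Counter
from datetime import date

# Invented incident dates; only the day of the week matters here.
incident_dates = [date(2024, 5, 6), date(2024, 5, 11),
                  date(2024, 5, 12), date(2024, 5, 20)]

# date.weekday() returns 0-4 for Monday-Friday, 5-6 for the weekend.
by_period = Counter("weekday" if d.weekday() < 5 else "weekend"
                    for d in incident_dates)
print(by_period)  # Counter({'weekday': 2, 'weekend': 2})
```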
Operational example 2: SME specialist training partner and incident reduction
Context: An SME delivered specialist training and coaching for staff supporting people with behaviours of concern.
Support approach: Outcomes were defined at service level: competency sign-off rates, reduction in specific incident types, and improved quality of de-escalation recording.
Day-to-day delivery detail: Coaching observations were scheduled across shifts. Managers embedded learning into handovers and supervision, using reflective discussion and practice feedback.
How effectiveness or change is evidenced: Evidence included incident trend data, improved staff confidence metrics, and audit samples showing stronger recording of triggers, early intervention and post-incident learning.
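A hypothetical sketch of the trend comparison behind this kind of evidence: average monthly counts of one incident type before and after the coaching window. The figures are invented for illustration, and in practice trend data would be read alongside context such as occupancy or acuity changes.

```python
from statistics import mean

# Invented monthly counts of one incident type, four months either
# side of the coaching window; not figures from the example above.
before = [9, 11, 8, 10]
after = [7, 6, 5, 6]

reduction_pct = (mean(before) - mean(after)) / mean(before) * 100
print(f"Average monthly incidents: {mean(before):.1f} -> {mean(after):.1f} "
      f"({reduction_pct:.0f}% reduction)")
# Small samples: a trend like this supports, but does not prove, impact.
```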
Operational example 3: VCSE inclusion partnership and safeguarding confidence
Context: A VCSE delivered community inclusion sessions where participants sometimes disclosed exploitation risk.
Support approach: Outcomes included increased engagement and community participation alongside safety indicators: timely escalation, improved boundary awareness, and reduced repeat safeguarding concerns.
Day-to-day delivery detail: Facilitators completed brief session records and used an agreed escalation route for safeguarding indicators. Provider staff reinforced safety learning in keywork sessions and reviewed risks in care planning.
How effectiveness or change is evidenced: Evidence included documented safeguarding escalations with clear timescales, updated risk assessments, and quarterly reviews showing how learning was embedded across partner delivery.
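To show how "timely escalation" could be evidenced as a safety indicator, the sketch below checks each escalation against an agreed timescale. The 24-hour window and the record fields are assumptions; the real threshold would come from the partnership's agreed safeguarding route.

```python
from datetime import datetime, timedelta

# Assumed agreed timescale; the real figure would come from the
# partnership's safeguarding escalation protocol.
AGREED_TIMESCALE = timedelta(hours=24)

def escalated_on_time(disclosed: datetime, escalated: datetime) -> bool:
    """True if the concern reached the provider within the agreed window."""
    return escalated - disclosed <= AGREED_TIMESCALE

# Illustrative records: (session disclosure time, provider notification time).
records = [
    (datetime(2024, 3, 4, 14, 0), datetime(2024, 3, 4, 16, 30)),
    (datetime(2024, 3, 11, 14, 0), datetime(2024, 3, 13, 9, 0)),
]

on_time = sum(escalated_on_time(d, e) for d, e in records)
print(f"{on_time}/{len(records)} escalations within agreed timescale")
```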
Commissioner expectation
Providers must evidence that partnership delivery achieves measurable outcomes aligned to commissioned priorities, with clear baselines, review cycles and credible reporting that supports assurance.
Regulator / Inspector expectation (e.g. CQC)
Providers must demonstrate that partnership activity is effectively governed, contributes to person-centred outcomes, and is integrated into risk management, safeguarding and quality assurance systems.