What Commissioners Actually Want to See in Social Value Method Statements
Most adult social care providers understand that social value is now a mandatory and increasingly scrutinised part of modern tendering. Yet many still lose marks because their answers rely on broad promises rather than evidence of delivery. Practical guidance across the Social Value knowledge library and the related Social Value Measurement & Reporting guidance series points to the same conclusion: local authority and NHS commissioners usually score social value method statements more highly when they show clear local relevance, measurable outcomes, realistic delivery arrangements and visible governance, rather than general statements about being community-minded or values-led.
Why this still trips providers up
Social value is now familiar language in tenders, but familiarity has not necessarily improved quality. Many providers know they need to mention community benefit, sustainability, equality or local employment, yet their method statements still read as if social value sits outside the service model. Commissioners can usually spot that quickly. If the answer looks copied from a generic corporate template or if the examples could apply to any contract in any area, confidence drops.
In adult social care, commissioners are rarely looking for the broadest list of promises. They are usually looking for evidence that the provider understands the commissioning area, the people using services and the wider pressures affecting the local system. A strong answer shows how the provider’s service will create additional value that is relevant, measurable and deliverable. A weak answer often sounds positive but remains hard to verify and even harder to monitor after award.
What commissioners actually want to see
Specific local impact is one of the clearest things commissioners tend to value. They usually want to know how the service will benefit the area they are procuring for, not just how the provider behaves generally as an organisation. In practice, that may mean local recruitment, local spend, links with schools or colleges, partnerships with charities or community groups, and activity that strengthens the local support ecosystem around people using services. The more clearly the answer is rooted in the commissioning geography, the more credible it tends to feel.
Measurable outcomes are equally important. Claims such as reducing isolation, improving wellbeing or strengthening community connection sound positive, but commissioners usually want to know how those claims will be evidenced. That could include the number of volunteering placements supported, digital inclusion sessions delivered, carers engaged, training places created, local jobs filled or partnerships established and maintained. Good method statements explain both the activity and the metric.
Clear alignment with commissioner priorities is another frequent differentiator. Local authorities and NHS-linked commissioners often publish priorities through social value policies, local plans, strategy documents or wider needs assessments. Providers do not need to quote every framework at length, but they do usually benefit from showing that they have read the relevant priorities and shaped their commitments accordingly. This is often where social value answers start to feel tailored rather than generic.
Inclusive recruitment practices also tend to matter. Adult social care is a workforce-intensive sector, so commitments around apprenticeships, local recruitment, lived experience roles, disability inclusion, return-to-work opportunities or pathways for people facing barriers to employment can be particularly persuasive. These commitments usually score better when they are tied to actual recruitment practice, named delivery routes and realistic progression support.
Collaborations and co-production are often valued too, especially where they go beyond tokenistic references to partnership. Commissioners usually respond better when a provider explains which organisations or groups it works with, what those relationships achieve and how people with lived experience or community stakeholders influence delivery. In adult social care, this can be especially relevant where the service is expected to support inclusion, prevention or broader community wellbeing.
Operational example 1: local employment and skills in domiciliary care
A domiciliary care provider bidding for a borough-wide contract knew that workforce fragility was a major local issue. Instead of saying simply that it would create jobs, it shaped its social value answer around local employment, training and retention. The context included high vacancy pressure, travel challenges and the commissioner’s concern about continuity of care.
The support approach involved advertising locally, working with employment services and creating a progression route from induction into senior care roles. Day to day, managers tracked how many recruits came from the borough, how many completed induction and how many stayed beyond probation. The provider also monitored whether staff moved into enhanced roles through supervision and development review.
Effectiveness was evidenced through stronger retention, fewer rota disruptions and a higher proportion of staff recruited from the local area. This scored well because the provider showed not just a social value promise, but a local workforce benefit that also supported safer contract delivery.
Operational example 2: measurable community impact in supported living
A supported living provider wanted to evidence social value around reducing isolation and improving participation. The context involved adults with learning disabilities who were at risk of limited community access unless support was intentionally designed around inclusion and local connection.
The support approach included partnerships with local groups, support into volunteering and structured goal-setting around community participation. Day to day, support workers recorded participation goals in support plans, managers reviewed progress during service reviews and leadership examined outcomes data alongside service-user feedback. The provider avoided vague language such as “we help people engage with the community” and instead described how many people would be supported into specific activity and how that would be reviewed.
Effectiveness was evidenced through increased volunteering participation, stronger feedback from people using services and more consistent records of meaningful local engagement. The answer was stronger because it combined narrative, metrics and a clear delivery model.
Operational example 3: partnership and environmental evidence in a residential service
A residential and outreach provider knew that national charity fundraising examples would not be enough to impress a local commissioner. The context included a procurement where sustainability, community engagement and local relevance were all important, but the provider also needed to show proportionality and operational realism.
The support approach focused on local partnerships and practical environmental measures. The provider described how it worked with nearby organisations to expand activity opportunities and how it reduced unnecessary travel through better outreach planning. Day to day, managers tracked the number of active local partnerships, reviewed participation opportunities created for residents and monitored mileage reductions linked to route changes. Governance meetings included social value review so that underperformance could be identified and addressed.
Effectiveness was evidenced through improved community activity opportunities, lower travel use in some parts of the service and stronger local partnership records. This made the answer much more persuasive than a generic statement about “supporting the environment and giving back to the community”.
What tends to get ignored or score poorly
Generic statements are one of the biggest weaknesses in social value method statements. Phrases such as "committed to the community" or "passionate about making a difference" sound positive but do not usually score well unless they are backed by examples, named activity and clear local relevance. Commissioners generally want to know how, where and with whom the benefit will be delivered.
Uncosted promises also tend to weaken answers. If a provider says it will deliver additional initiatives at no cost but does not explain how those activities will be resourced, staffed or governed, the answer can look superficial. In adult social care, where margins and workforce pressure are real issues, overpromising can reduce credibility rather than strengthen it.
Unrelated charity donations may be admirable but often score poorly if they do not clearly benefit the commissioning area or connect to the service being procured. Commissioners are usually interested in social value that supports their locality and their population, not just a provider’s national charitable profile.
One-off activities also tend to be less persuasive. A single event or isolated activity is unlikely to count as strong social value unless it forms part of a wider, ongoing approach. Commissioners usually want to see sustainability and repeatability rather than one-off gestures.
How to write the method statement more effectively
A strong method statement is usually structured, selective and evidence led. Headings such as Economic, Social and Environmental can help if they match the commissioner’s framework, but the content beneath them needs to stay specific. Providers often score better when they include baseline information where available, describe projected outcomes carefully and explain how delivery will be monitored and reported.
It also helps to ask a practical question while drafting: would this example convince a commissioner that the provider understands this area and its priorities? If the answer is no, the example probably needs refining or replacing. Local relevance nearly always strengthens social value answers in adult social care because the service itself is delivered in a particular place, with particular pressures and community relationships.
Commissioner expectation: local, measurable and proportionate delivery
Commissioners are likely to expect social value method statements to show clear local impact, measurable outcomes, realistic delivery arrangements and good alignment with local priorities. In adult social care, stronger answers usually explain what will be delivered, why it matters in that area, how it will be resourced and how progress will be reported over the life of the contract.
Regulator / Inspector expectation: wider benefit should sit within a well-led service
Although social value is assessed primarily through procurement and contract management, the commitments still need to sit within a safe and well-led service. If a provider makes attractive promises that are weakly governed or disconnected from operational reality, those commitments are less credible. Strong social value usually comes from services where leadership, reporting and quality oversight are already disciplined.
Why this matters in competitive tenders
In adult social care procurement, social value can influence the result in a meaningful way, especially where technical scores are close. Providers that understand what commissioners are actually looking for can use the method statement to reinforce wider strengths in governance, workforce planning, community partnership and local credibility. Providers that treat it as a generic compliance section often lose marks that were available to them.
The strongest social value method statements are rarely the most dramatic. They are usually the clearest, the most locally grounded and the easiest to believe. That is what commissioners tend to reward, and that is why evidence, structure and relevance matter so much.