Co-Produced Quality Assurance in Mental Health Services: Turning Lived Experience into Governance and Improvement
Many services describe co-production in broad terms, but it becomes most convincing when it is embedded into quality assurance and governance: the “how we know” of service delivery. When lived experience shapes audits, incident learning and feedback loops, services are more likely to spot the real drivers of dissatisfaction, escalation and disengagement. This article focuses on practical mechanisms that registered managers and operational leads can implement, grounded in the Co-Production, Lived Experience & Personalisation topic area and aligned to the core delivery infrastructure covered in Service Models & Care Pathways.
Why co-produced quality assurance is different from “engagement”
Co-produced quality assurance is not a consultation exercise and it is not the same as routine satisfaction surveys. It is a structured way of testing whether the service is delivering what it says it delivers, through the lens of the people most affected by the system: people receiving support and those close to them. In mental health services, quality problems often show up as relational drift: staff become rushed, boundaries become rigid, plans become generic, and restrictive practice becomes “the default” under pressure. These are not always visible in compliance audits, but they are visible in lived experience.
When co-production is built into governance, it strengthens risk management rather than weakening it. It helps services identify early signals (what triggers escalation, what creates withdrawal, what causes people to stop engaging) and convert those signals into practical improvement actions that are tracked and reviewed.
What to build: a co-produced assurance framework that is safe and usable
A practical co-produced quality assurance framework usually includes:
- A defined panel or reference group with clear membership, boundaries, confidentiality and support arrangements.
- Accessible feedback routes (not only digital) and a method for capturing themes, not just individual comments.
- Co-produced audit tools that test “experience and practice,” not just documentation.
- Learning mechanisms connecting incidents, complaints and feedback to actions, training updates and supervision themes.
- Visible action tracking so people can see what changed as a result of what they said.
Critically, this needs safeguards: contributors should not be exposed to distressing detail without support, and staff should not feel attacked. The goal is shared learning and improvement, under governance control.
Operational example 1: co-produced audits that test everyday experience
Context
A supported accommodation service has good compliance audit results, but commissioners raise concerns about complaints and inconsistent engagement. People report that staff knock and enter without waiting, use overly clinical language, and make decisions “in the office.” The service wants an audit method that captures lived experience reliably.
Support approach
The provider develops a co-produced “everyday experience audit” with lived experience partners. The audit tool focuses on routine moments that define respect and autonomy: privacy, choice, communication, involvement and repair after conflict.
Day-to-day delivery detail
- The audit is completed monthly on a rolling basis across shifts (including evenings and weekends), led by a manager, with a lived experience partner involved in tool design and review.
- Auditors use observation prompts (e.g., knock-and-wait practice, tone used when setting boundaries, whether choices are offered) and short informal check-ins with people using the service.
- Findings are summarised into themes, with “what good looks like” examples included to reinforce positive practice.
- Action plans are created with named owners and timescales, then reviewed at governance meetings.
How effectiveness is evidenced
Effectiveness is evidenced through repeat audit scores, reduction in recurring themes (e.g., privacy breaches), and improvement in qualitative feedback. The provider keeps a clear audit trail: tool version control, monthly summaries, action logs and evidence of completion (updated guidance, supervision notes, practice observations).
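To show what tracking a “reduction in recurring themes” can look like in practice, here is a minimal Python sketch, assuming audit findings are logged as simple (month, theme) records; the months, theme names and data shape are illustrative assumptions, not a prescribed tool.

```python
from collections import Counter, defaultdict

# Each finding from a monthly everyday experience audit, recorded as a
# (month, theme) pair. Months and themes here are invented examples.
findings = [
    ("2024-01", "privacy"), ("2024-01", "privacy"), ("2024-01", "choice"),
    ("2024-02", "privacy"), ("2024-02", "communication"),
    ("2024-03", "communication"),
]

# Tally findings per theme for each month.
by_month = defaultdict(Counter)
for month, theme in findings:
    by_month[month][theme] += 1

# Flag any theme that appears in consecutive months: a recurring theme
# is a candidate for a targeted action plan, not a one-off reminder.
months = sorted(by_month)
for prev, curr in zip(months, months[1:]):
    for theme in sorted(set(by_month[prev]) & set(by_month[curr])):
        print(f"{curr}: '{theme}' recurred "
              f"(was {by_month[prev][theme]}, now {by_month[curr][theme]})")
```

The value is not the script but the discipline it encodes: a theme that persists month on month warrants a targeted action plan rather than another generic reminder.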
Operational example 2: co-produced complaints learning that improves trust and reduces repeat issues
Context
The service receives complaints about staff attitude, communication and inconsistent rules. Responses are polite and timely, but complainants feel nothing changes. Staff feel blamed and become defensive, making learning harder.
Support approach
The provider introduces a co-produced complaints learning process: a small panel reviews anonymised complaints themes quarterly and tests whether learning actions address the real issue. The panel does not investigate individuals — it focuses on system learning.
Day-to-day delivery detail
- Complaints are categorised into experience themes (respect, involvement, privacy, communication, boundaries, safety) as well as formal categories.
- Each quarter, managers present the top three themes with examples of what happened and what was done in response (a short counting sketch follows this list).
- The panel challenges whether responses address root causes (e.g., inconsistent staff messages due to weak handovers; rigid rules due to unclear individual plans).
- Actions are converted into practical changes: updated handover templates, clearer “how we support choice safely” guidance, supervision focus on tone and boundaries, and spot-check observations.
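A minimal sketch of the “top three themes” step, assuming complaints are tagged with experience themes at the point of logging; the identifiers, tags and counts below are invented for illustration.

```python
from collections import Counter

# Anonymised complaints for one quarter, each tagged at logging with one
# or more experience themes. IDs and tags are illustrative only.
quarter_complaints = [
    {"id": "C-101", "themes": ["communication", "respect"]},
    {"id": "C-102", "themes": ["boundaries"]},
    {"id": "C-103", "themes": ["communication"]},
    {"id": "C-104", "themes": ["involvement", "communication"]},
    {"id": "C-105", "themes": ["respect"]},
]

# Count how often each theme appears across the quarter.
theme_counts = Counter(
    theme for complaint in quarter_complaints for theme in complaint["themes"]
)

# The three most frequent themes set the agenda for the quarterly panel.
for theme, count in theme_counts.most_common(3):
    print(f"{theme}: {count} complaint(s)")
```

Keeping the tally anonymised and theme-level reinforces the point above: the panel reviews system patterns, never individuals.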
How effectiveness is evidenced
Evidence includes reductions in repeat complaints about the same theme, improved satisfaction with complaint handling (measured via a short follow-up question set), and governance minutes showing actions, owners and completion. Where learning relates to safeguarding or restrictive practice, the provider evidences that the response strengthened risk management rather than relaxing it.
Operational example 3: co-produced incident learning that reduces escalation and restrictive practice
Context
Incident reviews focus on chronology and staff actions, but miss the person’s experience: what felt threatening, what increased distress, and what could have helped earlier. As a result, action plans become generic (“remind staff of policy”) and do not change practice.
Support approach
The provider redesigns incident learning to include a lived experience-informed review template. The template requires analysis of early warning signs, relational triggers, choice points, and opportunities to reduce restriction.
Day-to-day delivery detail
- After significant incidents, managers complete a structured review including “experience factors” (what the person may have perceived) alongside clinical/risk factors.
- The service uses post-incident debriefs that offer people a supported way to say what helped and what didn’t, with options for written, verbal or advocate-supported feedback.
- Learning actions are specific and observable: a change to crisis response scripts, a new handover prompt about triggers, a practice refresher on de-escalation micro-skills.
- Managers complete follow-up observations to confirm whether changes are happening on shifts, not just recorded.
How effectiveness is evidenced
Evidence includes incident trend analysis (frequency, duration, severity), reduced use of restrictive interventions, improved documentation of de-escalation attempts, and direct feedback indicating improved relational safety. The provider also maintains an action log linking incident themes to training updates and supervision topics, demonstrating a closed learning loop.
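As a hedged illustration of the trend analysis described above, the sketch below aggregates a simplified incident log by month, reporting frequency alongside the count involving a restrictive intervention; the field names and values are assumptions, not a required recording format.

```python
from collections import defaultdict

# Simplified incident log entries. Fields are illustrative; a real log
# would also carry severity, duration and de-escalation notes.
incidents = [
    {"month": "2024-01", "restrictive": True},
    {"month": "2024-01", "restrictive": False},
    {"month": "2024-02", "restrictive": False},
    {"month": "2024-03", "restrictive": True},
    {"month": "2024-03", "restrictive": False},
]

# Aggregate per month: total incidents and those involving restriction,
# so both the overall trend and restrictive-practice use stay visible.
totals = defaultdict(lambda: {"all": 0, "restrictive": 0})
for incident in incidents:
    totals[incident["month"]]["all"] += 1
    if incident["restrictive"]:
        totals[incident["month"]]["restrictive"] += 1

for month in sorted(totals):
    t = totals[month]
    print(f"{month}: {t['all']} incident(s), "
          f"{t['restrictive']} with restrictive intervention")
```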
Commissioner expectation
Commissioners expect providers to evidence quality assurance arrangements that are robust, routine and outcome-focused, not only compliance-based. Where co-production is claimed, commissioners will often look for clear structures for capturing lived experience, evidence of action taken, and proof that changes improved outcomes (engagement, stability, reduced escalation, fewer complaints, reduced avoidable admissions).
Co-produced assurance meets this expectation when the provider can show governance discipline: defined forums, repeat cycles, themed reporting, action tracking, and measurable improvement over time.
Regulator / Inspector expectation (e.g., CQC)
Inspectors will test whether the service listens, learns and improves, and whether people are involved in decisions about their support. They will look for evidence that feedback and complaints lead to real change, that incidents result in learning (not blame), and that governance provides oversight. Co-produced quality assurance supports this when it is documented, repeatable and reflected in what people and staff say during interviews and observations.
Governance controls and safeguards to keep co-produced assurance credible
To keep this safe and defensible, providers typically build the following controls:
- Confidentiality and data handling (anonymisation processes, clear boundaries, secure storage, role-based access).
- Contributor wellbeing (support, debrief options, choice about what material is reviewed, clear escalation routes if distress arises).
- Separation from HR/performance (assurance focuses on systems and themes, with separate routes for individual conduct issues where necessary).
- Action tracking discipline (owners, timescales, evidence of completion, impact review dates); a minimal sketch follows this list.
- Alignment with safeguarding so that learning strengthens risk management, rights-based practice and proportionality.
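To make the action tracking discipline concrete, here is a minimal sketch of an action log that surfaces overdue items by exception; the fields mirror the bullet above (owner, timescale, evidence of completion, impact review date), and every name and date is an illustrative assumption.

```python
from datetime import date

# One row per improvement action. Fields mirror the discipline above:
# owner, timescale (due), evidence of completion, impact review date.
actions = [
    {"action": "Update handover template", "owner": "Team Lead",
     "due": date(2024, 3, 1), "completed": date(2024, 2, 20),
     "impact_review": date(2024, 6, 1)},
    {"action": "Knock-and-wait practice refresher", "owner": "Registered Manager",
     "due": date(2024, 3, 15), "completed": None,
     "impact_review": date(2024, 7, 1)},
]

today = date(2024, 4, 1)  # fixed date so the example is reproducible

# Report by exception: anything past its timescale with no evidence of
# completion goes to the next governance meeting.
for item in actions:
    if item["completed"] is None and item["due"] < today:
        print(f"OVERDUE: {item['action']} "
              f"(owner: {item['owner']}, due {item['due']})")
```

The same by-exception principle works in a spreadsheet or quality system; what matters is that every action carries an owner, a timescale, evidence of completion and a later impact check.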
When these controls are in place, co-produced quality assurance becomes a practical improvement engine — and a strong source of inspection-ready evidence that quality is understood, monitored and improved in day-to-day practice.