Using Lived Experience Safely in Mental Health Quality Improvement
Gathering lived experience feedback is now standard practice, but commissioners and inspectors increasingly expect providers to show how that feedback is translated into structured quality improvement. That means moving beyond surveys and forums to a clear improvement cycle: insight, decision, action, review and evidence of impact. This article explains how providers can use lived experience safely and auditably within co-production arrangements, while remaining aligned to service models and pathways and the realities of risk, capacity and clinical accountability.
Why lived experience must connect to QI governance
If lived experience insights sit outside governance, they are easy to ignore, lose, or treat as “soft intelligence”. Strong providers treat lived experience as a formal quality input alongside incidents, complaints, audits and performance data. This means:
- categorising feedback into themes linked to quality domains
- identifying whether issues relate to pathway design, staff practice or resources
- using formal decision-making routes to agree actions
- tracking implementation and impact
Operational example 1: Turning repeated feedback into measurable service change
Context: People using a supported accommodation mental health service reported that staff responses felt inconsistent and that some staff were “too strict” while others were “too relaxed”. Complaints and safeguarding alerts showed similar themes: boundaries, communication and escalation.
Support approach: The provider triangulated lived experience feedback with incident themes and supervision audits. A QI project was created focused on consistency of relational practice, boundaries and least restrictive approaches.
Day-to-day delivery: The service introduced a “practice expectations” framework co-designed with lived experience contributors, describing what good looks like (how staff explain decisions, how choices are offered, how de-escalation is attempted). This framework was embedded into shift handovers, supervision templates and spot checks. Senior staff completed fortnightly observations against the framework and fed back to teams in reflective sessions.
Evidence of effectiveness: The provider tracked a reduction in complaints about staff behaviour, fewer incidents escalating to police involvement, and improved experience feedback about being treated with dignity and respect. Evidence was available through the QI action log, observation records and outcome data.
Commissioner expectation: QI is systematic and evidenced
Commissioners expect providers to evidence systematic improvement, not just activity. Where lived experience insights identify problems (e.g., poor continuity, unsafe practice, unclear access), commissioners look for clear action plans, accountable owners, and measurable improvement over time.
Regulator expectation: learning is embedded into day-to-day practice
Inspectors from the Care Quality Commission (CQC) look for how learning is embedded, not just recorded. They commonly test whether changes have translated into consistent staff behaviour, safer practice, and improved experience for people supported.
Using lived experience safely: trauma, consent and safeguarding
Lived experience involvement must itself be safeguarded. Providers should avoid re-traumatisation and ensure that participation does not create new risks for contributors. Practical safeguards include:
- clear consent for any use of real experiences, anonymised where needed
- emotional support arrangements and debriefing
- boundary setting for roles (not replacing clinical teams)
- clear routes to raise concerns or complaints about involvement processes
Operational example 2: Co-producing post-incident learning and debrief
Context: Following crisis incidents, debriefs were staff-only and focused on procedural compliance. People supported reported feeling blamed, and patterns of recurrence were not reducing.
Support approach: The provider introduced a co-produced debrief model that included the person (when appropriate) and focused on triggers, communication and what could have been done differently earlier.
Day-to-day delivery: Debriefs were held within 72 hours, supported by a simple structure: what happened, what helped, what made it worse, what should we do next time. Where the person chose not to attend, their views were captured through an advocate or follow-up conversation. Learning points were recorded as themes and reviewed monthly under governance.
Evidence of effectiveness: The provider recorded a reduction in repeated crises for a cohort of individuals, fewer restrictive interventions, and improved reported feelings of being listened to. The debrief model also strengthened safeguarding assurance by showing reflective learning.
Restrictive practice and least restrictive care
Where restrictive practice occurs, lived experience insight is particularly valuable because it exposes how interventions are perceived, what triggers escalation, and what alternatives are credible. However, governance must ensure that lived experience involvement does not compromise confidentiality or create a blame culture. Good practice focuses on themes and system learning rather than individual fault.
Operational example 3: Improving medication support and consent conversations
Context: Feedback indicated people felt medication was “done to them” rather than discussed with them. Some individuals reported inconsistent consent conversations and unclear explanations about side effects.
Support approach: A QI project was formed with lived experience contributors, nurses and care coordinators to improve consent and shared decision-making in medication support.
Day-to-day delivery: The service introduced a standardised consent conversation prompt (capacity considered, options explained, questions invited, side effects discussed, and a “what matters to you” check). Staff were trained to document the conversation clearly and to refer concerns to prescribing clinicians. Spot checks reviewed a sample of records monthly and included a lived experience reviewer to assess whether language was respectful and understandable.
Evidence of effectiveness: The provider recorded improved feedback about involvement in decisions, reduced complaints about medication support, and stronger documentation for inspection. This also strengthened risk management by ensuring concerns were escalated appropriately.
Assurance: how providers show impact
The strongest assurance combines qualitative and quantitative evidence. Providers can demonstrate impact through:
- QI action logs showing changes linked to lived experience insight
- complaints and compliments trends
- incident and restrictive practice trends
- staff supervision and observation evidence
- patient-reported outcome and experience measures (PROMs/PREMs) tracked over time
When lived experience is treated as a core quality input, it strengthens commissioning confidence and inspection readiness because it shows services are learning, adapting and improving in response to real-world experience.