Dementia Quality Assurance in Practice: Audits, Learning Cycles and Evidence for CQC and Commissioners
Many dementia services can describe their values, but struggle to evidence quality in a way that stands up to tender evaluation or inspection. The gap is rarely about intent; it is about whether the service has a repeatable quality assurance cycle that converts daily practice into reliable evidence: audits that measure the right things, learning processes that change behaviour, and governance that closes actions rather than creating more meetings.
This article forms part of Dementia – Quality, Safety & Governance and links to Dementia – Service Models & Care Pathways, because quality assurance must reflect the pathway you run (memory assessment follow-on, home care, supported living, care home, step-down, or integrated community models) and the risks those models create.
Quality assurance in dementia: the questions you must be able to answer
Before choosing audits, be clear about the core questions commissioners and inspectors will test:
- Do staff understand the person’s needs and deliver the plan consistently?
- Are risks identified early and escalated appropriately (health deterioration, safeguarding, self-neglect, distress)?
- Are decisions lawful and recorded (capacity, consent, best interests, restrictive practice)?
- Do learning processes change practice (and can you prove it)?
In dementia services, “quality” is often revealed in small operational moments: how staff respond to refusal, how they document fluctuating capacity, how they de-escalate distress, and whether families feel heard and informed.
Commissioner expectation: measurable assurance, not narrative
Providers should be able to show objective assurance that quality is maintained and improving. In tender and contract management contexts, commissioners typically expect to see:
- A documented QA framework (audit plan, frequency, sampling method, reporting route).
- KPIs that reflect dementia reality (not just generic metrics).
- Evidence of closed-loop improvement (issue → action → re-audit → sustained change).
- Clear escalation and safeguarding pathways with recorded outcomes.
Regulator / CQC expectation: triangulated evidence and learning in action
Services should be able to demonstrate to CQC inspectors that oversight is effective and person-centred, using triangulated evidence. Inspectors often look for:
- Consistency between care plans, daily notes and observed practice.
- Staff competency and confidence in dementia-specific approaches.
- Clear reasoning for restrictions and evidence of least restrictive practice.
- Responsive handling of incidents, complaints and safeguarding concerns.
Crucially, inspectors may ask “what changed as a result?” If the answer is vague (e.g., “we reminded staff”), governance will look weak. If you can show tracked actions, re-audit results and improved outcomes, governance looks credible.
Designing dementia-focused audits that actually matter
Many services audit what is easiest to measure rather than what drives risk and outcomes. Dementia-focused audits should include, at minimum:
- Care planning quality: individualised routines, communication approach, life story use, distress triggers, meaningful activity, hydration/nutrition support.
- MCA / consent documentation: decision-specific capacity notes, best interests records where needed, family involvement and advocacy routes.
- Restrictive practice: restriction register, rationale, review dates, and evidence of least restrictive alternatives tried.
- Medicines: MAR accuracy, PRN rationale and response, refusal pathway documentation, high-risk medicine checks.
- Safeguarding: recognition, reporting, outcomes, and learning themes (including Making Safeguarding Personal).
Where possible, audits should be short and frequent (small samples, taken often) rather than large and occasional. Frequent sampling catches drift early and makes learning faster.
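For services that hold audit results in a spreadsheet or simple script, the short-and-frequent approach can be sketched as follows. This is a minimal illustration only: the field names, the weekly sample of ten notes, and the 85% threshold are assumptions to be replaced with your own audit standards.

```python
# Illustrative sketch: weekly small-sample audits with a drift flag.
# The 85% threshold and sample structure are assumptions, not a standard.

def pass_rate(sample):
    """Share of audited records that met the standard (sample: list of bool)."""
    return sum(sample) / len(sample)

def flag_drift(weekly_samples, threshold=0.85):
    """Return the weeks whose pass rate fell below the agreed standard,
    so problems surface within days rather than at a quarterly audit."""
    return [week for week, sample in weekly_samples.items()
            if pass_rate(sample) < threshold]

# Example: four weekly samples of 10 daily notes each (True = note evidences
# meaningful engagement, not just personal care tasks).
weekly = {
    "wk1": [True] * 9 + [False],       # 90% - meets standard
    "wk2": [True] * 7 + [False] * 3,   # 70% - drift, act now
    "wk3": [True] * 10,                # 100%
    "wk4": [True] * 8 + [False] * 2,   # 80% - drift
}
print(flag_drift(weekly))  # ['wk2', 'wk4']
```

The point is not the tooling but the cadence: a ten-note sample every week generates a trend line and an early warning, which a large annual audit cannot.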
Operational Example 1: Turning complaints into measurable service improvement
Context: A care home received repeated family complaints that their relative “was left in their room” and staff “didn’t know what calmed them.” Staff felt the complaints were subjective and hard to evidence.
Support approach: The manager created a focused audit and improvement cycle around meaningful activity and engagement, linked to individual routines and life story information.
Day-to-day delivery detail:
- Audit sample of 10 daily notes per week: did notes reflect meaningful engagement, not just personal care tasks?
- Care plan refresh: each person had a “what matters today” section (preferred topics, sensory supports, routines, known triggers).
- Shift brief included “engagement intention” for each person (who will do what, when, and how).
- Family feedback captured after changes, with documented responses and actions.
How effectiveness is evidenced: Engagement documentation increased, complaints reduced, and relatives reported improved responsiveness. Re-audit showed higher completion of personalised routine prompts and clearer staff notes linking activity to mood.
Operational Example 2: Audit + supervision to fix inconsistent capacity recording
Context: A domiciliary care service supporting people with dementia had inconsistent recording of capacity and consent, especially around medication prompts, finances, and accepting care. This created safeguarding and legal risk.
Support approach: The provider combined targeted audits with supervision and competency checks, focusing on decision-specific capacity and escalation routes.
Day-to-day delivery detail:
- Weekly sample audit of notes where refusals occurred: was decision-specific capacity considered and recorded?
- Supervision used real case scenarios: staff practised documenting capacity concerns and escalation steps.
- Best interests pathway clarified: when to involve family, when to request professional input, when to escalate safeguarding.
- Spot checks during visits: senior staff observed and coached how staff gained consent and responded to refusal.
How effectiveness is evidenced: Audit results showed improved documentation quality and earlier escalation when risks emerged. Safeguarding alerts reduced for “avoidable drift” issues (e.g., unmanaged self-neglect), and case reviews demonstrated clearer, lawful decision-making.
Operational Example 3: Incident learning that reduces repeated distress episodes
Context: A supported living service saw repeated incidents of distress and aggression during personal care. Incident forms were completed, but patterns were not acted on, and staff confidence fell.
Support approach: The service implemented a short-cycle incident learning process focused on triggers, de-escalation, and consistent practice.
Day-to-day delivery detail:
- Incident review within 72 hours with the key worker and shift lead: identify triggers, early warning signs, and effective approaches.
- Care plan updated immediately (not at next monthly review) with a “what works” de-escalation script.
- Practice huddles: staff rehearsed the approach and agreed consistent language and pacing during care.
- Follow-up observation: senior staff observed one care interaction to check the revised approach was being used.
How effectiveness is evidenced: Distress incidents reduced and became shorter; staff recorded more preventive actions; and re-audit showed updated plans were being followed. The service could demonstrate a clear line from incident learning to changed practice and improved outcomes.
Choosing dementia KPIs that are defensible
For dementia services, KPIs should capture both safety and lived experience. Consider a balanced set such as:
- Falls (rate, repeat falls, time/location themes, post-fall review completion).
- Medication incidents (type, severity, repeat themes, corrective actions completed).
- Safeguarding (alerts, outcomes, time to report, learning themes, action completion).
- Distress incidents (frequency, duration, triggers, restrictive practice involvement).
- Quality of life indicators (engagement recorded, meaningful activity plans in place, family feedback themes).
KPIs should not be presented as “numbers only.” They must be linked to narrative explanation: what you did, why you did it, and what changed.
Making the improvement cycle visible in tenders and inspections
The simplest way to evidence QA is to show the loop:
- Plan: audit schedule and standards (“what good looks like”).
- Do: audits completed and supervision / coaching delivered.
- Check: results, themes, and risk escalation where needed.
- Act: actions tracked to completion, then re-audited for sustainability.
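For services that track this loop digitally, one way to keep it honest is a closed-loop action log in which no entry closes without a re-audit result attached, so "what changed as a result?" always has an auditable answer. The record fields below are illustrative assumptions, not a prescribed format.

```python
# Illustrative sketch: a closed-loop action log (issue -> action ->
# re-audit -> closed). Field names are assumptions to adapt locally.
from dataclasses import dataclass

@dataclass
class QAAction:
    issue: str                 # Check: what the audit or incident found
    action: str                # Act: what was done
    reaudit_result: str = ""   # Act: re-audit evidence of sustained change
    closed: bool = False

def open_actions(log):
    """Actions still awaiting completion or re-audit - the governance risk list."""
    return [a for a in log if not a.closed]

log = [
    QAAction("PRN rationale missing in 3/10 MARs sampled",
             "Coaching at handover; PRN prompt added to recording template",
             reaudit_result="10/10 MARs compliant at 4-week re-audit",
             closed=True),
    QAAction("Capacity not recorded where care was refused",
             "Supervision scenarios; decision-specific template issued"),
]
print(len(open_actions(log)))  # 1 action still open
```

Whatever the format, the discipline is the same: an action is only "closed" when the re-audit evidences sustained change, not when the task is ticked off.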
When you can demonstrate this loop using real examples, QA stops being abstract. It becomes a practical management system that commissioners trust and inspectors recognise.