Autism adult services: auditing restrictive practices and evidencing least-restrictive care

Auditing restrictive practice is not about catching staff out. It is about creating assurance that restriction is lawful, proportionate, time-limited and actively reducing. In adult autism services, restriction often becomes embedded because no one can see the full picture and no one can prove change over time. This article explains how to audit practice against the frameworks of restrictive practice, DoLS, LPS and related legal safeguards, and how auditing must connect to real service models and care pathways so that findings translate into operational improvement rather than compliance reporting.

What an audit of restrictive practice should achieve

Effective auditing should answer five questions that matter to commissioners, boards and inspectors:

  • Visibility: what restrictions are in place across people, services and shifts?
  • Lawfulness: is restriction justified, least restrictive, and authorised where deprivation exists?
  • Quality: are decisions evidenced and reviewed, or vague and drifting?
  • Reduction: are restrictions reducing over time with a clear plan?
  • Impact: does restriction reduction improve stability, outcomes and safeguarding?

Audits that only check policy presence or training completion are rarely sufficient. The most meaningful audits test real records and real decision-making.

Designing an audit that is proportionate and defensible

A practical restrictive practice audit typically uses a blend of:

  • Document sampling: care plans, risk assessments, incident logs, restrictive practice register entries, DoLS/LPS records and review notes.
  • Staff interviews: can staff explain why a restriction exists and when it will be reviewed?
  • Observation: are restrictions implemented consistently and in line with the plan?
  • Trend analysis: are restrictions reducing, stable or increasing over time?

Sampling should deliberately include high-risk contexts: people with deprivation authorisations, those with frequent incidents, and those with long-running restrictions.

Quality indicators: what to score and track

Audit scoring becomes useful when it is based on observable indicators. Common indicators include:

  • Restriction clearly described (not generic).
  • Specific risk rationale evidenced (not asserted).
  • Alternatives considered and recorded.
  • Review date present and honoured.
  • Reduction plan present with measurable steps.
  • DoLS/LPS alignment where deprivation applies.
  • Evidence that the person was involved using reasonable adjustments.

Trend tracking matters. A provider can be compliant in one audit but still fail if restrictions steadily increase over months without governance intervention.
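As a purely illustrative sketch, the indicators above could be operationalised as a simple per-record score. All field names, the generic-phrase list and the word-count threshold below are invented for illustration and are not taken from any real provider template:

```python
# Hypothetical indicator scoring for one restrictive practice register entry.
# Field names and thresholds are illustrative assumptions, not a real schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RestrictionRecord:
    description: str              # what the restriction is
    risk_rationale: str           # evidenced justification
    alternatives_considered: bool # were less restrictive options recorded?
    review_date: Optional[date]   # None = no review date set
    reduction_plan: bool          # is there an active reduction plan?
    person_involved: bool         # involvement with reasonable adjustments

# Phrases that suggest a generic, non-specific description (illustrative list).
GENERIC_TERMS = {"as required", "standard restriction", "usual practice"}

def score_record(rec: RestrictionRecord, today: date) -> dict[str, bool]:
    """Score one register entry against the observable audit indicators."""
    return {
        "restriction_clearly_described": bool(rec.description.strip())
            and rec.description.lower() not in GENERIC_TERMS,
        # Crude proxy for "evidenced, not asserted": rationale has substance.
        "risk_rationale_evidenced": len(rec.risk_rationale.split()) >= 10,
        "alternatives_considered": rec.alternatives_considered,
        "review_date_honoured": rec.review_date is not None
            and rec.review_date >= today,
        "reduction_plan_present": rec.reduction_plan,
        "person_involved": rec.person_involved,
    }
```

In practice the pass/fail logic would be an auditor's judgement rather than a word count; the value of a sketch like this is that it forces each indicator to be defined in observable terms.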

Operational example 1: audit identifies “restriction by routine” on night shifts

Context: A provider’s restrictive practice register shows modest restriction. However, a targeted audit of night shift practice finds informal restrictions: people discouraged from opening their bedroom doors, no access to the kitchen, and staff imposing unwritten “rules” to avoid disruption.

Support approach: The audit team triangulates evidence: staff interviews, night shift logs, and care plan alignment. The provider treats the findings as a governance issue rather than blaming individuals.

Day-to-day delivery detail: Care plans are revised to clarify night-time routines and permissible choices. Staff are trained in sensory-informed sleep support and de-escalation that does not rely on control. A manager introduces a night shift supervision focus: one restrictive decision discussed weekly, with evidence required.

How effectiveness is evidenced: Follow-up audit shows reduced informal restrictions, improved consistency between day and night plans, and no increase in incidents. Sleep quality indicators and distress incidents at night improve, demonstrating that reduced restriction can enhance stability.

Operational example 2: audit exposes DoLS/LPS misalignment with actual restrictions

Context: A person has an authorisation with conditions supporting community access and choice. The register indicates compliance, but incident logs show staff regularly preventing outings due to anxiety about risk.

Support approach: The audit compares: register entries, authorisation conditions, incident logs, and daily notes. It confirms that restriction is greater than documented and conditions are not being operationalised.

Day-to-day delivery detail: The provider introduces a “conditions-to-actions” mapping sheet used in handover and supervision. A revised risk enablement plan is implemented with staged outings, travel rehearsal, check-in points and clear intervention thresholds. Management oversight is increased for six weeks with weekly reviews.

How effectiveness is evidenced: Audit re-check shows conditions are being met, community access increases, and incident frequency reduces. The provider evidences learning: changes are embedded into templates and training.

Operational example 3: audit drives restriction reduction through measurable targets

Context: A service supports several autistic adults with long-running restrictions. The provider is unsure whether restrictions are reducing because there is no baseline measure.

Support approach: The audit establishes baseline metrics: number of restrictions per person, average duration of restrictions, and proportion with active reduction plans. The provider uses these metrics as a quality dashboard, not as a performance weapon.
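The baseline metrics described above could be computed from a register like this. This is an illustrative sketch only; the field names and register structure are assumptions, not a real provider system:

```python
# Hypothetical baseline metrics for a restriction reduction dashboard.
# Register fields (person_id, start, end, has_reduction_plan) are assumed.
from dataclasses import dataclass
from datetime import date
from statistics import mean
from typing import Optional

@dataclass
class Restriction:
    person_id: str
    start: date
    end: Optional[date]       # None = restriction still active
    has_reduction_plan: bool

def baseline_metrics(register: list[Restriction], as_of: date) -> dict[str, float]:
    """Compute the three baseline measures: restrictions per person,
    average duration of active restrictions, and proportion with a plan."""
    active = [r for r in register if r.end is None or r.end > as_of]
    per_person: dict[str, int] = {}
    for r in active:
        per_person[r.person_id] = per_person.get(r.person_id, 0) + 1
    durations = [(as_of - r.start).days for r in active]
    return {
        "restrictions_per_person": mean(per_person.values()) if per_person else 0.0,
        "average_duration_days": mean(durations) if durations else 0.0,
        "proportion_with_reduction_plan":
            sum(r.has_reduction_plan for r in active) / len(active) if active else 0.0,
    }
```

Re-running the same calculation at each governance cycle is what turns a one-off audit into trend evidence: the numbers either fall, or they trigger the senior review described below.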

Day-to-day delivery detail: Each restriction is assigned a reduction step for the next review period (e.g., reduce observation frequency; increase independent time; expand choice windows). Teams must evidence progress in monthly governance meetings. Where progress stalls, a senior review is triggered to test whether the restriction is still proportionate.

How effectiveness is evidenced: Over three months, dashboard data shows reductions in restriction count and duration, with stable safeguarding outcomes. The provider can evidence governance action and impact rather than relying on narrative claims.

Commissioner expectation

Commissioners will expect providers to evidence restrictive practice oversight through auditing and measurable reduction. They look for reliable records, DoLS/LPS alignment, trend reporting, and evidence that the provider can identify and correct drift. Commissioners also expect assurance that restriction is used to protect people, not to manage staffing or service pressure.

Regulator and inspector expectation (CQC)

CQC will expect providers to demonstrate least-restrictive care and rights-based governance. Inspectors will look for audit evidence, learning from findings, staff understanding, and a visible reduction culture. Where audits exist only “on paper”, CQC typically finds inconsistent practice, undocumented restrictions, and weak review discipline.

Governance and assurance: making audit findings matter

  • Board-level reporting on restriction trends and deprivation status.
  • Action tracking with deadlines and named owners from each audit.
  • Re-audit cycles for high-risk services or repeated findings.
  • Staff learning integrated into supervision and team meetings.
  • Safeguarding link so restriction-related harm indicators are monitored.

What good looks like

Good auditing produces evidence that restriction is visible, lawful and reducing. It creates a defensible audit trail for commissioners and inspectors and, most importantly, supports autistic adults to experience greater autonomy without increased risk.