Innovation Governance: Making New Approaches Safe, Scalable and Inspection-Ready
Innovation in adult social care often starts with a promising idea, a motivated team and a short-term pilot. Where it most often fails is in governance: unclear accountability, inconsistent delivery, weak risk controls and limited evidence of learning. Commissioners and regulators do not require providers to avoid innovation; they require them to manage it safely. This article forms part of Innovation, Added Value & System-Wide Impact and supports robust assurance approaches aligned to Social Value expectations.
Good innovation governance ensures that new approaches improve outcomes without creating unintended harm. It also provides the audit trail that commissioners and inspectors rely on to judge whether improvement is controlled, accountable and sustainable.
What innovation governance must achieve
Innovation governance should ensure:
- Safety: safeguarding, clinical and operational risks are identified and controlled.
- Consistency: staff deliver the new approach reliably and can explain it.
- Evidence: impact is measured, and learning is documented and acted on.
- Scalability: the approach can be expanded without loss of quality.
Governance should be proportionate. A small practice change may require light-touch controls; a new pathway, technology or delivery model requires structured oversight.
Core building blocks of governance
Providers typically need four governance elements:
- Accountability: a named lead, decision rights and escalation routes.
- Risk control: risk assessment, mitigation actions and review frequency.
- Assurance: audits, observations and quality checks that test real practice.
- Learning: feedback loops through supervision, incident reviews and improvement cycles.
These controls should be documented and embedded into existing quality governance rather than kept in a separate “innovation folder” that does not influence day-to-day delivery.
Operational example 1: Governing a new incident debrief model
Context: A service introduced a new incident debrief process to improve learning after restrictive interventions and reduce recurrence.
Support approach: The provider designed a debrief template, set timeframes (within 48 hours for staff debrief; within 7 days for governance review), and defined escalation triggers.
Day-to-day delivery detail: Shift leaders completed immediate debriefs, managers reviewed themes weekly, and the positive behaviour support (PBS) lead attended monthly governance meetings to analyse patterns. Where themes suggested environmental triggers, action plans were created and followed up via audit.
How effectiveness was evidenced: Governance minutes and dashboards demonstrated reduced repeat incidents and clearer learning actions. Inspectors could see both compliance with the process and evidence that learning was implemented.
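The timeframes in this example (48 hours for staff debrief, 7 days for governance review) lend themselves to a simple automated check. The sketch below is illustrative only: the function name, field names and return format are hypothetical, not part of any provider's actual system.

```python
from datetime import datetime, timedelta
from typing import Optional

# Timeframes taken from the example above.
STAFF_DEBRIEF_WINDOW = timedelta(hours=48)
GOVERNANCE_REVIEW_WINDOW = timedelta(days=7)

def debrief_status(incident_time: datetime,
                   staff_debrief_time: Optional[datetime],
                   governance_review_time: Optional[datetime],
                   now: datetime) -> list[str]:
    """Return any overdue steps for one incident, suitable for escalation."""
    overdue = []
    if staff_debrief_time is None and now - incident_time > STAFF_DEBRIEF_WINDOW:
        overdue.append("staff debrief overdue (>48h)")
    if governance_review_time is None and now - incident_time > GOVERNANCE_REVIEW_WINDOW:
        overdue.append("governance review overdue (>7 days)")
    return overdue
```

A dashboard could run this daily across open incidents and surface the overdue list to managers, turning the stated timeframes into a visible audit trail rather than a policy statement.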
Operational example 2: Scaling a proactive engagement approach
Context: A provider trialled a proactive engagement model to reduce isolation and improve wellbeing for people supported in shared living.
Support approach: The model required staff to plan meaningful occupation daily and to record engagement quality, not just attendance.
Day-to-day delivery detail: Managers ran weekly practice observations, used supervision to reinforce expectations, and updated the rota to protect “activity lead” time. The provider introduced short monthly feedback sessions with people supported and families to triangulate experience.
How effectiveness was evidenced: Engagement quality improved and incidents linked to boredom reduced. The provider could show how staffing deployment, supervision and audits supported consistent delivery during scale-up.
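The distinction between recording engagement quality and mere attendance can be made concrete with a minimal data structure. This is a sketch under stated assumptions: the record fields and the 1-5 quality scale are hypothetical illustrations, not a prescribed schema.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class EngagementRecord:
    person: str
    activity: str
    attended: bool
    quality: int  # hypothetical 1-5 scale: 1 = passive presence, 5 = actively engaged

def weekly_quality_summary(records: list[EngagementRecord]) -> dict[str, float]:
    """Average engagement quality per person, counting only attended sessions."""
    by_person: dict[str, list[int]] = {}
    for r in records:
        if r.attended:
            by_person.setdefault(r.person, []).append(r.quality)
    return {person: round(mean(scores), 1) for person, scores in by_person.items()}
```

Because quality is recorded separately from attendance, managers can spot the pattern this model targets: people who attend activities but remain disengaged.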
Operational example 3: Implementing technology with safeguarding controls
Context: A provider introduced digital care records and real-time prompts to reduce recording gaps and improve oversight.
Support approach: The provider mapped data protection, consent and access controls, and built an escalation process where prompts indicated risk (for example, repeated missed medication sign-offs).
Day-to-day delivery detail: Team leaders checked completion daily, managers ran weekly exception reports, and training included scenario-based practice. If patterns emerged, managers used supervision and competency checks rather than relying on reminders alone.
How effectiveness was evidenced: Recording quality improved and oversight became stronger. Governance evidence showed not just system adoption but how leaders used data to improve safety and practice.
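The weekly exception report described above, flagging repeated missed medication sign-offs, reduces to a simple count-and-threshold rule. The function below is an assumption-laden sketch: the input shape, threshold and names are illustrative, not features of any real digital care records product.

```python
from collections import Counter

def medication_exceptions(missed_signoffs: list[tuple[str, str]],
                          threshold: int = 2) -> list[str]:
    """Flag staff with repeated missed medication sign-offs in the period.

    `missed_signoffs` is a list of (staff_name, date) pairs; the threshold
    for "repeated" is illustrative and would be set by the provider.
    """
    counts = Counter(staff for staff, _date in missed_signoffs)
    return sorted(staff for staff, n in counts.items() if n >= threshold)
```

The point of the example in the article is that the output of a report like this feeds supervision and competency checks, not just automated reminders.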
Commissioner expectation
Commissioners expect innovation to be governed with clear accountability, measurable outcomes and transparent reporting. They want confidence that the provider can scale improvement without diluting quality, and that risks are actively managed.
Regulator expectation
The Care Quality Commission (CQC) expects providers to demonstrate safe systems for managing change, including safeguarding oversight, incident learning, staff competency and effective leadership. Inspectors will look for evidence that innovation is embedded in day-to-day practice and subject to ongoing monitoring.
How to remain inspection-ready during innovation
Inspection readiness is protected when providers maintain clear audit trails. Practical steps include:
- Documenting the rationale for the innovation and expected outcomes
- Maintaining risk assessments and showing how mitigations are reviewed
- Recording staff competency checks, supervision and practice observations
- Keeping governance minutes that show decisions, actions and follow-up
Crucially, staff should be able to explain what has changed, why it changed, what safe practice looks like, and how concerns are escalated.
Scaling decisions: when to expand, pause or stop
Governance should define criteria for scale-up. For example:
- Evidence of consistent delivery across shifts
- No increase in safeguarding concerns or serious incidents
- Positive outcome trends sustained over a defined period
- Clear learning documented and embedded
Equally, governance should allow leaders to pause or stop innovations that create risk or do not produce meaningful benefit, and to document those decisions transparently.