From Requires Improvement to Good: Building a Credible CQC Improvement Plan
Moving from Requires Improvement to Good is rarely achieved through generic action lists or broad promises to “strengthen governance”. CQC inspectors usually want to see whether a provider has understood the specific reasons performance fell short, translated those findings into practical action and built an improvement process that changes day-to-day care rather than simply producing paperwork. Providers should therefore be able to evidence recovery as a structured, accountable and operational process, drawing on wider CQC improvement and recovery guidance alongside the CQC quality statements. The strongest improvement plans do not just describe what will happen. They show who is responsible, how progress will be measured, what good looks like in practice and how leaders know whether improvement is holding.
Why weak improvement plans fail in inspection
Many plans fail because they are too abstract. They repeat inspection language without converting it into daily operational change. For example, a provider may state that documentation will improve, but not explain which records were weak, what standard is now expected, how staff will be supported or how leaders will test whether quality has actually improved. In that situation, the plan sounds reassuring but gives inspectors little confidence that the service has real grip.
Another common weakness is overproduction. Services sometimes create long plans with dozens of actions, each given equal importance. This can make recovery appear busy while obscuring what matters most. CQC is often more reassured by plans that prioritise the most important risks clearly, connect actions to evidence and show disciplined follow-through over time.
What a credible CQC improvement plan looks like
A credible plan usually begins with accurate diagnosis. Leaders need to show they understand not only what the inspection found, but what sat behind it. Was the problem inconsistent staff practice, weak oversight, poor escalation, fragile induction, excessive use of temporary staff or drift in documentation standards? Until that question is answered properly, actions often remain superficial.
Strong plans then translate findings into measurable actions. Each action should have a named owner, a timescale, a method of review and a clear definition of success. Importantly, improvement plans should also distinguish between immediate containment and longer-term sustainability. Some issues require urgent correction, but Good is usually only secured when services can show that improvement has become embedded rather than temporarily imposed.
Operational example 1: residential service rebuilds medication governance
Context: A residential home received negative inspection findings linked to medication recording, inconsistent PRN rationale and weak senior oversight of evening rounds. Leaders initially considered rewriting the medicines policy, but quickly recognised that the problem was operational rather than documentary.
Support approach: The home’s improvement plan separated immediate risk controls from sustained governance change. Immediate actions included observed rounds, audit frequency increases and clear sign-off limits for newer senior carers. Longer-term actions included competency reassessment, protected handover time and monthly governance review of trends.
Day-to-day delivery detail: Staff were retrained using real examples from the home’s own records. Managers observed practice on the most pressured shifts, not only during quieter daytime periods. Follow-up audits checked whether staff were documenting refusal, PRN rationale and timing more consistently. Governance meetings then reviewed whether the improvement held or whether the same errors were reappearing.
How effectiveness was evidenced: The service could show improved audit scores, fewer recording errors, clearer staff explanations of safe medicines practice and stronger leadership oversight. The plan was credible because it linked specific findings to real operational change.
Operational example 2: domiciliary care provider strengthens escalation and record quality
Context: A home care provider was criticised because, although staff were kind and reliable, too many early warning signs, such as confusion, reduced appetite and skin concerns, were being recorded vaguely or escalated late.
Support approach: The improvement plan focused on meaningful recording and escalation, not generic “documentation training”. Leaders identified where practice had drifted and introduced targeted supervision, revised note standards and manager review of high-risk packages.
Day-to-day delivery detail: Supervisors reviewed anonymised records with staff, comparing weak and strong examples. Managers made follow-up calls after complex visits to test whether staff understood when to escalate. Spot checks measured whether notes were becoming more clinically useful and whether office teams were responding more consistently once concerns were raised.
How effectiveness was evidenced: Escalations became timelier, daily notes improved and the provider could demonstrate that the plan changed how care was delivered, not just how forms looked.
Operational example 3: supported living service restores consistency in behaviour support
Context: A supported living service received inspection criticism because tenant distress was being managed inconsistently across shifts, leading to avoidable escalation and occasional over-restrictive responses.
Support approach: The improvement plan targeted consistency. Leaders reviewed incident patterns, staff practice and handover quality, then prioritised clearer support expectations, team leader oversight and reflective supervision linked to real scenarios.
Day-to-day delivery detail: Staff were observed during known pressure points such as community transitions and shared-space tension. Team meetings clarified what support should happen before distress escalated. Managers tracked whether tenants experienced fewer avoidable incidents and whether restrictive responses reduced over time rather than simply being defended as necessary.
How effectiveness was evidenced: Practice became more aligned, tenants experienced calmer support and leaders could evidence that recovery was improving lived experience, not just internal assurance documents.
Commissioner expectation
Commissioners generally expect improvement plans to show clear grip, credible priorities and measurable progress. They are likely to look for named accountability, realistic timescales, evidence that immediate risks are controlled and proof that improvements are translating into safer, more reliable care. Confidence is stronger where the provider can show that recovery planning is rooted in operational reality rather than presentational optimism.
Regulator / Inspector expectation
CQC inspectors usually expect improvement plans to be specific, evidence based and linked directly to the issues previously identified. They are likely to examine whether actions are measurable, whether leaders know what success looks like and whether improvements are sustained in practice. CQC is generally more reassured where recovery documentation is concise, disciplined and clearly connected to what staff and people using the service are experiencing day to day.
How to strengthen an improvement plan before re-inspection
Providers can improve their plans by testing each action against four questions: what exactly is changing? Who owns it? How will it be checked? What evidence will prove it is working? If any of those answers are vague, the plan is likely to remain weak. Providers should also check whether the plan distinguishes between urgent correction and deeper culture or governance change.
The strongest services use improvement plans as working management tools rather than static documents for external audiences. They update progress honestly, remove actions that are complete, escalate those that are slipping and keep the focus on the issues most likely to affect care quality. When providers can evidence that kind of disciplined recovery planning, inspectors are much more likely to conclude that the journey from Requires Improvement to Good is credible and well led.