Accessible Information in Social Care: Practical Adjustments That Hold Up in Reviews and Audits
Accessible Information is not just about providing a leaflet in large print. It is about designing communication so the person can understand, take part and make decisions in the moments that matter—reviews, assessments, safeguarding conversations and MDT meetings. In practice, this is where services often fall short: adjustments exist “on paper” but are not delivered consistently. This article links the operational reality of Communication, Accessible Information & Total Communication to your Core Principles & Values, focusing on how to make adjustments routine, auditable and outcome-led.
What counts as Accessible Information in day-to-day delivery
Accessible Information means information is provided in a format and method that the person can understand, and that they are supported to communicate in a way that works for them. Operationally, you should be able to answer three questions for each person:
- Format: What format works (easy read, large print, audio, symbols, translations, BSL, objects of reference)?
- Method: How does the person best process information (paced speech, repetition, quiet space, single-choice prompts)?
- Support: Who supports understanding (keyworker, advocate, family member, interpreter), and at what point does supported decision-making shift to a best-interests process?
Where Accessible Information breaks down in reviews and MDTs
Breakdown 1: “The meeting is accessible, but the paperwork isn’t”
People arrive at a review with no accessible agenda, no advance information, and no time to process. Decisions are then made quickly, and involvement is described as “person attended”. Fix this by creating an accessible pre-meeting pack and setting minimum notice standards.
Breakdown 2: “People are overwhelmed by pace and power imbalance”
Multi-agency meetings move fast, use acronyms, and involve professionals who control the agenda. Even confident people can disengage. Fix this through chairing standards: slower pace, check-backs, plain language, and explicit pauses for the person to respond.
Breakdown 3: “Adjustments are not recorded in a way that staff can follow”
Adjustments are written as broad statements (“needs clear communication”), which are not actionable. Fix this by recording concrete prompts and specific “do / don’t” guidance that can be observed and audited.
Operational Example 1: Making care plan reviews genuinely accessible
Context: A person with a learning disability and anxiety attended reviews but rarely spoke. Records stated “engaged well”, yet family challenged decisions, saying the person did not understand what had been agreed.
Support approach: The service introduced an accessible review process: pre-brief, accessible agenda, structured involvement prompts, and post-meeting confirmation.
Day-to-day delivery detail:
- The keyworker created a one-page agenda using symbols and simple headings (home, health, activities, goals, risks).
- Two days before the meeting, staff did a 20-minute pre-brief using choice boards and “what’s working / what’s hard” prompts.
- During the review, the chair paused after each agenda item and used a consistent “choice check” method (two options maximum, with visual prompts).
- After the meeting, staff completed an accessible “what we decided” sheet with the person, checking understanding and recording their responses.
How effectiveness/change is evidenced: The provider tracked complaints/queries after reviews, the number of actions completed on time, and whether the person could describe agreed goals using their preferred method. Family feedback improved, and review records became clearer, reducing dispute risk.
Operational Example 2: Accessible safeguarding conversations and information sharing
Context: A safeguarding concern involved a person with autism who found direct questioning distressing. Previous attempts to gather information escalated anxiety, reducing disclosure and creating unreliable accounts.
Support approach: The service applied an accessible safeguarding method: reduced sensory load, structured questioning, and clear boundaries about confidentiality and information sharing.
Day-to-day delivery detail:
- Conversations took place in a low-stimulation room at a planned time, with one trusted staff member present.
- Staff used written prompts and scaling cards (e.g., “0–5” worry scale) rather than rapid verbal questioning.
- Information sharing was explained using a simple three-box model: “what stays private”, “what we must share to keep you safe”, “who we share with”.
- Staff recorded the person’s responses in their own words and documented what adjustments were used so other professionals could mirror the approach.
How effectiveness/change is evidenced: The provider monitored distress markers during conversations, completeness/clarity of accounts, and whether subsequent safeguarding actions matched the person’s stated wishes and outcomes. Incident review showed fewer escalations and better quality evidence for safeguarding processes.
Operational Example 3: Accessible MDT communication in complex health integration
Context: A person with a physical disability had multiple professionals involved (district nursing, OT, GP, homecare). They frequently agreed to plans in meetings but later disengaged, leading to missed appointments and delayed equipment provision.
Support approach: The provider created an accessible MDT pathway: single named coordinator, plain-language plan summaries, and a “confirm and schedule” method before closing meetings.
Day-to-day delivery detail:
- A named coordinator produced a one-page “MDT action plan” in plain English with dates, responsibilities and contact routes.
- Before the meeting ended, the coordinator checked each action using a structured prompt: what, who, when, and how the person will be supported to follow through.
- The person chose their preferred reminders (text, phone call, printed schedule), and staff recorded how reminders must be delivered.
- After each MDT, the coordinator completed a short follow-up call to confirm understanding and identify barriers early.
How effectiveness/change is evidenced: The service tracked appointment attendance, time-to-equipment provision, and reduction in “did not attend” events. Records showed clearer accountability and better outcomes, supporting commissioner confidence in integrated working.
Commissioner expectation: adjustments embedded into pathways and contract assurance
Commissioner expectation: Commissioners expect accessible communication to be systematic, not dependent on individual staff. They will look for evidence that adjustments are built into pathways (reviews, safeguarding, MDT working) and that providers can demonstrate impact—reduced complaints, better engagement, improved outcomes and clearer decision-making records. They also expect providers to respond quickly when adjustments are not working, using review loops rather than waiting for failures to accumulate.
Regulator / Inspector expectation (CQC): involvement, consent and rights evidenced in records
Regulator / Inspector expectation (CQC): Inspectors will test whether people are involved in decisions and whether information is accessible. They will look at care records to see if adjustments are specific and consistently applied, particularly where capacity, consent, restrictive practice, or safeguarding is relevant. Weak practice shows generic wording and inconsistent delivery; strong practice shows clear prompts, evidence of understanding checks, and learning when communication barriers contribute to risk.
Governance mechanisms that make Accessible Information auditable
1) “Reasonable adjustment” standards and triggers
Set internal standards: what must happen before a review, what the accessible pack includes, and what triggers escalation (e.g., repeated non-attendance, repeated distress, repeated disagreement after meetings).
2) A recording template that forces clarity
Use a short template: preferred format, preferred method, understanding check used, what the person communicated, and what support was required. This prevents vague statements that cannot be audited.
3) QA checks linked to safeguarding and complaints
When there is a complaint about “not being listened to”, or a safeguarding process where the person’s wishes are unclear, include “communication accessibility” in root cause analysis and learning reviews.
4) Workforce competence and supervision focus
Train staff in practical adjustments (pacing, simplifying choices, visual supports, chairing prompts). Supervision should test understanding: “Tell me how you made last week’s review accessible, and how you evidenced it.”
Implementation checklist for managers and tender teams
- Define accessible information needs in actionable terms (not generic statements)
- Build accessible pre-briefs and post-meeting confirmation into review processes
- Set MDT chairing and pacing standards that protect involvement
- Audit through record reviews, spot checks and complaint learning
- Evidence impact using operational measures (attendance, disputes, outcomes, distress)