Designing Digital Inclusion Assessments That Actually Work in Adult Social Care
Digital inclusion starts with accurate assessment. If providers misjudge someone’s access, skills or confidence, digital systems can quietly create harm, confusion or dependency. This is why digital inclusion should be assessed with the same rigour as communication needs, risk and daily living support, particularly where digital care planning is used for reviews, tasks and decision-making.
This article sets out a practical, defensible approach to digital inclusion assessment that can be embedded into everyday support planning and governance, rather than treated as a one-off “IT check”.
What a meaningful digital inclusion assessment covers
Providers often assess only whether the person has a smartphone and miss everything else that matters. A robust assessment explores function and risk across six practical domains:
- Access: device availability, reliable internet, charging, data affordability
- Skills: basic navigation, passwords, updates, accessibility settings
- Confidence: fear of “breaking” devices, anxiety, prior negative experiences
- Safety: scams, exploitation, coercion, privacy, consent to sharing
- Support: who helps, boundaries, dependency risks, consistency
- Purpose: what digital access is for (health, benefits, social contact, care planning)
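The six domains above lend themselves to a structured record rather than free text, so that each finding can carry a rating and a follow-up action that audits can check. The sketch below is illustrative only; the class names, rating scale and fields are assumptions, not taken from any specific care-planning system.

```python
from dataclasses import dataclass, field

# Illustrative rating scale for each domain (not a standard instrument).
RATINGS = ("independent", "needs support", "significant barrier")

@dataclass
class DomainFinding:
    domain: str          # one of the six domains, e.g. "Access"
    rating: str          # one of RATINGS
    notes: str = ""      # e.g. "cannot reach charging socket"
    actions: list[str] = field(default_factory=list)  # planned support actions

@dataclass
class DigitalInclusionAssessment:
    person_ref: str
    findings: list[DomainFinding]

    def outstanding_actions(self) -> list[str]:
        """Collect every planned action so reviews can check follow-through."""
        return [a for f in self.findings for a in f.actions]

    def barriers(self) -> list[str]:
        """Domains rated as a significant barrier, for risk review."""
        return [f.domain for f in self.findings
                if f.rating == "significant barrier"]

# Hypothetical example based on the access barrier in Operational example 1.
assessment = DigitalInclusionAssessment(
    person_ref="P-001",
    findings=[
        DomainFinding("Access", "significant barrier",
                      "phone cannot be charged reliably",
                      ["install charging dock at wheelchair height"]),
        DomainFinding("Skills", "needs support",
                      "struggles with multi-factor authentication",
                      ["agree supported login method"]),
    ],
)
print(assessment.barriers())             # domains needing risk review
print(assessment.outstanding_actions())  # actions to audit at next review
```

Structuring findings this way makes the governance checks later in this article ("assessments lead to actions") directly auditable: a monthly sample check can compare `outstanding_actions()` against what the care records show was actually done.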
Good assessments also identify what should not be digitised for someone, or what must be supported face-to-face to remain safe and inclusive.
How to embed assessment into routine practice
Digital inclusion assessment works best when integrated into:
- initial assessment and onboarding
- care plan reviews and outcomes reviews
- risk assessment and safeguarding planning
- communication passports and accessibility profiles
- incident learning (e.g., scams, missed appointments, confusion)
The goal is to make digital inclusion a living part of the support plan, reviewed as needs, risks and technology change.
Operational example 1: Access barriers causing missed healthcare appointments
Context: A physical disability service supported a person who relied on NHS app-based appointment notifications. They frequently missed appointments and were labelled “non-compliant”.
Support approach: The provider completed a digital inclusion assessment and identified that the person had a phone but could not consistently charge it due to reduced hand function and a poorly positioned socket. They also struggled with multi-factor authentication.
Day-to-day delivery detail: Staff worked with an occupational therapist to install a charging dock at wheelchair height and introduced a predictable “device check” routine at the start of each evening support call. The care plan included a prompt to confirm upcoming appointments weekly and to support login using an agreed method that protected confidentiality. A paper backup summary of key appointments was placed in a visible location, reviewed during visits.
How effectiveness or change is evidenced: Appointment attendance improved over eight weeks, with fewer missed clinical contacts. The provider evidenced this via care records, health liaison logs and a reduction in “missed appointment” incident notes.
Operational example 2: Digital confidence and coercion risk
Context: In a supported living setting, a person wanted online banking to improve independence. Previous attempts led to family conflict and pressure to share details.
Support approach: The digital inclusion assessment explicitly explored coercion, boundaries and supported decision-making. The aim was safe inclusion, not digital independence at any cost.
Day-to-day delivery detail: The provider agreed a supported routine: the person used online banking during scheduled support time, on their own device, with staff support limited to navigation and “talking through” choices rather than touching passwords. The plan set clear rules on who could request financial information. Any boundary challenges were logged and reviewed via safeguarding supervision. Staff recorded the person’s expressed preferences and how they were supported to make choices without undue influence.
How effectiveness or change is evidenced: The person managed routine transactions with reduced anxiety. Records showed consistent boundary enforcement and a clear audit trail demonstrating protection from exploitation without imposing blanket restrictions.
Operational example 3: Digital care planning access and inclusion
Context: A provider rolled out a digital portal allowing people and families to view care plans and contribute to reviews. Several people did not engage, and their voice was missing in reviews.
Support approach: The provider used digital inclusion assessments to determine who could meaningfully access the portal and what alternative inclusion methods were required.
Day-to-day delivery detail: For people with limited access, staff created “review-ready” summaries in accessible formats (easy read, large print, audio summary) and scheduled pre-review conversations to capture wishes and outcomes. Staff then uploaded the person’s views into the portal as their contribution, clearly labelled as “recorded with the person on [date]”. Managers checked a sample of reviews monthly to confirm the person’s voice was present regardless of portal access.
How effectiveness or change is evidenced: Audits showed improved involvement documentation and fewer review delays linked to digital access. Feedback from people supported indicated clearer understanding of plans and decisions.
Commissioner expectation: Evidence-based inclusion and reduced DNAs (did-not-attends)
Commissioners increasingly expect providers to show that digital approaches reduce missed contacts, improve engagement and do not create inequity. Digital inclusion assessment should produce evidence that informs support planning, not a tick-box statement.
In contract assurance, providers can evidence this through:
- completed assessments with clear actions
- review cycles showing digital needs changing over time
- outcome measures (attendance, engagement, reduced isolation)
Regulator expectation: Inclusive communication and safe practice
Inspectors expect providers to make information accessible and to demonstrate that systems support involvement and safety. Digital tools must not replace person-centred communication or create hidden exclusion.
This is often tested through case sampling: whether records show adaptation, understanding checks and proportionate safeguards.
Governance and assurance mechanisms
To make digital inclusion assessments defensible, providers should implement:
- Quality audit: monthly sample checking that assessments lead to actions
- Supervision prompts: discussion of digital risks and support boundaries
- Safeguarding oversight: flagging scams, coercion or online exploitation themes
- Review discipline: reassessment after incidents, major life changes or tech changes
When digital inclusion is governed this way, it becomes a credible part of person-centred support rather than an optional “digital project”.