ICT Authority
ICTA Assessment Digital Readiness & E-Services

Digital Readiness Assessment

A data-driven diagnosis of Kenya's public sector digital maturity, infrastructure, and service delivery capabilities.

Digital Maturity Index (DMI)

The consolidated DMI is computed from four pillars: TRI (25%), ESMI (35%), RMI (25%), and GCI (15%). The average DMI across assessed MDAs is 2.1 / 5.0.
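The weighted roll-up can be sketched as follows; the pillar weights are those stated above, while the per-pillar scores in the example are illustrative, not figures from the assessment.

```python
# Pillar weights from the DMI methodology described above.
WEIGHTS = {"TRI": 0.25, "ESMI": 0.35, "RMI": 0.25, "GCI": 0.15}

def dmi(scores: dict) -> float:
    """Consolidated DMI: weighted average of pillar scores (each on a 0-5 scale)."""
    return sum(WEIGHTS[p] * scores[p] for p in WEIGHTS)

# Illustrative pillar scores for a hypothetical MDA (not assessment data).
example = {"TRI": 2.4, "ESMI": 2.2, "RMI": 1.5, "GCI": 2.3}
print(round(dmi(example), 2))  # → 2.09
```

Because the weights sum to 1.0, the result stays on the same 0–5 scale as the individual pillars.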

Key figures
  • 80: MDAs assessed. Coverage includes a broad cross-section of public institutions.
  • 2.1: average DMI. Early-to-mid maturity: patchy progress with clear constraints.
  • 1.5: lowest pillar score (RMI). Records management remains the biggest drag on end-to-end digital delivery.
How DMI is used
  • Benchmark: compare readiness across MDAs and identify outliers.
  • Prioritise: target interventions where constraints block service outcomes.
  • Track: measure improvement per pillar over time, not just projects delivered.

DMI Levels (1–5)

Distribution of MDAs by DMI level (approx. counts based on reported percentages).

Level guide
  • Level 1: Mostly manual operations; limited automation or online services.
  • Level 2: Some digitisation; isolated systems; low integration.
  • Level 3: Operational digitisation with partial integration and measurable service uptake.
  • Level 4: Integrated platforms; consistent governance; service performance managed.
  • Level 5: Optimised, data-driven delivery; continuous improvement; strong citizen experience.
What the distribution implies
  • Most MDAs cluster in Levels 1–3, indicating “foundation-first” needs before advanced transformation.
  • Moving from Level 2 → 3 typically requires integration, process redesign, and better records management.
  • Level 4 capability is rare and should be used as a reference model for standards and patterns.
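A continuous DMI score can be banded into the five levels above for reporting. A minimal sketch, assuming simple nearest-level banding; these cut-offs are illustrative, not the assessment's official methodology:

```python
def dmi_level(score: float) -> int:
    """Band a 0-5 DMI score into maturity levels 1-5.

    Uses nearest-level banding (illustrative convention only; the
    assessment may define its own cut-offs).
    """
    if not 0.0 <= score <= 5.0:
        raise ValueError("DMI score must be on the 0-5 scale")
    # Clamp so sub-1.0 scores still map to Level 1 and 5.0 maps to Level 5.
    return min(5, max(1, round(score)))

print(dmi_level(2.1))  # → 2 (the reported average DMI falls in Level 2)
```

Under this convention the average DMI of 2.1 lands in Level 2, consistent with the "foundation-first" reading of the distribution.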

Cluster Distribution

Maturity clusters summarise patterns of readiness and capability across MDAs.

Leaders (4%) — strong governance, infrastructure maturity, and higher completion services.
Emerging (20%) — improving digital footprints, partial automation and visible e-service uptake.
Basic (61%) — fragmented systems; low integration; paper-heavy back offices.
Digitally Inactive / Low (15%) — minimal digitisation and limited online service availability.
How to use clusters
  • Leaders: reference architectures, mentoring, shared services patterns.
  • Emerging: accelerate integration, identity/payment patterns, and service redesign.
  • Basic: focus on core foundations (connectivity, records, governance cadence).
  • Inactive/Low: minimal viable digitisation and quick wins to establish momentum.

Average Pillar Scores

Pillar averages used in the consolidated DMI.

  • Infrastructure (TRI): connectivity and platforms are progressing, but consistency varies.
  • E-Services (ESMI): online availability exists, but end-to-end completion remains uneven.
  • Records (RMI): the biggest bottleneck; workflow and evidence are still paper-heavy.

Pillar weights
  • TRI: 25%
  • ESMI: 35%
  • RMI: 25%
  • GCI: 15%

Recommended Actions

Shift the median MDA from Level 2 → 3 by fixing the “blocking” foundations: records, integration, and delivery governance.

1) Fix records bottlenecks

Digitise workflows and retention schedules so services can be completed end-to-end without paper fallbacks.

  • Standard file plans + metadata
  • Digitised approvals
  • Retention & disposal rules
2) Standardise integration patterns

Reduce fragmentation using shared standards for identity, payments, notifications, and data exchange.

  • API gateway + catalogue
  • Common auth/SSO
  • Data sharing agreements
3) Run delivery governance

Establish a delivery cadence (steering, KPIs, roadmap) so investments are repeatable and measurable.

  • Roadmap + funded pipeline
  • Service performance KPIs
  • Security & compliance checks
Fast win: “Level 2 → 3” playbook
Service selection
  • Pick 3–5 high-volume services
  • Map end-to-end journey
  • Remove paper fallback steps
Core enablers
  • Records workflow + retention
  • Payments/ID/notifications
  • Integration via APIs
Operate & measure
  • Uptime & completion rates
  • Transaction turnaround time
  • User satisfaction feedback loop
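The "operate & measure" metrics above can be derived from basic transaction logs. A minimal sketch, assuming each record carries a start timestamp, an end timestamp, and a completion flag (field names are illustrative):

```python
from datetime import datetime, timedelta

# Illustrative transaction log; field names and values are assumptions.
transactions = [
    {"started": datetime(2024, 1, 1, 9, 0),
     "finished": datetime(2024, 1, 1, 9, 30), "completed": True},
    {"started": datetime(2024, 1, 1, 10, 0),
     "finished": None, "completed": False},   # abandoned mid-journey
    {"started": datetime(2024, 1, 2, 9, 0),
     "finished": datetime(2024, 1, 2, 11, 0), "completed": True},
]

def completion_rate(txns) -> float:
    """Share of started transactions that finished end-to-end."""
    return sum(t["completed"] for t in txns) / len(txns)

def avg_turnaround(txns) -> timedelta:
    """Mean elapsed time across completed transactions only."""
    done = [t for t in txns if t["completed"]]
    total = sum((t["finished"] - t["started"] for t in done), timedelta())
    return total / len(done)

print(f"completion rate: {completion_rate(transactions):.0%}")  # → 67%
print(f"avg turnaround: {avg_turnaround(transactions)}")        # → 1:15:00
```

Tracking these per service, per month, gives the Level 2 → 3 playbook a measurable baseline rather than a count of projects delivered.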

Note: Counts and percentages reflect reported ranges and synthesis; treat as directional indicators for planning and prioritisation.