The Business Case Everyone Approves and Few Deliver
A mid-tier African bank's transformation committee approves a digital banking programme. The business case shows a three-year payback period, $4.2 million in efficiency savings, and a 15 percent reduction in cost-to-income ratio. The board signs off. The programme begins.
Three years later, the cost-to-income ratio is higher than it was. The efficiency savings have not materialised. The technology is live — the mobile app launched, the core banking platform is running — but the organisation beneath it did not change. Staff are working around the new systems using the old processes. The $4.2 million in savings was real on a spreadsheet. It was never real in the bank.
I have seen this pattern repeat across eight transformation programmes I have been directly involved in across West and East Africa. The failure is not technical. The technology usually works. The failure is that the business case was built on vendor promises about what the technology would deliver — not on an honest assessment of what the organisation needed to do to capture that value.
What follows are the things that business cases in African banking digital transformation consistently get wrong.
What Gets Systematically Undercounted
Change Management Costs
The most reliable predictor of transformation ROI failure is a business case where change management is either absent or allocated 3–5 percent of total programme budget.
In a mid-tier African bank with 200–500 staff, effective change management for a core banking or digital channel transformation requires:
- A dedicated change management lead for the full programme duration (typically 24–36 months)
- A network of change champions embedded in each business line — not a communications exercise, but trained people who translate what the system change means for daily workflow
- A structured communications programme that starts before go-live and sustains through the 6–12 months of post-go-live stabilisation when staff reversion risk is highest
- A feedback loop that surfaces resistance early enough to address it before it becomes embedded workaround behaviour
The fully loaded cost of this infrastructure — including the change team's time, the business line champions' diverted capacity, the communications materials, and the training development — typically runs 12–18 percent of total programme cost. A business case that allocates 4 percent is planning to skip most of it.
When change management is underfunded, the technology goes live on schedule and the behaviour change does not happen. The system is there. The old process runs next to it. You have the costs of both without the benefit of either.
Parallel Running Costs
Every transformation programme has a parallel running period — the window during which the new system is live but the old system is still running as a fallback. In African banking contexts, this period is routinely longer than business cases model.
The standard assumption I see in business cases is 3–6 months of parallel running. The reality, in my experience, is 9–18 months. The reasons are specific to the African institutional context:
Regulatory acceptance timelines. Central banks in Nigeria, Ghana, Kenya, and Tanzania all require demonstration of new system stability before approving decommissioning of legacy infrastructure. That approval process — which involves regulatory inspection, audit sign-off, and often a period of supervised parallel operation — takes longer than most programme plans assume. A programme built on Bank of Ghana approval timelines from 2019 is likely to find 2026 requirements more demanding.
Customer migration friction. Moving customers from old product structures to new ones on a new system requires customer consent, communication, and in many cases, relationship management for business clients who resist any change to their account terms. The long tail of uncontacted or unresponsive customers keeps legacy systems running long past planned decommissioning.
Staff competency gaps. New core banking systems require staff to operate differently — not just use a new interface, but understand new data structures, new exception handling flows, new reconciliation processes. Until staff competency reaches a threshold where supervisors are confident in the new system's operational outputs, parallel running continues. Training timelines for 300 branch staff across 15 locations are consistently underestimated.
The cost of parallel running is not just the direct IT cost of running two systems. It is the management attention, the reconciliation overhead, the compliance reporting burden doubled across two data sources, and the opportunity cost of staff managing parallel processes instead of serving customers.
Staff Retraining Timelines
Business cases typically model training as a pre-go-live event: train staff in months 18–20, go live in month 21, begin extracting efficiency gains in month 22.
The actual trajectory is different.
The productivity dip immediately post-go-live is real, measurable, and regularly ignored in ROI projections. When staff move from a system they have operated for five to ten years to a new interface with new logic, productivity drops — typically 20–40 percent in the first 90 days. Transaction processing times increase. Error rates rise. Exception handling volumes spike. The same operations require more supervisor time.
This dip recovers. But the recovery timeline for a branch network across a mid-tier African bank — where digital literacy varies significantly by geography and where high staff turnover means the training cohort is constantly refreshed with untrained new joiners — is 12–18 months, not 2–3.
The efficiency savings that business cases attribute to Year 2 are frequently real in Year 3, if the organisation sustains the training infrastructure through the full recovery period. Most do not. Training investment drops post-go-live because it is viewed as a launch activity, not an ongoing operational requirement.
Customer Migration Friction
Digital transformation programmes in African banking routinely underestimate the effort required to migrate customers from old channel behaviours to new ones.
The assumption in most business cases is that customers will naturally migrate to digital channels once they are available — and that the efficiency savings come from reduced branch transaction volumes as digital adoption increases. This assumption fails to account for the structural factors that shape customer behaviour in African banking markets:
Trust dynamics around digital channels. In markets where mobile money fraud is well-publicised and where customers have personal experience of system failures during previous bank migrations, digital channel adoption requires active trust-building — not just availability. Customers who experienced system outages during a previous core banking migration at the same institution take longer to adopt digital channels at the next one.
Feature parity gaps. Digital channels launched during a migration programme rarely achieve feature parity with the branch on day one. Complex transactions — trade finance documentation, multi-signatory corporate accounts, collateral management for SME lending — remain branch-dependent until digital workflows are fully developed. The customers with the highest value to the bank are often the last to fully migrate.
Distribution of digital literacy. A bank's customer base in Lagos or Nairobi is not homogeneous. Corporate clients, urban retail customers, and rural SMEs have different starting points for digital adoption. A single migration timeline for the full customer base does not reflect this. Banks that run differentiated migration strategies by segment — with dedicated support for high-value business clients and longer transition periods for digitally underserved customer groups — achieve better outcomes. But the differentiated approach costs more and takes longer than the business case assumed.
How to Model Realistic ROI for Core Banking Migrations
The corrective is not to produce a more pessimistic business case. It is to produce a more honest one — which is, paradoxically, more useful for securing board approval, because it does not set the programme up to fail against its own projections.
Model savings by organisational readiness phase, not by technology delivery milestone. Efficiency savings do not accrue when the technology goes live. They accrue when the organisation has changed enough to operate differently. Map the savings to the change management milestones: first branch network at target competency, customer migration past 60 percent digital, parallel running decommissioned. These are later than go-live. The business case should reflect that.
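As a rough sketch of this phase-gated logic — every month number and dollar figure below is an illustrative assumption, not client data — savings can be attached to readiness milestones so that the run-rate at go-live is zero and the full headline figure only arrives once the organisation has actually changed:

```python
# Illustrative sketch: savings accrue at readiness milestones, not at go-live.
# All milestone months and amounts are assumptions for illustration.

MILESTONES = {
    "branch_competency": 33,          # first branch network at target competency
    "digital_migration_60pct": 36,    # customer migration past 60 percent digital
    "parallel_decommissioned": 39,    # legacy system switched off
}

# Annual savings (USD) attributed to each milestone, summing to the
# $4.2m headline figure from the example business case.
SAVINGS_BY_MILESTONE = {
    "branch_competency": 1_800_000,
    "digital_migration_60pct": 1_400_000,
    "parallel_decommissioned": 1_000_000,
}

def realised_annual_savings(month: int) -> int:
    """Savings run-rate at a given programme month: a milestone's savings
    count only once that milestone has actually been reached."""
    return sum(
        amount
        for name, amount in SAVINGS_BY_MILESTONE.items()
        if month >= MILESTONES[name]
    )

print(realised_annual_savings(21))  # go-live month: 0
print(realised_annual_savings(39))  # all milestones reached: 4200000
```

The point of the structure is that the board sees savings gated on organisational milestones it can verify, rather than a run-rate that silently starts the month the vendor flips the switch.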
Include a parallel running cost envelope as a first-class line item. Budget for 12–18 months of parallel running, not 6. The IT costs are calculable — licensing, infrastructure, reconciliation headcount. Include the management overhead cost as well. Running two systems is not just an IT problem; it is an operational distraction that has a real cost in senior management time.
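A minimal envelope calculation, using assumed monthly costs purely for illustration, shows why the 12–18 month window deserves a first-class line item:

```python
# Illustrative parallel-running cost envelope. Every monthly figure here
# is an assumption to be replaced with the bank's own numbers.
MONTHLY_COSTS_USD = {
    "legacy_licensing": 40_000,
    "legacy_infrastructure": 25_000,
    "reconciliation_headcount": 30_000,  # extra staff reconciling two systems
    "management_overhead": 20_000,       # senior time spent managing two stacks
}

def parallel_running_envelope(months_low: int = 12, months_high: int = 18):
    """Low and high bounds of the parallel-running cost, given a
    12-18 month window rather than the optimistic 6 months."""
    monthly = sum(MONTHLY_COSTS_USD.values())
    return monthly * months_low, monthly * months_high

low, high = parallel_running_envelope()
print(f"${low:,} - ${high:,}")  # $1,380,000 - $2,070,000
```

Under these assumed figures, the envelope runs to seven figures either way — which is exactly why burying it in a contingency line understates the programme's true cost.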
Model the productivity dip explicitly. If staff productivity drops 30 percent in Q1 post-go-live across 300 branch staff, that is a quantifiable cost. It belongs in the business case, not in the optimistic scenario assumptions. Programmes that model the dip explicitly tend to invest in the training infrastructure that reduces its magnitude — because the cost of the dip is visible to decision-makers.
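The dip is straightforward to quantify. A hedged sketch — assuming, for illustration only, 300 branch staff at a fully loaded cost of $1,500 per month, a 30 percent initial dip, and a linear recovery over 12 months:

```python
# Quantifying the post-go-live productivity dip. All inputs are
# illustrative assumptions; substitute the bank's own figures.
STAFF = 300
FULLY_LOADED_MONTHLY_COST_USD = 1_500

def dip_cost(initial_dip: float = 0.30, recovery_months: int = 12) -> float:
    """Cost of lost productive capacity, assuming the dip shrinks
    linearly from `initial_dip` to zero over `recovery_months`."""
    total = 0.0
    for m in range(recovery_months):
        dip = initial_dip * (1 - m / recovery_months)
        total += STAFF * FULLY_LOADED_MONTHLY_COST_USD * dip
    return total

print(f"${dip_cost():,.0f}")  # $877,500 under these assumptions
```

Even under these deliberately simple assumptions the dip is a high-six-figure cost — large enough that once it is on the page, funding the training infrastructure that shortens it becomes an obvious trade.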
Apply a regulatory timeline buffer specific to the local jurisdiction. If the programme requires central bank sign-off for legacy decommissioning, add 6–12 months to whatever the vendor's typical timeline is. Talk to the regulator early. Build the regulatory engagement plan into the programme structure, not as an afterthought.
What "Transformation Readiness" Actually Looks Like
For a mid-tier African bank with 200–500 staff, transformation readiness is a specific condition — not a sentiment, not a leadership aspiration, but a set of observable organisational characteristics.
Leadership clarity on what "transformed" means operationally. Not "we will have a modern digital bank" but "our cost per branch transaction will fall from $X to $Y, our customer onboarding time will fall from 5 days to 48 hours, and our back-office FTE-to-transaction ratio will be Z." If leadership cannot state the operational targets, the organisation cannot align to achieve them.
Middle management engaged, not managed around. The most common failure mode in African bank transformations is a programme that has executive sponsorship and frontline training but has not brought middle management along. Branch managers, operations supervisors, and team leads are the people who determine whether the new system is adopted or worked around. If they have not been consulted in the design, they will protect their teams from the disruption — which looks like compliance and functions as resistance.
HR processes aligned with the new operating model. Performance metrics, job descriptions, and incentive structures still calibrated to the old operating model will produce old operating model behaviour regardless of what technology is running. Banks that realign HR processes in parallel with technology deployment — updating branch performance metrics to reflect digital channel migration targets, adjusting operations team KPIs to reflect new exception rates — see faster adoption. Banks that defer HR alignment until after go-live spend 18 months explaining why behaviour has not changed.
A test environment that mirrors the production context. User acceptance testing conducted in a sanitised environment with power users and project team members will not surface the failure modes that appear in a rural branch with intermittent connectivity and a junior teller under customer pressure. Transformation readiness includes a realistic UAT programme that tests the system in conditions that reflect actual operational context.
The Number That Actually Matters
Most digital transformation business cases lead with the ROI — the payback period, the cost savings, the efficiency ratio improvement. The number that actually determines whether those projections are achievable is not in the business case at all.
It is the change management investment as a percentage of total programme cost.
In programmes I have seen deliver against their business cases, that number is 12–18 percent. In programmes that delivered the technology but not the outcomes, it is 3–5 percent. The correlation is consistent enough that I now use it as a rapid diagnostic: show me the change management budget, and I will tell you whether to believe the ROI projections.
The programmes that miss their ROI targets did not fail because the vendor oversold the technology. They failed because the business case assumed the organisation would change as a side effect of the technology going live. It does not. Change requires investment, structure, and sustained leadership attention — none of which are free, and none of which show up on a vendor's ROI calculator.
Building a transformation business case or reassessing a programme in flight? Request a consultation with Priya.
Priya leads Change Management & Transformation advisory at Aicura Consulting, specialising in digital transformation programme design, organisational readiness assessment, and change management for African and emerging market financial institutions.