
Banking GenAI decisions must be fully explainable to maintain customer trust.
Today, innovative banks are pushing ahead with AI-driven business transformation. Many already have 'Bot Libraries' for specific tasks, such as translation, document summarisation and report writing. At this low risk level, staff teams can make useful efficiency savings. This is perhaps the low-hanging fruit for GenAI in most organisations.
However, the risk for banks could grow exponentially in future, as the analysis of financial information is used to derive decisions that directly impact customers.
The governance capabilities that all banks will need for GenAI applications will depend on the ability to explain AI-derived decision making at a data level.
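To make "explanation at a data level" concrete, here is a deliberately simple sketch using a linear credit-scoring model with hypothetical weights and applicant features (all names and values are illustrative assumptions, not a real bank model). For a linear model, each feature's contribution to the decision can be read off exactly; the point of the challenge discussed below is that modern GenAI models offer no such clean decomposition.

```python
# A minimal, hypothetical sketch of data-level explainability.
# WEIGHTS, BIAS and the applicant features are invented for illustration.
import math

WEIGHTS = {"income": 0.8, "debt_ratio": -1.5, "late_payments": -0.9}
BIAS = 0.2

def score(applicant):
    """Linear score passed through a sigmoid, giving an approval probability."""
    z = BIAS + sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def explain(applicant):
    """Per-feature contribution to the raw score (weight * value).
    For a linear model this is an exact data-level decomposition
    of the decision; for a large generative model it is not."""
    return {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}

applicant = {"income": 1.2, "debt_ratio": 0.6, "late_payments": 1.0}
probability = score(applicant)
contributions = explain(applicant)
```

Here the bank could tell the customer exactly how much each input moved the decision. No equivalent exact decomposition exists for a large language model, which is the gap the governance capabilities above must somehow bridge.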

Explainability and understandability are both essential: explainability builds trust through transparency, while understandability helps that trust translate into human experience.
Clearly, the challenge of explaining the mathematics behind an AI-derived decision goes far beyond understanding the same decision at a banking-process level.
Evolving and promoting meaningful GenAI banking industry standards, which can be used to benchmark compliance, may be the way to maintain customer trust even where full explainability cannot be achieved. However, this will not be an easy sell to customers.