Feature

Why the Adoption Rate for Explainable AI in Financial Services Is Expected to Grow

By Phil Britt
Moving to “explainable AI” will remove much of the mystery around AI and, as a result, will drive adoption of more AI-driven services.

People have become familiar with the term artificial intelligence (AI), but many of its users have only a rudimentary understanding of how it actually works. As a result, financial services and many other industries have yet to leverage its full capabilities.

For financial services firms, adoption of explainable AI could lift the use of AI-related technologies from the current rate of 30% to as high as 50% in the next 18 months, according to Gartner analyst and vice president Moutusi Sau. Sau added that lack of explainability is inhibiting financial services providers from rolling out pilots and projects in lending, and from offering more products to the “underbanked”: those who don’t seek banking products or services, many because they don’t think they will qualify. Experts agree that moving to explainable AI will remove much of the mystery around AI and, as a result, drive adoption of more AI-driven services. The global explainable AI (XAI) market is estimated to grow from $3.50 billion in 2020 to $21.03 billion by 2030, according to Research and Markets.

“Explainable AI is a method/technique in the space of artificial intelligence, where the solution can be analyzed and understood by humans. It is different than the traditional machine learning techniques, where developers often fail to understand why the system has arrived at a specific decision,” Research and Markets says, pointing to interest from numerous industries. 
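
The distinction can be made concrete with a small example. The sketch below is a minimal illustration, assuming scikit-learn and invented feature names, of an inherently interpretable model whose decision logic a human can read directly, which is the property explainable AI techniques aim to preserve or recover:

    # Minimal sketch: an inherently interpretable credit model.
    # Feature names and data are invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    feature_names = ["credit_score", "debt_to_income", "years_employed"]

    X = rng.normal(size=(500, 3))  # synthetic, standardized applicant features
    y = (X[:, 0] - X[:, 1] + rng.normal(size=500) > 0).astype(int)  # synthetic approvals

    model = LogisticRegression().fit(X, y)

    # Each coefficient states how a feature pushes the approval odds,
    # so a human reviewer can audit the decision logic directly.
    for name, coef in zip(feature_names, model.coef_[0]):
        print(f"{name}: {coef:+.2f}")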

Others don’t use traditional banks due to a lack of trust, another area that explainable AI could eventually address. For now, though, the drive for adoption is geared more toward helping users, as well as regulators, better understand the use of AI in financial services, according to Gartner.

The AI Trust Factor

“There are a handful of reasons why a financial services company would want its AI to be explainable and the primary reason is trust,” added Clint Lotz, TrackStar.ai president and founder. “We need to be able to explain our AI to our clients, because they are lending consumers money based upon our data. We don’t want to cause more risk to our client, being that it is already a sensitive transaction, so being able to clearly explain what the AI is doing is vital to our success.”

Eventually, explainable AI will work its way down to consumers, giving them a better idea of how AI was used to market a product to them and not to someone else, or to deny an application for a loan or another financial product. For now, though, consumers are given only basic, rudimentary reasons such as low credit scores or high debt-to-income ratios, according to Sau.
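
A hypothetical sketch of what richer, consumer-facing explanations could look like: the feature names, model weights and applicant below are all invented, and the denial “reasons” are simply the largest negative per-feature contributions of a linear scoring model.

    # Hypothetical sketch: deriving consumer-facing "reason codes" for a
    # denial from a linear model's per-feature contributions. Names,
    # weights and the applicant are all invented.
    import numpy as np

    feature_names = ["credit_score", "debt_to_income", "recent_inquiries"]
    weights = np.array([0.8, -1.1, -0.4])          # hypothetical coefficients
    population_mean = np.array([680.0, 0.35, 2.0])
    scale = np.array([60.0, 0.10, 1.5])            # hypothetical standardization

    applicant = np.array([610.0, 0.48, 5.0])

    # Contribution of each feature relative to an average applicant.
    contributions = weights * (applicant - population_mean) / scale

    # The most negative contributions become the denial reasons, mirroring
    # the adverse-action notices lenders already send.
    for idx in np.argsort(contributions)[:2]:
        print(f"Reason: {feature_names[idx]} (impact {contributions[idx]:+.2f})")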

Related Article: What Is Ethical AI and Why Is It Vitally Important?


Historically Slow Adoption of Technology

Financial services firms, however, have been relatively slow to embrace explainable AI due to entrenched legacy infrastructure in lending, mortgage servicing and many other areas. Such slow adoption of new technology is nothing new for the industry. Financial services firms were slow to add the internet, mobile services and other technologies, typically lagging retail and many other industries. The reasons were typically twofold: legacy technology, some of which had yet to be fully depreciated on the books, and regulatory considerations.

Related Article: Finserv: 2 Tips for Balancing Fraud and Customer Experience

Removing Bias From AI

Explainable AI is also important to meet the requirements of regulators, who want to ensure that any use of automated decisioning doesn’t include bias, Sau added. Simple automated decisioning based on credit scores, a person’s income and similar data isn’t an issue; those systems have been in place for decades.

But as machine learning becomes more embedded in extending product offers and in approval and denial decisions, explainability becomes more important from a regulatory standpoint, according to Sau. “When you start using AI without letting regulators know how it works, that’s where your problems begin. They want to be clear that you are not discriminating against anyone, when you're using those algorithms in the decision-making process [for lending, etc.].”

Once adopted, any explainable AI model will need to be adjusted, Sau added. “All deployed models must be monitored for concept drift periodically, and firms must adjust model coefficients accordingly. A well-designed and monitored model reduces systemic bias and helps increase financial inclusion. Although it is not a perfect solution, explainability is a prerequisite to increase diversity and inclusion.”
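
One simple, widely used way to operationalize that monitoring is the population stability index (PSI), which compares the score distribution a model saw at deployment with the current one. The sketch below is an illustration assuming NumPy and synthetic scores, not a technique Sau or Gartner prescribes:

    # Minimal concept-drift check using the population stability index
    # (PSI). Thresholds and scores are illustrative, not a standard.
    import numpy as np

    def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
        """Population stability index between two score samples."""
        edges = np.quantile(expected, np.linspace(0, 1, bins + 1))
        edges[0], edges[-1] = -np.inf, np.inf      # catch out-of-range scores
        e_frac = np.histogram(expected, edges)[0] / len(expected)
        a_frac = np.histogram(actual, edges)[0] / len(actual)
        e_frac = np.clip(e_frac, 1e-6, None)       # avoid log(0)
        a_frac = np.clip(a_frac, 1e-6, None)
        return float(np.sum((a_frac - e_frac) * np.log(a_frac / e_frac)))

    rng = np.random.default_rng(1)
    baseline = rng.normal(650, 50, 10_000)         # scores at deployment
    current = rng.normal(635, 55, 10_000)          # scores this month

    value = psi(baseline, current)
    # A common rule of thumb: PSI above 0.25 suggests the kind of
    # recalibration Sau describes.
    print(f"PSI = {value:.3f}, investigate: {value > 0.25}")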

Related Article: Make Responsible AI Part of Your Company's DNA

Improving Adoption

Sau said that financial services companies that see the value of explainable AI and want to adopt the technology need to take the following steps:

  • Determine the requirements for explainability, including diversity and bias, by consulting with lines of business and legal teams.
  • Ensure stakeholders appreciate the necessity of a trade-off between explainability and accuracy.
  • Create programmatic limits that keep AI systems within acceptable bounds.
  • Employ techniques, such as the use of interpretable models as proxies, to explain an AI model’s decisions and help build trust among internal risk management stakeholders such as the legal, compliance and enterprise risk functions (a minimal sketch follows this list).
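
The last recommendation, an interpretable model used as a proxy (often called a surrogate model), can be sketched in a few lines: fit a shallow decision tree to mimic a black-box model’s predictions, then hand the tree’s rules to reviewers. The sketch below assumes scikit-learn, with a synthetic “black box” and invented features:

    # Sketch of an interpretable proxy (surrogate): a shallow decision
    # tree is trained to imitate a black-box model's outputs so risk,
    # compliance and legal reviewers can inspect its rules.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(2)
    X = rng.normal(size=(1000, 3))                 # synthetic applicant features
    y = (X[:, 0] - X[:, 1] ** 2 + rng.normal(scale=0.3, size=1000) > 0).astype(int)

    black_box = GradientBoostingClassifier().fit(X, y)  # the opaque model

    # Train the surrogate on the black box's predictions, not the labels,
    # so the tree approximates the model rather than the data.
    surrogate = DecisionTreeClassifier(max_depth=3).fit(X, black_box.predict(X))

    fidelity = (surrogate.predict(X) == black_box.predict(X)).mean()
    print(f"Surrogate agrees with the black box on {fidelity:.0%} of cases")
    print(export_text(surrogate, feature_names=["score", "dti", "inquiries"]))

A surrogate only builds trust if it is faithful to the model it explains, which is why the sketch reports the agreement rate alongside the rules.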

About the Author

Phil Britt

Phil Britt is a veteran journalist who has spent the last 40 years working with newspapers, magazines and websites covering marketing, business, technology, financial services and a variety of other topics. He has operated his own editorial services firm, S&P Enterprises, Inc., since the end of 1993. He is a 1978 graduate of Purdue University with a degree in mass communications.
