Why businesses need explainable AI—and how to deliver it
As artificial intelligence informs more and more decisions, companies must ensure their AI systems can be understood both by users and by those affected by the systems' outputs. Action in two areas can maximize AI's benefits and minimize its risks.
Businesses increasingly rely on artificial intelligence (AI) systems to make decisions that can significantly affect individual rights, human safety, and critical business operations. But how do these models derive their conclusions? What data do they use? And can we trust the results?
Addressing these questions is the essence of “explainability,” and getting it right is becoming essential. While many companies have begun adopting basic tools to understand how and why AI models render their insights, unlocking the full value of AI requires a comprehensive strategy. Our research finds that companies seeing the biggest bottom-line returns from AI—those that attribute at least 20 percent of EBIT to their use of AI—are more likely than others to follow best practices that enable explainability. Further, organizations that establish digital trust among consumers through practices such as making AI explainable are more likely to see their annual revenue and EBIT grow at rates of 10 percent or more.
Even as explainability gains importance, it is becoming significantly harder. The modeling techniques that today power many AI applications, such as deep learning and neural networks, are inherently difficult for humans to understand. For all the predictive insights AI can deliver, advanced machine learning engines often remain a black box. The solution isn't simply to find better ways to convey how a system works; rather, it's to create tools and processes that can help even the deep expert understand the outcome and then explain it to others.
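The article does not prescribe specific tools, but one common family of explainability techniques treats the model as exactly the black box described above: you cannot inspect its internals, only call it, so you perturb one input at a time and measure how much the output moves. A minimal sketch in pure Python, where the credit-scoring model, feature names, and values are all hypothetical:

```python
def perturbation_importance(model, features, delta=0.1):
    """Crude local sensitivity analysis: how much does the black-box
    output change when each feature is nudged by `delta`?"""
    baseline = model(features)
    scores = {}
    for name, value in features.items():
        perturbed = dict(features)          # copy, so other features stay fixed
        perturbed[name] = value + delta
        scores[name] = abs(model(perturbed) - baseline)
    return scores

# Hypothetical "black box": a scoring model we can only call, not inspect.
def opaque_model(f):
    return 0.7 * f["income"] - 0.2 * f["debt"] + 0.01 * f["age"]

applicant = {"income": 5.0, "debt": 2.0, "age": 40.0}
scores = perturbation_importance(opaque_model, applicant)
# The attribution recovers that income moves the score most, then debt, then age.
```

Production-grade libraries refine this basic idea considerably (sampling many perturbations, weighting them, handling feature interactions), but the principle is the same: an explanation is derived from the model's behavior, not its internals, which is why it works even for deep networks.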
To shed light on these systems and meet the needs of customers, employees, and regulators, organizations need to master the fundamentals of explainability. Gaining that mastery requires establishing a governance framework, putting in place the right practices, and investing in the right set of tools.