In banking, bias reveals itself in subtle ways. Customers may encounter an obstacle and discover that discrimination lies at its root, whether because of their race, age, religion, gender, sexual orientation, military service, or citizenship.
I am proud to present the PositivityTech® intelligent platform’s proprietary Bias Index, a new tool that identifies prejudice within customer and employee complaints and makes it possible to respond to systemic discrimination.
Why did you decide to develop the Bias Index?
Each one of us has faced unequal treatment at some point in our lives, and it is wrong that discrimination continues to play a role in our banking. For example, in December, the media revealed that a black customer and a black employee at JPMorgan Chase were unable to gain access to the same opportunities as their white peers. As a result, the bank implemented mandatory diversity training and said that it “would pay more attention to employee complaints.” Today, the pandemic continues to highlight issues of discriminatory banking practices against minority business owners. While banks are doing their best to weed out discrimination, it continues to rear its ugly head.
Customers tell us what we need to know, but often indirectly. I created the Bias Index because I understand the value of customer voice data, I have expertise in the advanced analytics capabilities needed to extract this intelligence, and I am passionate about eliminating bias from our institutions. Bias is wrong and there is no place for it in our institutions’ decision-making processes.
Financial institutions need to address issues of bias in order to enforce fair banking practices, establish safe and sound lending practices, ensure compliance, and prevent lawsuits.
How does the Bias Index pinpoint discrimination?
The Bias Index is an AI predictive model that analyzes the full context of a complaint to reveal its root cause, allowing financial institutions to focus on repairing faulty products or unjust practices.
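The platform’s actual model is proprietary and not described here. As a rough, invented illustration of the general technique (a text classifier over complaint narratives), here is a tiny naive Bayes sketch using only the Python standard library; all training examples, labels, and predictions are made up for illustration.

```python
import math
from collections import Counter

def tokenize(text):
    """Lowercase and strip surrounding punctuation from each word."""
    return [w.strip(".,!?'\"").lower() for w in text.split()]

class NaiveBayes:
    """Tiny multinomial naive Bayes over complaint narratives."""

    def __init__(self):
        self.word_counts = {"bias": Counter(), "other": Counter()}
        self.doc_counts = Counter()

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def score(self, text, label):
        # Log prior from class frequencies.
        log_prob = math.log(self.doc_counts[label] / sum(self.doc_counts.values()))
        vocab = set(self.word_counts["bias"]) | set(self.word_counts["other"])
        # Laplace-smoothed per-word likelihoods.
        denom = sum(self.word_counts[label].values()) + len(vocab)
        for word in tokenize(text):
            log_prob += math.log((self.word_counts[label][word] + 1) / denom)
        return log_prob

    def predict(self, text):
        return max(("bias", "other"), key=lambda lbl: self.score(text, lbl))

# Toy training data, invented for illustration only.
model = NaiveBayes()
model.train("denied loan because of my race despite good credit", "bias")
model.train("agent mocked my accent and refused to help", "bias")
model.train("monthly statement arrived late twice", "other")
model.train("overdraft fee charged after deposit posted", "other")

print(model.predict("branch staff dismissed me because of my accent"))
```

A bag-of-words baseline like this ignores word order, which is exactly why a production system aimed at subtle bias would need a model that captures context rather than isolated keywords.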
Using the PositivityTech platform, financial institutions can uncover customer complaints that shed light on disparate treatment and its impact. Here are two examples of such complaints:
“I received a call from an unknown collection agency stating they were a collector for X… The representative on the phone had a very intimidating and condescending tone and threatened to garnish my wages if I did not make a payment that moment… My recent knowledge of X’s class action lawsuit gave me the courage to speak up about the injustice I faced as a military service member. I believe I was subjected to illegal collection practices from X’s use of administrative offset — a procedure whereby federal payments, such as social security, veterans’ benefits, and tax refunds are withheld to collect debt.”
“The employees refused to be sensitive to my pronoun and name change. As a result, my account was closed after years of torture from this credit card company. I’m still left with a punishment and a persistent reminder.”
With a well-designed and implemented program of self-testing, financial institutions can safeguard their customers. Yet, to truly prevent discrimination, banks need to do more than self-testing. With its proprietary tools, including the newly developed Bias Index, the PositivityTech platform pinpoints the root cause of discrimination based on customers’ voices, predicts future unfair actions to enable better decision-making, helps prevent systemic discrimination, and proactively strengthens your institution.
Why is it normally so challenging for institutions to identify biases and systemic discrimination?
Interestingly, fewer than 0.02% of CFPB complaints explicitly identify discrimination as the reason for the complaint. Context matters. Subtle bias may instead be buried in the description of the financial institution’s actions, or of an interaction with one of its employees.
Furthermore, while the PositivityTech platform’s proprietary Severity Score captures the overall frustration within customer complaints, detecting subtle bias requires more specialized analysis. Complaints that contain words like “decency” and “bias” are 30-40% more severe than the average complaint. However, many of these complaints turn out not to involve prejudice at all, so keyword matching alone is not enough.
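To make that gap concrete, here is a stdlib-only sketch of the kind of measurement involved; all severity scores, complaint snippets, and labels below are invented, so the numbers illustrate the pattern rather than reproduce the platform’s figures.

```python
# Invented (severity, text, truly_biased) triples for illustration only.
complaints = [
    (9.0, "no decency in how the agent treated me over my religion", True),
    (8.5, "clear bias against my veteran status in the loan decision", True),
    (8.0, "no decency at all, kept me on hold for three hours", False),
    (4.0, "card replacement took two weeks", False),
    (4.0, "mobile app logged me out repeatedly", False),
    (5.0, "fee reversal promised but never processed", False),
]

KEYWORDS = {"decency", "bias"}

def has_keyword(text):
    """Flag a complaint if any whitespace-separated word is a keyword."""
    return any(word in KEYWORDS for word in text.split())

matched = [c for c in complaints if has_keyword(c[1])]
avg_all = sum(c[0] for c in complaints) / len(complaints)
avg_matched = sum(c[0] for c in matched) / len(matched)

# Keyword-matched complaints do score higher on average...
uplift = (avg_matched - avg_all) / avg_all
# ...but some matched complaints have nothing to do with prejudice.
false_positives = [c for c in matched if not c[2]]

print(f"severity uplift: {uplift:.0%}")
print(f"false positives among matched: {len(false_positives)}")
```

In this toy data the keyword-matched complaints show a severity uplift in the same ballpark as the figure above, yet one of the three matches is an ordinary service complaint, which is the over-selection problem that a context-aware model has to solve.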
With the PositivityTech platform’s Bias Index, we are activating the science of transforming negatives to positives in timely and impactful ways. If you are interested in learning more about how the Bias Index can help you identify and weed out systemic discrimination in your financial institution, please get in touch with me at firstname.lastname@example.org. I look forward to hearing from you.