A neural network is helping credit card users continue to shop even when the lender’s credit-approval network goes down.

What’s new: Visa developed a deep learning system that analyzes individual cardholders’ behavior in real time to predict whether credit card transactions should be approved or denied. The system can step in when a card issuer — generally a bank that normally would vet such transactions — suffers a network outage that makes it impossible to assess creditworthiness.

How it works: If a cardholder’s purchases are blocked, they might switch to another card, costing the bank revenue and possibly a customer. And if a miscreant tries to commit fraud, the bank stands to lose money. So Visa provides a backup system that predicts the lender’s decision when the lender can’t make one due to software glitches, severe weather, or routine maintenance.

  • The new model is trained on the company’s database of historical transactions. It learns an individual’s normal behavior based on factors like spending history, location, and timing of transactions.
  • In tests, it matched banks’ decisions with 95 percent accuracy. An earlier, rule-based algorithm was half as accurate, according to a report by the Wall Street Journal.
  • Visa plans to make the service available for a fee starting in October.
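Visa hasn’t published details of its model, but the idea of a standby approver that learns a cardholder’s normal behavior can be sketched in a few lines. The toy code below is purely illustrative: the function names (`profile`, `standby_approve`), the features (amount and city standing in for spending history, location, and timing), and the thresholds are all hypothetical, not Visa’s.

```python
from statistics import mean, stdev

# Illustrative sketch only -- Visa's actual system is a deep learning
# model trained on its transaction database. This toy stand-in builds a
# per-cardholder behavior profile and approves a new transaction when it
# looks typical of that cardholder's history.

def profile(history):
    """Summarize a cardholder's past transactions (amount, city)."""
    amounts = [t["amount"] for t in history]
    return {
        "mean": mean(amounts),
        "stdev": stdev(amounts) if len(amounts) > 1 else 0.0,
        "cities": {t["city"] for t in history},
    }

def standby_approve(txn, prof, z_max=3.0):
    """Decline only when the amount is a strong outlier AND the
    location is one the cardholder has never used."""
    spread = prof["stdev"] or 1.0  # avoid division by zero
    z = abs(txn["amount"] - prof["mean"]) / spread
    unusual_place = txn["city"] not in prof["cities"]
    return not (z > z_max and unusual_place)

history = [
    {"amount": 40.0, "city": "Oakland"},
    {"amount": 55.0, "city": "Oakland"},
    {"amount": 48.0, "city": "Berkeley"},
]
prof = profile(history)
print(standby_approve({"amount": 52.0, "city": "Oakland"}, prof))  # typical purchase
print(standby_approve({"amount": 900.0, "city": "Lagos"}, prof))   # outlier amount, unseen city
```

A real system would replace these hand-set rules with a learned model, which is what reportedly let Visa’s approach roughly double the accuracy of its earlier rule-based algorithm.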

Why it matters: Unlike, say, fraud detection, this model touches cardholders directly to improve the customer experience. It points the way toward public-facing models that personalize banking, credit, and other financial arrangements.

Yes, but: Visa declined to share details of its new algorithm with The Batch. Decisions to extend credit can be based on patterns in data that encode social biases, and an algorithm trained on a biased dataset will reflect those biases. For instance, an algorithm may decline transactions requested by a cardholder whose home address is in a neighborhood associated with defaults on loans, and accept those requested by someone with a comparable history of repayment who lives in a wealthier neighborhood. Large financial institutions are aware of this problem, but standards that specify what is and isn’t fair are still in development.

We’re thinking: The financial industry’s health depends on trust. That should provide ample incentive to define the fairness of automated systems in lending and other financial services. Efforts such as Singapore’s Principles to Promote Fairness, Ethics, and Transparency are an important step.
