To reduce legal costs, the US bank JPMorgan Chase will compile data to identify at-risk employees before they act.
Science fiction has arrived in banking. The financial giant JPMorgan Chase is currently deploying a computer program designed to identify rogue employees before they act, according to the news agency Bloomberg. The software takes multiple data points into account: whether employees skip compliance training, whether they breach the rules on trading for their own account, whether they push market risks too far…
“It’s very difficult to monitor hundreds of data points and speculate about a particular trader or team,” Sally Dewar, head of regulation for Europe, who oversees the project, told Bloomberg. “The idea is to refine these data to predict patterns of behavior.”
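The article names three kinds of signals the program monitors. As a purely illustrative sketch (the report does not describe JPMorgan's actual model, and every name, weight, and threshold below is invented), combining such signals into a single score might look like this:

```python
# Illustrative only: a toy risk scorer over the signal types the article
# mentions (missed compliance training, own-account trading breaches,
# market-risk limit usage). Weights and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class EmployeeSignals:
    missed_compliance_trainings: int   # training sessions skipped this year
    personal_account_breaches: int     # rule violations on own-account trades
    risk_limit_utilization: float      # fraction of market-risk limit used

def risk_score(s: EmployeeSignals) -> float:
    """Weighted sum of signals; higher means more scrutiny is warranted."""
    return (0.3 * s.missed_compliance_trainings
            + 0.5 * s.personal_account_breaches
            + 0.2 * min(s.risk_limit_utilization, 2.0))  # cap extreme values

def flag_for_review(s: EmployeeSignals, threshold: float = 1.0) -> bool:
    """Flag the employee when the combined score crosses the threshold."""
    return risk_score(s) >= threshold

trader = EmployeeSignals(missed_compliance_trainings=2,
                         personal_account_breaches=1,
                         risk_limit_utilization=1.4)
print(flag_for_review(trader))  # prints True: 0.6 + 0.5 + 0.28 = 1.38
```

A real system of this kind would learn such weights from historical cases rather than hard-code them; the sketch only shows the shape of the idea Dewar describes, refining many data points into a behavioral flag.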
The monitoring program is being tested in the trading division and will be extended to investment banking and asset management in 2016. JPMorgan did not say what action would be taken if doubts were raised about an employee.
Sued over financial scandals, the big banks have seen their profitability erode under increasingly heavy fines in recent years. Regulators accuse them of failing to adequately monitor or control the dubious practices of their own employees. Since the financial crisis, JPMorgan Chase has spent more than $36 billion in legal fees. In 2013 in particular, the group paid a fine of $920 million (€679 million) to settle disputes over losses from risky trades by the French trader dubbed “the London Whale”.
Banks are therefore relying increasingly on self-regulation, monitoring their traders’ emails and telephone conversations via computer programs capable of processing large amounts of data. “We use technologies developed for counter-terrorism on human language, because that is where each person’s intentions are revealed,” Tim Estes, CEO of Digital Reasoning Systems, whose clients include Goldman Sachs and Credit Suisse, told Bloomberg. “If you want to be proactive, you need to intercept people before they act.”
Sally Dewar nevertheless acknowledges that the software will never be 100% reliable, because the human element carries irreducible risks. It doubtless raises ethical questions as well.