The movie Her reminds us of the risks involved when AI models are used in finance. AI tools are evolving rapidly, and financial firms are eager to adopt them as quickly as possible. It is prudent to confine these tools to specific, limited tasks for as long as possible; that will acclimatise users and regulators to their pros and cons.
LLMs and other algorithmic tools pose new challenges: they could collude on prices automatically or break rules, and their operation is opaque and difficult to explain.
The Securities and Exchange Commission monitors these potential risks. The global IT crash caused by CrowdStrike Holdings, a cyber-security firm, is a reminder of the potential pitfalls.
Generative AI and some algorithms fall into a different league altogether. Yet they tend to pursue copycat strategies, exposing the markets to sharp reversals when many models act alike.
The more sophisticated the machines become, the dicier the risks. Algorithms could collude with one another, whether deliberately or accidentally; this is especially likely when reinforcement learning (RL) is used, since agents trained purely to maximise reward can drift into tacitly cooperative behaviour without being instructed to.
There could also be cases of dishonesty. A chatbot trading anonymously could be fed inside information and trade on it, though it is forbidden to do so, and then conceal that fact from its human overseers.
An AI is given a singular objective: maximise profits. It can pursue that objective more clandestinely than human beings can. With the arrival of AI agents, the same task could be performed even more dexterously, with compliance to the letter of the regulations rather than their spirit.
Who is accountable if the machine behaves idiosyncratically? Some lament the loss of control in trading; traders are reduced to DJs who merely play tunes composed by others. Should we hold responsible the trader who funds and employs these tools? The IT department? Or the supplier who provided them?
It is a reminder that we should stay cautious and, like the protagonist of Her, not fall in love with our machines.