After governments, which mainly use artificial intelligence (AI) for military and security applications, the banking and financial industry is among the leading sectors adopting AI and machine learning. The financial services industry (FSI) has always generated and stored large volumes of structured data that, according to a recent McKinsey survey, can become far more usable and valuable with the integration of AI.
Robots created by US company Portware autonomously choose the best investing strategy by analysing volatility and order flows in the financial markets. The iSentium neural system analyses web conversations to build a market sentiment signal for trading. These cases are only some of the many examples of how financial markets are embracing AI. Digital brains can scour and use big data far faster than any human ever could. This growing computational power has made it possible to store and analyse enormous volumes of financial transactions and other unstructured data in seconds. That, in turn, opens up previously unexplored and innovative applications and ideas.
AI does not refer to any single process or program; the term is used loosely to describe various technologies. Robotic process automation aims to replace manual handling of repetitive, high-volume tasks, while machine learning and deep learning use statistical techniques to give computer systems the ability to “learn” from data without being explicitly programmed. Cognitive analytics, meanwhile, might be the closest thing to the popular conception of AI: it tries to mimic the human brain in making deductions from vast amounts of unstructured data.
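The contrast between explicit programming and “learning” from data can be made concrete with a toy example. The sketch below is purely illustrative (the data points and the volume-to-price-move relationship are made up, not drawn from any real market): the rule that maps input to output is estimated from observations via least squares rather than hand-coded.

```python
# Toy illustration of "learning" from data: instead of hand-coding a
# rule, we estimate it from observations using least-squares fitting.
# All data points below are hypothetical.

def fit_line(xs, ys):
    """Return the slope and intercept that minimise squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical observations: trading volume vs. price move
volumes = [1.0, 2.0, 3.0, 4.0, 5.0]
moves = [2.1, 3.9, 6.2, 7.8, 10.1]

slope, intercept = fit_line(volumes, moves)
print(f"learned rule: move ~= {slope:.2f} * volume + {intercept:.2f}")
```

The program never contains the rule itself; it derives one from whatever data it is shown, which is the essential difference from a hand-written `if/else` procedure.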
Sidelining Irrational Actors
Leaving investing decisions to robots may help overcome the problem of irrational investors. A recent branch of economics, known as behavioural finance, refutes the idea that rationality lies at the centre of decisions made by consumers or investors. According to this set of theories, cognitive and emotional biases have a far more profound impact on individual decision-making, leading to irrational investing choices.
According to research by Opimas, FSI investments in AI-related technologies amounted to around $1.5bn in 2017. The trend is forecast to grow, with the sum predicted to top $2.8bn by 2021. It is no coincidence that the Financial Stability Board (FSB) emphasises how massive funds, managing more than $10bn, make extensive use of algorithms and machine and deep learning.
Embracing AI does have some drawbacks. Algorithms can go rogue: they can malfunction or behave contrary to their intended purposes, resulting in incorrect orders being placed and potentially tremendous losses. Several such cases have been covered in the press in recent years. In 2012, a Knight Capital algorithm went out of control and lost the firm $440m in only a few minutes: for about 45 minutes after the markets opened, the firm’s computers rapidly bought and sold millions of shares well outside the market price in more than a hundred stocks. Algorithms run at very high speed, and detecting these problems quickly can be very challenging.
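One common safeguard against this failure mode is a pre-trade sanity check that rejects orders priced too far from the prevailing market. The sketch below is a simplified, hypothetical illustration; the function name, the order fields, and the 5% band are assumptions for the example, not any firm’s actual controls.

```python
# Hypothetical pre-trade risk check: reject any order whose limit price
# deviates from the last traded price by more than a fixed band.
# The 5% threshold is an illustrative assumption.

MAX_DEVIATION = 0.05  # allowed band: 5% around the last traded price

def check_order(limit_price, last_market_price, max_deviation=MAX_DEVIATION):
    """Return True if the order price is within the allowed band."""
    deviation = abs(limit_price - last_market_price) / last_market_price
    return deviation <= max_deviation

# A runaway algorithm posting far outside the market gets blocked:
print(check_order(limit_price=151.0, last_market_price=150.0))  # True
print(check_order(limit_price=210.0, last_market_price=150.0))  # False
```

Real trading systems layer many such controls (position limits, message-rate throttles, kill switches), but even this minimal check would stop orders "well outside the market price" of the kind described above.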
Moreover, it can sometimes be difficult to understand how AI tools work. Consider the so-called “black box” problem, taking a neural network as an example. In such a system, an input is processed by a node and then passed in parallel to other nodes; the resulting output is fed to the next layer, where the process is repeated. Layer by layer, the network arrives at a final output that should be coherent with the initial input.
The problem is that we can see the initial input and the final output, but not the intermediate steps taken autonomously by the network. This may be a crude simplification, but it helps convey the overall complexity of the topic, and this opacity can keep us from trusting AI tools. To give AI technologies a sense of maturity, we will need to implement methods to check, debug, and understand the decision-making process of machines.
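The layered process described above, and one simple way to peek inside it, can be sketched as follows. This is a minimal toy network with made-up weights, not a real trading model; the point is that the intermediate activations do exist and can be recorded for inspection, even if interpreting them remains hard.

```python
import math

# Minimal two-layer feedforward network with fixed, made-up weights.
# Each layer computes: output_j = sigmoid(sum_i w[j][i] * input_i)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)))
            for row in weights]

def forward(inputs, all_weights):
    """Run the network, recording every intermediate activation."""
    trace = [inputs]
    activations = inputs
    for weights in all_weights:
        activations = layer(activations, weights)
        trace.append(activations)  # the steps a black box would hide
    return activations, trace

# Hypothetical weights for a 2-input, 3-hidden, 1-output network
weights = [
    [[0.5, -0.2], [0.1, 0.9], [-0.3, 0.4]],  # input -> hidden
    [[0.8, -0.5, 0.2]],                      # hidden -> output
]

output, trace = forward([1.0, 0.5], weights)
for i, acts in enumerate(trace):
    print(f"layer {i}: {[round(a, 3) for a in acts]}")
```

Logging the trace is the easy part; the open research problem is turning those numbers into an explanation a regulator or an investor would accept.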
Lastly, AI could bring standardisation to investing strategies. Eventually, one algorithm could rule them all and be adopted by all players, with institutional and private investors converging on the same variables to analyse. The risk is herding behaviour, which could amplify market bubbles and crashes.
The Human Role
AI-related techniques have not yet reached maturity, and there is not yet a full understanding of all their possible uses and implementations. Financial firms are still learning which use cases could deliver the most value for them. In the meantime, the biggest players in FSI are investing a significant part of their annual budgets in developing algorithms, machine learning systems, neural networks, and the like, while smaller trading firms and individual investors are unable to keep pace.
Although many areas still need to be tackled to increase the acceptance and adoption of AI, from stricter government regulation and compliance to monitoring that prevents failures, anomalies, and other biases in data and algorithms, humans will always have a critical role to play, both in designing an AI application and in any final decision taken on the relevant business outcome. This, in turn, suggests that AI will most often serve as an input to a decision made by a human rather than as a substitute for it.