Computational Finance

The aim of this white paper is to present the business opportunities afforded by recent progress in financial modeling and advanced technological solutions, as well as the regulatory mechanisms that must be put in place to mitigate the risk of uncontrolled volatility in capital markets.

Introduction:

“I think the machines just took over. There’s not a lot of human interaction.” This conclusion was offered by Charlie Smith, Chief Investment Officer at Fort Pitt Capital Group, after the May 6, 2010 stock market crash, when the Dow Jones suffered the largest intraday plunge in its history. Smith added: “Whatever started the selloff, automated computer trading intensified the losses. The selling only led to more selling as prices plummeted and traders tried to limit their losses.” Following the crash, what had been known only to a limited circle of financial trading specialists was suddenly disclosed to a wider audience: new mathematical algorithms, the latest high-performance computer systems, and high-frequency trading are taking over from human stockbrokers [1]. It is estimated that at the New York Stock Exchange (NYSE), more than 60 percent of trades are currently carried out without direct human intervention.

Since the introduction of computers and the rise of electronic markets in the 1970s, finance has evolved clearly and irrevocably toward ever more sophisticated trading mechanisms. Over three decades, scientific and technological advances have drawn a virtuous circle around the field, boosting marketplaces and their actors with silicon computational power. In the early 2000s, the biggest banks and hedge funds developed and expanded algorithmic trading. Complex investment strategies and orders, which would previously have required several days and experienced specialists to process, could now be settled in minutes or hours, with almost no human interaction. Beyond pure technological developments, exogenous events, such as decimalization and regulatory changes, supported the trend toward automation, allowing the invention of new mathematical models and algorithms that could take advantage of the microstructure of the newly digitized financial markets.

The current focus is on so-called high-frequency trading (HFT), a sub-class of algorithmic trading strategies that takes advantage of high-performance computing and low-latency communication networks. Hundreds of stocks are bought and sold in the blink of an eye, with some orders lasting only a handful of microseconds. Recent events, such as the May 6, 2010 flash crash, made high-performance trading headline news and led people, from the casual newspaper reader to regulatory institutions, to try to figure out just how prevalent HFT really is. The fact is, no one really knows.

It is not known how much of the almost €40 trillion [2] in global market capitalization and the more than 12 billion shares traded across the world in 2010 came from algorithmic trading, nor to what extent HFT was used. Firms that use such techniques do not provide quantitative information about their computational strategies, mainly because trades are not tagged ‘algo inside’; experts could also carry out reverse engineering, thereby annihilating a competitor’s advantage. However, analysts [3, 4, 5] and academics [6, 7] are conducting studies which draw a clearer picture:

• More than three-quarters of trading companies and banks in the US use algorithmic trading.
• Beyond western countries, marketplaces in Asia and Brazil, for example, have become eligible for HFT (due to regulatory developments, infrastructure investments, etc.).
• HFT is so far practiced by a minority of players, but represents between 50 and 75 percent of global trade volume.
• The maximum estimated benefit per year for HFT players is around €15bn, with a more realistic target of €2bn per year.

The gains to be had are clearly substantial, leading hedge funds (JPMorgan, Paulson & Co.), trading firms (Optiver, Getco), and market makers (Goldman Sachs, Merrill Lynch) to keep investing to ensure they stay at the cutting edge. At the base of the pyramid, mathematicians create new models and strategies drawing on fractal theory or artificial intelligence [8, 9]. Further up, computer scientists enhance algorithms to take advantage of cloud computing and high-performance computing [10]. Network specialists then improve the infrastructure to speed up communication [11]. Lastly, at the top, financial analysts improve their predictions using simulations and new data sources, such as blogs or even opinions [12].

Algorithmic trading and HFT are two current branches of computational finance where technological excellence meets intellectual expertise. However, more and more people allege that algorithmic trading, and HFT in particular, is insidiously affecting market microstructure and dynamics. This is more a strong conviction than an established fact, but evidence seems to support it. Firstly, the nanosecond rush to the lowest possible latency means systems must be plugged directly into the venue where the trade is executed, creating a de facto bias in fair access to stock exchanges. Secondly, for competitive reasons, trading strategies are black boxes, which makes the work of regulatory institutions hard despite the huge overall volume of trade.

This opacity is exacerbated by the presence of dark pools in the market ecosystem, which keep orders anonymous. Lastly, recent work shows that HFT, because of its consumption of liquidity and its reliance on numerous tiny trades, slightly increases the equity market’s index of long-range dependence, also known as its Hurst exponent [13]. This is not anecdotal: the index varies between 0 and 1 and equals 0.5 when market motion is fully random, which is a classical hypothesis of market-modeling theoreticians. If the value rises above 0.5, the randomness hypothesis no longer holds: an increase in value will probably be followed by another increase, and a decrease by another decrease. Computational finance, algorithmic trading, high-frequency trading: a Matryoshka (Russian nesting) doll whose impact on the market and its structure is significant. Regulators are tackling the situation hands-on, with questions such as: should there be a dedicated legal framework? Are dark pools acceptable? Should marketplaces align their infrastructures? The regulation initiative is broad and complex, mainly because finding a cure for a potential disorder requires a full understanding of the phenomenon, and studies are still in progress. The next chapter of this white paper presents the mathematical background of computational finance to help the reader better understand what is at the core of this industry.
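To make the Hurst exponent concrete before that, below is a minimal Python sketch of the classic rescaled-range (R/S) estimator. It is an illustrative toy, not the methodology of [13]; the dyadic window sizes and the synthetic random-walk data are assumptions.

import numpy as np

def hurst_rs(returns, min_window=8):
    """Estimate the Hurst exponent of a return series via rescaled-range
    (R/S) analysis. H ~ 0.5 suggests a random walk; H > 0.5 suggests
    persistence (moves tend to be followed by moves in the same direction)."""
    returns = np.asarray(returns, dtype=float)
    n = len(returns)
    sizes, rs_means = [], []
    window = min_window
    while window <= n // 2:                      # dyadic window sizes: an assumption
        rs_values = []
        for start in range(0, n - window + 1, window):
            chunk = returns[start:start + window]
            deviations = np.cumsum(chunk - chunk.mean())  # cumulative deviate series
            r = deviations.max() - deviations.min()       # range of the deviations
            s = chunk.std()                               # scale of the window
            if s > 0:
                rs_values.append(r / s)
        sizes.append(window)
        rs_means.append(np.mean(rs_values))
        window *= 2
    # H is the slope of log(R/S) against log(window size)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

# Pure white noise should yield an estimate in the vicinity of 0.5
rng = np.random.default_rng(42)
print(f"Estimated Hurst exponent: {hurst_rs(rng.normal(size=4096)):.2f}")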

Mathematical Concepts:

Practitioners of computational finance use various methods which rely on mathematical models and numerical techniques to make trading, hedging, and investment decisions, as well as to facilitate the risk management of those decisions. These mathematical concepts and tools are the very basis of algorithmic trading and HFT. Below are a few elements of a typical workflow; a more in-depth view can be found in the appendix.

• Modeling: As the first step of any computational work on financial data, modeling is critical. It consists of constructing an abstract representation of an asset, an instrument, or a data flow which provides the best (or the least bad) tradeoff between simplification (to make calculations tractable) and accuracy (to make results meaningful). A broad variety of models exist, based on probability theory, fractal theory, or even physics. Defining ever better models is an everyday challenge for academics and financial institutions.
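As a concrete illustration of that tradeoff, here is a minimal Python sketch of one of the most classical models, geometric Brownian motion, which trades realism (no jumps, constant volatility) for tractability; the drift, volatility, and horizon values are arbitrary assumptions.

import numpy as np

def simulate_gbm(s0, mu, sigma, horizon, steps, n_paths, seed=0):
    """Simulate geometric Brownian motion price paths:
    dS = mu*S*dt + sigma*S*dW. Log-returns are normal under GBM,
    so the exact discretization below has no time-stepping error."""
    rng = np.random.default_rng(seed)
    dt = horizon / steps
    shocks = rng.normal(size=(n_paths, steps))
    increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * shocks
    log_paths = np.cumsum(increments, axis=1)
    start = np.zeros((n_paths, 1))               # every path begins at s0
    return s0 * np.exp(np.hstack([start, log_paths]))

paths = simulate_gbm(s0=100.0, mu=0.05, sigma=0.2, horizon=1.0, steps=252, n_paths=10_000)
print("Mean terminal price:", paths[:, -1].mean())  # close to s0 * exp(mu * horizon)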

• Forecasting: Although an accurate financial model can help explain the behavior of a stock in the past, the main interest is in anticipating how it will perform in the future, and that means forecasting. One of the key indicators used by analysts is volatility, which measures how broad the variation of a price could be. Alongside volatility is liquidity which, in a very metaphoric way, indicates how fluid a market is. Why fluid? Because stocks are like water drops which flow more or less easily from sellers to buyers who have forecast a gain in the negotiation. The more liquidity, the more trades in the market, and forecasting is one of the faucets that controls this flow.
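As a minimal sketch of a volatility estimate, the snippet below applies an exponentially weighted moving average to log-returns; the lambda value (a common daily-data convention) and the toy price series are assumptions, and production forecasters use far richer models.

import numpy as np

def ewma_volatility(returns, lam=0.94):
    """Exponentially weighted volatility: recent squared returns weigh
    more than old ones, so the estimate adapts to regime changes."""
    variance = returns[0] ** 2                  # seed with the first observation
    for r in returns[1:]:
        variance = lam * variance + (1 - lam) * r ** 2
    return np.sqrt(variance)

prices = np.array([100.0, 101.2, 100.7, 102.3, 101.9, 103.0])   # toy series
log_returns = np.diff(np.log(prices))
daily_vol = ewma_volatility(log_returns)
print(f"Annualized volatility: {daily_vol * np.sqrt(252):.1%}")  # 252 trading days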

• Risk and portfolio management: Modeling and forecasting are difficult tasks. Markets are complex systems mixing rational facts (a company’s results, its investments, etc.), irrationality (panic after a rumor, excessive enthusiasm about a technology, etc.), and a good dose of uncertainty (an earthquake shutting down industry in an entire country). Risk is thus omnipresent, and indicators such as volatility help investors steer around the pitfalls. Typically, many studies have been conducted to define methods for balancing risky positions with safer ones: portfolio management. The problem is multivariate in that a company’s performance, the competition, national policy, the weather, and much more are all part of the equation.
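To illustrate the balancing of risky and safer positions, here is a sketch of the textbook minimum-variance portfolio, a Markowitz-style closed form rather than any firm’s proprietary method; the covariance matrix is a made-up example and short-sale constraints are ignored.

import numpy as np

def min_variance_weights(cov):
    """Closed-form minimum-variance portfolio: w = C^-1 1 / (1' C^-1 1),
    i.e. the fully invested mix with the smallest overall variance."""
    ones = np.ones(cov.shape[0])
    raw = np.linalg.solve(cov, ones)            # solve C w = 1 instead of inverting C
    return raw / raw.sum()                      # normalize weights to sum to one

# Hypothetical annualized covariance of three assets (safe, medium, risky)
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
print("Weights:", w.round(3), "| Portfolio variance:", float(w @ cov @ w))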

Whether for modeling, forecasting, or risk management, technology mixes with science. Most financial decisions rely on numerical methods and simulations running on computers or in high-performance computing (HPC) centers. Frameworks based on these methods, such as RiskMetrics, developed by JPMorgan and Reuters, offer financial market participants a set of tools to estimate their exposure to risk. The next chapter offers a quick glimpse at technology, and particularly at how unstructured data could be used for optimal financial decision making.
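Before moving on, here is a sketch of the kind of risk-exposure figure such frameworks produce: a one-factor Monte Carlo Value-at-Risk under a normal-returns assumption. This is a toy, not RiskMetrics itself, and every parameter below is assumed.

import numpy as np

def monte_carlo_var(value, mu, sigma, horizon_days, confidence=0.99, n_sims=100_000, seed=1):
    """Simulate portfolio P&L over the horizon and read off the loss
    quantile: the level of loss exceeded with probability 1 - confidence."""
    rng = np.random.default_rng(seed)
    dt = horizon_days / 252                      # fraction of a trading year
    returns = rng.normal(mu * dt, sigma * np.sqrt(dt), size=n_sims)
    pnl = value * returns
    return -np.quantile(pnl, 1 - confidence)     # positive number = loss

var_99 = monte_carlo_var(value=1_000_000, mu=0.05, sigma=0.2, horizon_days=10)
print(f"10-day 99% VaR: {var_99:,.0f}")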

Technological Solutions:

Structured and Unstructured Financial Data Flow: Current financial decision support systems deal mostly with financial data in the form of structured information (e.g. price time series, macroeconomic data, and fundamental and technical indicators). However, there is a vast and constantly growing amount of information in semi-structured or completely unstructured form: corporate disclosures, media coverage, market reports and analyses, expert and non-expert opinions on blogs and websites, and much more. Proper extraction and interpretation of relevant information and sentiment is crucial for identifying potentially risky or dangerous situations (market shocks or crashes), helping end users, such as financial analysts, investment managers, market regulators, financial advisors, and individual investors, to make optimal decisions and detect fraudulent activities or manipulations. These challenges are addressed by the FIRST [15] project, which is developing a large-scale information extraction and integration infrastructure to assist in various ways during the process of financial decision making.

Sentiment analysis: Sentiment analysis, or opinion mining, is a combination of natural language processing (NLP) and semantic techniques able to handle affective states (including opinions, beliefs, thoughts, feelings, goals, sentiments, speculations, praise, criticism, and judgments) and the attitudes associated with them (emotion, warning, stance, uncertainty, condition, cognition, intention, evaluation, etc.), which are the core of subjectivity in human language. The technique has been successfully applied to user-generated content from social networks and to online advertising. Sentiment-based analysis of opinions on securities, stock prices, etc. is useful for identifying and predicting complex market-related events and supporting the financial decision-making process. FIRST validates its innovations in three complementary case studies: market surveillance, risk management, and online retail brokerage.
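As a deliberately minimal sketch of the idea (far simpler than the NLP pipeline FIRST is developing), the Python snippet below scores text with a tiny hand-made lexicon; the word lists and scoring rule are invented for illustration only.

# Hypothetical lexicon; real systems use rich linguistic and semantic models.
POSITIVE = {"gain", "beat", "upgrade", "strong", "record", "bullish"}
NEGATIVE = {"loss", "miss", "downgrade", "weak", "fraud", "bearish"}

def sentiment_score(text):
    """Return a score in [-1, 1]: +1 if only positive cues, -1 if only negative."""
    tokens = [t.strip(".,!?;:").lower() for t in text.split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

print(sentiment_score("Analysts upgrade the stock after a record quarterly gain"))  # > 0
print(sentiment_score("Company warns of weak demand and a quarterly loss"))         # < 0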

Grid Technologies and their Application in HFT: The term ‘grid’ has evolved from denoting a technical, high-performance computational infrastructure toward recognizing potential synergies between grids and emerging service-oriented architectures (SOAs). Applied to the financial market, grid architectures enable the management of huge amounts of data gathered from different sources, which can be used to accurately simulate market development scenarios and investment opportunities. Examples of these capacities can be found in the use cases that resulted from the EU research project BEinGRID, in which four experiments addressed the specific needs of the financial sector:

• Financial portfolio management: A tool for financial operators to run simulations directly on their desktops to support strategic decisions, with no need for awareness of the advanced grid computing technologies powering the service.
• Risk management in finance: A grid solution within the financial sector. Financial institutions face computational time and resource challenges in the risk analysis of investment portfolios. These real-time financial algorithms, which use stock market and trading desk data, impose real-time processing constraints that are a major challenge for the grid. In this business experiment, a Monte Carlo approach was developed to determine the price of the guarantee of an insurance product from a large insurance company, based on risk analysis (see the sketch after this list).
• Data recovery service: The business objective of this experiment was to evaluate the commercial application possibilities of grid technology for an automatic online backup system and to assess adoption readiness in the small and medium enterprise (SME) market.
• Anti-money laundering in grid: Aimed at allowing banking organizations to cooperate in order to improve their anti-money laundering (AML) mechanisms. The system applied grid technologies to enable secure and managed communication between banks and to support each organization’s AML activities, while assisting in meeting new regulatory obligations.
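The sketch below shows why such Monte Carlo guarantee pricing suits a grid: each batch of paths is independent, so the work distributes naturally across nodes (mimicked here with local processes). The Black-Scholes-style dynamics and all parameters are assumptions, not the BEinGRID experiment’s actual model.

import numpy as np
from concurrent.futures import ProcessPoolExecutor

def price_chunk(args):
    """Price one independent batch of paths; batches could run on
    separate grid nodes, here on separate local processes."""
    seed, n_paths, s0, r, sigma, t, guarantee = args
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n_paths)
    s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(guarantee - s_t, 0.0)    # shortfall the guarantee must cover
    return np.exp(-r * t) * payoff.mean()        # discounted expected shortfall

if __name__ == "__main__":
    # 8 independent jobs of 250,000 paths each; all figures are illustrative
    jobs = [(seed, 250_000, 100.0, 0.03, 0.2, 5.0, 100.0) for seed in range(8)]
    with ProcessPoolExecutor() as pool:
        estimates = list(pool.map(price_chunk, jobs))
    print(f"Guarantee price estimate: {np.mean(estimates):.3f}")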

Business Opportunities:

Evolution of the Market: As presented in the Business Challenges chapter, the use of algorithmic trading saw exponential growth during the last decade, and it is now applied massively by most trading companies. The value of the benefits it offers can be considered to be in the range of €2 billion to €15 billion per annum, which gives a rough idea of the investment in technology and related services that trading companies are ready to consider. The market is still opaque, partly due to financial institutions’ high levels of secrecy as they try to avoid unveiling strategic information (algorithmic models and the technology involved) to competitors or regulatory authorities. Nevertheless, investors now have access to a large number of competing marketplaces (emerging markets, electronic communication networks), leading to high fragmentation of liquidity. Decisions need to take into account advances in both systems and network infrastructure, including:

• In trading systems, the emergence of in-memory databases and the increased role played by complex event processing, both for decision algorithms and smart order routing [18] (see the sketch after this list).
• The use of low-latency financial information flows and of semi-structured or unstructured information, as well as the technologies needed to take advantage of them (e.g. the FIRST project).
• Continuous improvements in the network infrastructure (low latency, connectivity to new venues). High market volatility suggests an approach based on cloud computing that can quickly adjust infrastructure capacity to the needs of the markets.
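As a sketch of the smart order routing mentioned in the first item above: a greedy, price-priority router that splits a buy order across venues. The venues, quotes, and the greedy rule are illustrative assumptions; real routers also weigh fees, latency, and fill probability.

from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    ask: float       # best ask price currently shown at this venue
    size: int        # shares available at that price

def route_buy(order_qty, quotes):
    """Fill cheapest venues first; return (venue, qty, price) slices."""
    plan = []
    for q in sorted(quotes, key=lambda q: q.ask):
        if order_qty <= 0:
            break
        take = min(order_qty, q.size)
        plan.append((q.venue, take, q.ask))
        order_qty -= take
    return plan

book = [Quote("NYSE", 100.02, 300), Quote("BATS", 100.01, 200), Quote("Chi-X", 100.03, 500)]
print(route_buy(600, book))   # fills BATS first, then NYSE, then the remainder at Chi-X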

Legal aspects: The rise of algorithmic trading has led regulatory authorities and legislators to adapt their directives. The tricky point was to change the regulations in a way that would strengthen ethics and transparency in market access without compromising the competitiveness of the venues they regulate. High capital mobility renders a purely local response ineffective; significant coordination between international regulators and legislators is therefore necessary. On a European scale, this led to the recent establishment of the European System of Financial Supervision. The regulatory authorities of the EU states have to implement the same EU directives (following the Markets in Financial Instruments Directive (MiFID), the EU Directive on Transparency, the EU Directive on Market Abuse, etc.), which means that the authorities, and the stock exchanges and banks that will have to comply, have similar needs. This convergence offers an opportunity to deploy a standard service on a large scale. Having previously focused on stabilizing the global banking system, regulators are now interested in fighting the causes of the destabilization. As a result, many studies have been undertaken on the impact of algorithmic trading on markets. These studies, along with international coordination, highlight a number of similar trends:

• Revision of the solvency requirements of financial players (Basel III, Solvency II).
• Better control over activities and financial flows, involving the implementation of appropriate reporting systems and the ability to analyze data effectively, as well as more reactive monitoring of irregular transactions, bringing forth the opportunity to implement powerful algorithms to analyze and detect fraudulent transactions.
• Strengthening of codes of ethics and of the regulation of market abuse, for example with the EU Directive on Market Abuse. This could lead to some borderline activities being forbidden (e.g. flash trading in the USA). Consequently, banks will have to improve control over their financial operations: transaction monitoring, reporting, and data management.

Conclusion:

There is currently a unique combination of market expectations, maturity of scientific models, and availability of new technological platforms:

• Market expectations: With annual benefits estimated at €2bn for HFT alone (excluding all other kinds of algorithmic trading), financial institutions are ready to invest significantly in order to optimize the performance of their trading activities.
• State-of-the-art modeling: Advanced mathematical models have been developed to fulfill the principal expectations: pricing, forecasting, risk management, and portfolio management.
• Advanced technologies opening up new opportunities: e.g. semantics and sentiment analysis applied to information flows, and HPC techniques applied to HFT.

In the near future, this combination will lead to the emergence of innovative products (e.g. new high-performance systems for HFT) and services (e.g. financial information or Simulation ‘as a Service’). The following Scientific Community publications are complementary to this white paper:

• Expecting the Unexpected, Business Pattern Management (white paper, 2010)
• Alternative Delivery Model (challenge memo, 2009)
• Transformational IT Outsourcing (white paper, 2010)
