Investment firms and asset managers famously warn their customers that past performance is no guarantee of future outcomes.

Nevertheless, firms in global capital markets and elsewhere devote much time and energy to churning out internally focused information about past performance, rather than directing their data and analytical efforts towards more effective, future-oriented corporate decision making.

This is not to suggest there is no value in the (predominantly internal) data these firms collect and analyse. However, it is somewhat one-dimensional, and limited in its ability to map the path to new market/client and product opportunities.

Forrester Research estimates that less than 0.5% of all data is ever analysed and used, while a mere 12% of enterprise data is used to make decisions.

In fact, based on a recent EY survey, 70% of business users cannot access the data they need. Furthermore, Forrester reckons that a 10% increase in data usage for decision making could yield more than USD 65 million in additional net income for a typical Fortune 1,000 company.

There is significant potential in the expanding universe of data outside traditional organisational boundaries to add new dimensions – in particular client and market data – that increase the effectiveness of organisational decision making. However, this is not an easy journey, and most of the heavy lifting is linked to organisational and cultural change rather than to data or analytical challenges.

Capital markets participants are both nervous and excited about the possibilities offered by 'Big Data', advanced analytics (for example, Machine Learning (ML) and Artificial Intelligence (AI)) and other digital innovations to harness multiple sources of structured and unstructured data.

The excitement stems from the prospect of deeper analytical insights, more accurate forecasts/predictions and, ultimately, improved outcomes for clients, often in the form of 'better' recommendations.

Concerns arise from the significant challenges involved in maximising this opportunity and the cultural change required to adopt an analytical, data-driven mindset, combined with the inherent biases found in much of the underlying data, which may impact client outcomes and corporate decision making.

Garbage in, garbage out – the saying still holds true in a digital environment

Concerns about the fact that data may not be neutral or unbiased are well-founded. New sources of unstructured data are not necessarily 'clean' or formatted or complete by prevailing industry standards, and as such are hard to incorporate into existing workflows and system infrastructures.

Many data sources contain 'noise' – non-relevant data – that needs to be cleansed from the data set before being used.

But these large, complex, real-time data sets – when used appropriately – are powerful and add an external dimension to corporate decision making; after all, client and competitor reactions to corporate strategy are difficult to decipher from internal data alone.

Accordingly, firms are already augmenting internal data and analytics capabilities with third-party resources across a range of activities; for example, investment firms are rapidly deriving insights from 'alternative data', such as sentiment data and satellite/geolocation data, to inform their investment and asset allocation strategies.
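As a purely illustrative sketch of the idea – with hypothetical tickers, scores and thresholds, not any vendor's actual data format – daily sentiment scores from an alternative-data feed might be aggregated into a crude allocation tilt as follows:

```python
from collections import defaultdict

# Hypothetical sentiment records: (ticker, daily sentiment score in [-1, 1]).
# In practice these would come from a third-party alternative-data vendor.
records = [
    ("ACME", 0.6), ("ACME", 0.4), ("ACME", -0.1),
    ("GLOBO", -0.5), ("GLOBO", -0.3),
]

def sentiment_signal(records, threshold=0.2):
    """Average sentiment per ticker and map it to a simple allocation tilt."""
    scores_by_ticker = defaultdict(list)
    for ticker, score in records:
        scores_by_ticker[ticker].append(score)
    signals = {}
    for ticker, scores in scores_by_ticker.items():
        avg = sum(scores) / len(scores)
        if avg > threshold:
            signals[ticker] = "overweight"
        elif avg < -threshold:
            signals[ticker] = "underweight"
        else:
            signals[ticker] = "neutral"
    return signals

print(sentiment_signal(records))
# → {'ACME': 'overweight', 'GLOBO': 'underweight'}
```

Real implementations would of course weight by recency and source credibility, and combine sentiment with other factors; the point is simply that unstructured external signals can be reduced to decision-ready inputs.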

The future of predictions/forecasting

In recent years we have seen non-traditional external sources of information take their place alongside, or even supersede, more traditional internal inputs; for example, the firm-featured/choreographed product 'review' has given way to independent customer reviews.

Equally, the ‘build it and they will come’ traditional school of product development has been mothballed, replaced by processes and principles based on agile and lean techniques that focus on validated insights from external sources – notably customers – to develop a Minimum Viable Product (MVP) that meets the needs of early adopters.

Furthermore, this shift towards externally driven forecasting and decision making harnesses the collective wisdom of crowds, which can provide faster, better and cheaper predictions than most corporates.

Netflix famously crowdsourced a recommendation engine by offering a reward to third-party developers, saving itself both time and money. The common thread of these novel real-time approaches to decision making is that they rely less on structured/formalised organisational procedures and more on 'open-sourced' stakeholder input, which is more reflective of the experience and activity of the many, not the few.

As such it can provide a more reliable basis for predictive analysis and decision making.

Furthermore, due to greater competitive and regulatory pressures, the need for non-traditional, complementary external data sources combined with sophisticated Artificial Intelligence (AI) and Machine Learning (ML) algorithms is expected to grow significantly over time.

Case Study - Fixed Income/Credit Markets

In the fixed income/credit markets, for example, buy- and sell-side firms have had to adjust to reduced levels of liquidity following the introduction of Basel III’s regulatory standards and capital constraints.

For less-liquid bonds, which might not have traded in months, traditional 'ringing round' approaches are expected to deliver little actionable information, nor will isolated historical transaction records yield the systematic insights required to identify and ultimately match latent liquidity.

Instead, fixed income/credit market participants are looking beyond structured, internal data to enhance their view of market dynamics and to complement their macro/'big-picture' view with tradable micro/'up-close' intelligence.

Accordingly, both depth and breadth are relevant: for example, once an Indication Of Interest (IOI) has been identified through a 'telescope', microscopic details related to a specific security drive liquidity considerations and trading decisions.

By aggregating recent trading activity in a single security across the entire fixed-income trading ecosystem – prices, flows, order sizes, volumes, and anonymous/aggregated buyer and seller characteristics such as concentrations – firms can derive a suitable execution strategy based on supply and demand, taking liquidity considerations into account.
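A minimal sketch of this aggregation step, with hypothetical trade records and illustrative thresholds (real execution logic would consider far more dimensions, such as dealer axes, time decay and venue):

```python
# Hypothetical aggregated prints for a single bond, pooled across venues.
# Field names and the size/imbalance thresholds below are illustrative only.
trades = [
    {"side": "buy",  "size": 2_000_000, "price": 98.75},
    {"side": "buy",  "size": 1_000_000, "price": 98.80},
    {"side": "sell", "size": 500_000,   "price": 98.70},
]

def execution_view(trades):
    """Summarise supply/demand in one security and suggest a crude approach."""
    buys = sum(t["size"] for t in trades if t["side"] == "buy")
    sells = sum(t["size"] for t in trades if t["side"] == "sell")
    volume = buys + sells
    imbalance = (buys - sells) / volume if volume else 0.0
    # Deep, two-way flow -> work the order electronically; thin or one-sided
    # flow -> seek latent liquidity via IOIs and dealer relationships.
    if volume >= 2_000_000 and abs(imbalance) < 0.3:
        strategy = "electronic / algorithmic execution"
    else:
        strategy = "high-touch: probe IOIs and latent liquidity"
    return {"volume": volume, "imbalance": round(imbalance, 2), "strategy": strategy}

print(execution_view(trades))
```

With the sample trades above, demand heavily outweighs supply, so the sketch would steer the trader towards a high-touch search for latent liquidity rather than an electronic strategy.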

This helps market participants to select the most appropriate trading and execution strategies for fixed income instruments with varying degrees of liquidity.

Increasing the effectiveness of your decision making

Many financial institutions may regard data as their lifeblood, but in reality the data currently being collected, generated and distributed is rarely used – or even suitable – as the basis for benchmarking performance, shaping decisions or forecasting trends.

Many organisations (leveraging their internally focused ERP and CRM systems) tend to put too much weight on what was or is, at the expense of thinking about how data and analytics can be deployed to help understand what will be.

Augmenting internal, historical data with real-time, external sources is difficult, and there is no single path to success (for example, think twice before attempting a 'big bang' approach).

Of course, regulatory and technical challenges must be addressed, not to mention issues stemming from cultural resistance and organisational inertia, which are often the most difficult hurdles to overcome.

However, firms have much to gain by gradually adopting real-time decision-making capabilities that are based on advanced analytics using an appropriate mix of internal and external data.

In the case of fixed income trading, a more effective decision-making framework could make use of predictive analytics based on leading, not lagging, indicators to identify trading opportunities and, ultimately, to develop and maintain competitive advantage.

Decision making using AI/ML

Despite the renewed popularity of Artificial Intelligence (AI) and of Machine Learning (ML) as a subset of AI, at present few pilots in capital markets have made it into production to yield real business value (Forrester Research estimates that only 15% have successfully migrated from pilot into production).

In part this is due to:

  • the overall low quality of data in the industry – at both firm and industry level, where a plethora of standards continues to coexist – which adds complexity and thus reduces the potential value of deploying any AI/ML algorithm
  • the consequent lack of consistent data at scale.

Extracting real value from data is predicated on the proper organisation, clarity and veracity of data across firms in a network.

In particular, ML requires a consistent data ontology and hence solving any ML problem can become very challenging if different firms store data differently.

Organisations should be able to address these requirements through a two-step process:

  1. firms consolidate their in-house data, moving away from a proliferation of data silos while maintaining a data fabric that is integrated with legacy systems (according to EY, 75% of enterprises surveyed rely on 6 or more reporting/ERP systems, while 20% have more than 15 systems)
  2. these firms then join a common capital markets data platform, which allows an individual firm to make decisions based not only on its own data but also on a much larger pool of (sufficiently aggregated and/or anonymised) information, achieving outcomes that no one firm could achieve alone.
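The premise underlying both steps is a shared data ontology. A minimal sketch of that idea – with entirely hypothetical field names – showing two firms' differently labelled trade records normalised to one canonical schema before pooling:

```python
# Each firm stores the same economic fields under different names; a shared
# ontology maps them onto one canonical schema before the data is pooled.
CANONICAL_FIELDS = ("isin", "side", "quantity", "price")

# Hypothetical per-firm field mappings (canonical name -> firm's own name).
FIRM_MAPPINGS = {
    "firm_a": {"isin": "isin", "side": "direction", "quantity": "qty", "price": "px"},
    "firm_b": {"isin": "security_id", "side": "buy_sell", "quantity": "nominal", "price": "clean_price"},
}

def to_canonical(firm, record):
    """Translate one firm-specific record into the canonical schema."""
    mapping = FIRM_MAPPINGS[firm]
    return {field: record[mapping[field]] for field in CANONICAL_FIELDS}

pooled = [
    to_canonical("firm_a", {"isin": "XS0000000001", "direction": "buy",
                            "qty": 1_000_000, "px": 101.2}),
    to_canonical("firm_b", {"security_id": "XS0000000001", "buy_sell": "sell",
                            "nominal": 500_000, "clean_price": 101.1}),
]
print(pooled)  # both records now share the same keys
```

Once every participant's data arrives in this common form, analytics and ML models can be trained and run across the pooled set rather than within each silo.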

Accordingly, the full potential of AI/ML is difficult to reach solely with an organisation's internal data; instead, firms should aspire to apply AI/ML across a network of capital markets participants using a common data ontology across asset classes.

Incorporating third-party data into your decision-making processes will not guarantee certain outcomes, but it will improve the accuracy of your predictions and provide a more reliable, effective basis for decision making and actions.
