From data disorder to information insights

To discuss this potential, Euroclear hosted a Community Session at Sibos 2017 entitled 'From data disorder to information insights'. The session asked leading experts to share their perspectives on how data can be transformed into information and insights that improve decision-making and provide the basis for creating new ways of adding value.
The role of data in delivering new services and enhancing – or even transforming – existing ones has become one of the dominant themes of 21st century commerce.
Already highly automated, the global financial markets have consumed and produced substantial volumes of data for decades.
But the recent wave of technological innovation – big data analytics, Artificial Intelligence (AI), Machine Learning (ML), blockchain and cloud computing – promises a quantum leap in power and scale, which can help firms generate new efficiencies, insights and value from data.
Despite the pace and breadth of innovation, panellists acknowledged several challenges to the efficient use of data. But the fast-expanding ability to capture, aggregate and analyse data also gave cause for optimism.
Giles Elliott, Head of Business Development for Capital Markets at TCS Financial Solutions, identified three barriers to leveraging data more effectively today. "First, the industry has a transaction-centric data model. We enrich data to support the needs of the local layer, but leave most of it behind as we move up and down the supply chain."
Second, said Elliott, there is a high degree of fragmentation along the transaction chain both across counterparties and within service providers, accentuated by extensive use of business-centric platforms that do not consume, use or share data efficiently.
A third barrier to leveraging data more effectively concerns the accuracy and integrity of data as it passes through the transaction chain. "We’ve been very challenged as an industry to try and understand how we can move packets and parcels of data through a relatively fragmented and complex ecosystem of platforms, and still understand where we got that data from, and hence what confidence we should put in that data and how we use it," Elliott observed.
Regulatory change can also have diverse and even conflicting impacts on the finance sector’s use of data, including not only what is generated and by whom for compliance purposes, but how it is stored and distributed as well. Ruth Wandhöfer, Global Head of Regulatory and Market Strategy at Citi Treasury and Trade Solutions, identified a shift by regulators from a focus on transaction reporting to a "broader requirement to be aware of key data across business lines" to monitor and counteract key systemic risks.
Noting the upcoming EU General Data Protection Regulation (GDPR), Wandhöfer flagged regulators’ concerns over the ability of retail banks and other consumer-facing firms to capture client data, resulting in measures to help individuals assert greater control over the data held on them. Moreover, from a prudential regulation perspective, she said, increased capital requirements have incentivised banks to use data to demonstrate understanding of business risks.
"But firms are only just getting their heads around the exponential increase of unstructured data now becoming available, partly because it takes time to develop and refine scalable solutions that use natural language processing and self-learning algorithms," said Wandhöfer.
With more data available than ever, the question becomes how it can be captured, aggregated and transformed to generate insights and value-added solutions.
Peter Golder, Global Head of Euroclear Information Solutions, suggested that emerging data models need to account for a diverse range of data types, business challenges and operating models. "Data is not monolithic," he said, agreeing that much data is currently unstructured, and thus poses specific analytical challenges, while much else is 'dark' or redundant.
While certain aspects of data handling, such as storage, are now cheaper, "the actual cost of managing data diligently is going up," Golder observed, noting also the varying shelf life of data types due to regulatory requirements such as GDPR.
Golder said data management models must also account for the specific nature of the business problems being tackled – drawing a comparison between commodity products and finished goods, in terms of value-add – and the characteristics of prevailing operating models.
Noting historical tensions between business units and IT departments over control of data, he warned: "Decisions on these kinds of issues can impinge on your ability to deliver value to clients or generate revenue from data."
Faced with a gushing pipeline of data and multiple potential opportunities to refine and channel it, how can technological innovation help to harness data’s business-transforming and revenue-generating power?
As Chief Business Development Officer at blockchain technology developer Digital Asset Holdings, Chris Church knows the opportunities new technologies like distributed ledger present. But a career at data and standards-driven organisations including SWIFT and BT Radianz has engendered a keen appreciation of data management challenges.
"Technology has helped us to get to where we are. But this data stuff is hard. Data vendors are constantly looking at new technologies to come up with better data solutions, but there are many complexities," he observed.
Cloud-based services are reducing data storage costs, for example, but many regulators remain cautious. AI is contributing significantly to big data analysis capabilities, but it’s still difficult to build correlations around certain data types and sources.
Blockchain offers a single, universally-accessible record of transactions, and thus can bring more efficiency and transparency around data. Any new models must meet high industry standards, and "the technology can be deployed to ensure a high degree of privacy and security," said Church.
How, asked moderator Luc Vantomme, Euroclear’s Global Head of Innovation, do we shift from viewing data as a cost and a liability to an asset that can drive our businesses forward?
To illustrate this shift, Golder pointed to the growing number of use cases that are calling for smarter uses of data beyond 'reference data'. In the evolving regulatory and macro-economic landscape of the securities markets, market participants are focused on improving both liquidity and transparency, he noted. For example, combining inventory data and transactional data can bring supply and demand together more effectively in fixed-income markets currently suffering liquidity shortages, he said.
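Golder's point about combining inventory and transactional data can be sketched in code. The example below is purely illustrative: the ISINs, sizes and field names are invented, and real fixed-income liquidity matching would involve far richer data (prices, limits, counterparty constraints). It simply shows the basic idea of joining a holdings dataset against observed buy interest to surface where supply could meet demand.

```python
# Hypothetical sketch: matching bond inventory against observed buy interest.
# All identifiers and figures below are invented for illustration only.

inventory = [  # bonds a dealer holds and could supply
    {"isin": "XS0001", "size": 5_000_000},
    {"isin": "XS0002", "size": 2_000_000},
]

buy_interest = [  # transactional signals of demand
    {"isin": "XS0001", "size": 3_000_000},
    {"isin": "XS0003", "size": 1_000_000},
]

def match_liquidity(inventory, buy_interest):
    """Return (isin, matchable size) pairs where supply meets demand."""
    held = {item["isin"]: item["size"] for item in inventory}
    matches = []
    for bid in buy_interest:
        if bid["isin"] in held:
            # Supply exists for this bond: match up to the smaller side.
            matches.append((bid["isin"], min(bid["size"], held[bid["isin"]])))
    return matches

print(match_liquidity(inventory, buy_interest))  # [('XS0001', 3000000)]
```

In practice such joins would run across many participants' datasets, which is precisely why the panel's earlier points about fragmentation and data provenance matter: the value of the match depends on trusting where each record came from.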
Having previously noted the power of data to reduce banks’ capital costs, Citi’s Wandhöfer pointed out the value of data in improving strategic decision-making. "You can only be strategic if you know what you have and where you want to get to," she said. "If you get data right, it can give you a better view on client behaviour, market developments, macro-economic trends, and the impact of regulation." Only then, she concluded, can you focus on changing your business model.
To truly optimise data's potential, said Golder, firms must look beyond its ability to perform specific functions faster, better and cheaper, and instead focus on measures of success from an organisational perspective. In some realms that might mean the ability to develop and implement trading strategies; in others success may derive from the uniqueness of the data, or the ability to normalise or distribute it. "A framework for data success might need several criteria," he said.
The panel agreed that, while it is still early days, improving the use of data will remain on everyone's radar for some time to come, not only to reduce costs, but also to develop revenue-generating services in an increasingly competitive financial landscape.