Sponsored by: Ullink

This article was paid for by a contributing third party.

Mifid II and the Emergence of the Connectivity Hub

Richard Bentley, Ullink

While the second Markets in Financial Instruments Directive (Mifid II) go-live date may have passed with more of a whimper than a bang, the longer-term implications are already becoming apparent. This has prompted a rethink of the way technology platforms are architected to ensure firms can achieve and maintain compliance as they continue to evolve.

Data management and connectivity are emerging as two of the most prominent long-term challenges of the new regulation. In a recent study, sponsored by Ullink, half of respondents said reporting and transparency requirements would have the greatest impact on their businesses. Firms said they were struggling to understand exactly what data must be reported, when and in what format, as well as how to extract this data from their existing systems. The collection and normalization of data from multiple sources present major hurdles, and connectivity to new trade and transaction reporting facilities requires significant investment.

If we examine these problems, it becomes increasingly clear that dealing with the data management and connectivity processes individually for each trading system can only be a temporary fix, and that a more scalable, industrialized solution is required.


Main Drivers

The main driver of the data management and connectivity challenges under Mifid II is the growing number and diversity of trading venues for order execution, with new block-trading or large-scale facilities emerging alongside multilateral trading facilities, organized trading facilities and systematic internalizers. Half of survey respondents expect the number of liquidity sources to increase as a result of the regulations. Liquidity fragmentation requires more venue connectivity, and this extends beyond traditional exchange-traded products—take the vast expansion in the number of electronic fixed-income venues over the past few years, for example. Each of these venues introduces its own market model, workflows and protocols on top of the base connectivity requirements.


The Burdens of Mifid II

Post-trade, Mifid II introduces the need for four-eyes-style risk management, whereby firms must reconcile their trading records with those provided by their venues and brokers. Connectivity is required to consume this information via so-called drop-copy connections, with the same complexity of differing protocols and data formats.
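
To make the shape of this reconciliation concrete, the minimal sketch below (in Python, with invented field names rather than any real venue's drop-copy schema) matches internal executions against drop-copy records by execution ID and flags missing records and mismatched economics.

# Minimal reconciliation sketch: match internal executions against a venue
# drop-copy feed by execution ID and flag quantity/price breaks.
# Field names here are illustrative, not a real venue schema.

def reconcile(internal_fills, drop_copy_fills):
    """Return executions missing on either side plus mismatched economics."""
    internal = {f["exec_id"]: f for f in internal_fills}
    external = {f["exec_id"]: f for f in drop_copy_fills}

    missing_externally = [i for i in internal if i not in external]
    missing_internally = [e for e in external if e not in internal]

    mismatches = []
    for exec_id in internal.keys() & external.keys():
        ours, theirs = internal[exec_id], external[exec_id]
        if (ours["qty"], ours["price"]) != (theirs["qty"], theirs["price"]):
            mismatches.append((exec_id, ours, theirs))

    return missing_externally, missing_internally, mismatches


if __name__ == "__main__":
    ours = [{"exec_id": "E1", "qty": 100, "price": 10.5}]
    theirs = [{"exec_id": "E1", "qty": 100, "price": 10.6}]
    print(reconcile(ours, theirs))  # price break on E1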

Firms must also submit trade reports in near-real time to the new approved publication arrangements (APAs). A dozen APAs are already in operation, covering different asset classes and geographies, each with its own requirements for connectivity and workflow. Although Mifid II allows more time for transaction reporting to approved reporting mechanisms (ARMs), even more extensive data enrichment and normalization is required before submission, and a similar plethora of ARMs exists, each with its own data format requirements.

There is also the considerable burden of referential data management. For example, the list of eligible instruments falling under the scope of the Directive can be updated daily, as can the list of firms acting as systematic internalizers for each of these instruments—all of this information is required for accurate reporting. Firms can source this data directly from the European Securities and Markets Authority or from a third party, such as a market data provider, but each approach introduces new connectivity challenges.
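
As a rough illustration of what the hub must maintain, the sketch below assumes a hypothetical daily feed of instrument and systematic-internalizer records; real sources, whether ESMA's published files or a vendor feed, each have their own formats and access mechanisms.

# Illustrative daily refresh of referential data held by the hub.
# The feed format and field names are hypothetical; real sources
# (ESMA files or a market data vendor) each have their own schemas.

class ReferenceDataCache:
    def __init__(self):
        self.in_scope_isins = set()   # instruments within Mifid II scope
        self.si_by_isin = {}          # ISIN -> set of SI firm identifiers

    def refresh(self, instrument_records, si_records):
        """Replace yesterday's view with today's published lists."""
        self.in_scope_isins = {r["isin"] for r in instrument_records}
        self.si_by_isin = {}
        for r in si_records:
            self.si_by_isin.setdefault(r["isin"], set()).add(r["firm_lei"])

    def is_reportable(self, isin):
        return isin in self.in_scope_isins


cache = ReferenceDataCache()
cache.refresh(
    instrument_records=[{"isin": "GB00B03MLX29"}],
    si_records=[{"isin": "GB00B03MLX29", "firm_lei": "LEI-EXAMPLE-001"}],
)
print(cache.is_reportable("GB00B03MLX29"))  # True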


Data Management and Connectivity

These are just some of the implications for connectivity and data management. To meet the Jan. 3 deadline, many firms have been forced to adopt tactical approaches, which often involve spreadsheets and manual workarounds to extract and normalize data from existing systems. There may currently be a degree of forbearance, but regulators will become stricter as the year goes on, and such approaches will not scale to meet future demands.

The consequences of Mifid II for data management and connectivity require an industrialized response, which research organization Greenwich Associates has called an “enterprise-scale solution”. A key characteristic of Mifid II reporting requirements is that they are holistic: they cut across every system an order or trade touches. However, when we look across the technology landscape of most market participants, we typically see a complex mixture of vendor and homegrown systems, with multiple applications, databases and workflows. If firms deal with the data management and connectivity processes individually for each front-end application, they will end up with duplication, inefficiency and a tangled “spaghetti” of point-to-point connections and transformations.


The External Connectivity Hub

Alternatively, data management and connectivity functions can be lifted out of individual front-end applications and centralized in an external hub. Such a data management and connectivity hub will require a number of key capabilities.

Support for a diverse range of connection methods and protocols is essential. Although many venues and an increasing number of vendor applications speak FIX, this is not universal, and FIX is less a common language than a common representation with some basic rules and considerable variability across its many versions and dialects. The hub will also have to support proprietary application programming interfaces (APIs) and exchange formats, including stalwarts such as traditional comma-separated value (CSV) files as a means of integration with older systems.
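
One common way to absorb this diversity, sketched below in Python purely as an illustration rather than a description of any particular product, is an adapter layer: each connection method is wrapped behind a small uniform interface so the rest of the hub never handles raw FIX tags or CSV columns directly. The tag numbers and column layouts shown are invented.

# Sketch of a per-source adapter layer: each adapter hides its wire format
# (a FIX dialect, a proprietary API, a CSV drop) behind the same method.

from abc import ABC, abstractmethod


class SourceAdapter(ABC):
    @abstractmethod
    def read_records(self, raw):
        """Yield dictionaries of raw field values from one source."""


class FixAdapter(SourceAdapter):
    def read_records(self, raw):
        # Parse "tag=value" pairs separated by the FIX SOH delimiter.
        for msg in raw:
            fields = dict(p.split("=", 1) for p in msg.split("\x01") if p)
            yield {"symbol": fields.get("55"), "qty": fields.get("38")}


class CsvAdapter(SourceAdapter):
    def read_records(self, raw):
        # Assume a header row naming the columns.
        header, *rows = [line.split(",") for line in raw]
        for row in rows:
            yield dict(zip(header, row))

Adding a new venue or legacy system then becomes a matter of writing one more adapter, rather than touching every downstream workflow.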

Once the data is gathered from the different sources in various formats, it must be transformed into a single coherent representation to be collated, processed and eventually fed out to all the required monitoring and reporting channels. Decoupling this data normalization process from the individual trading systems is essential to achieving the level of industrialization required. 
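
A sketch of that decoupled normalization step might look like the following, where a per-source field mapping, maintained as configuration rather than code inside a trading system, produces one canonical record; the field names are invented for illustration.

# Sketch of normalization into one canonical record, decoupled from the
# systems that produced the data. Field names are illustrative.

from dataclasses import dataclass


@dataclass
class CanonicalExecution:
    source: str
    instrument: str
    quantity: int
    price: float
    currency: str


def normalize(source_name, raw_record, mapping):
    """Apply a per-source field mapping to produce the canonical form."""
    return CanonicalExecution(
        source=source_name,
        instrument=raw_record[mapping["instrument"]],
        quantity=int(raw_record[mapping["quantity"]]),
        price=float(raw_record[mapping["price"]]),
        currency=raw_record[mapping["currency"]],
    )


venue_a_mapping = {"instrument": "symbol", "quantity": "qty",
                   "price": "px", "currency": "ccy"}
print(normalize("VENUE-A",
                {"symbol": "VOD.L", "qty": "100", "px": "212.4", "ccy": "GBP"},
                venue_a_mapping))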

The Directive also requires more information about trades and transactions than front-end systems traditionally hold. For example, it specifies around 60 new fields of referential data—such as details of individuals and systems involved in investment decisions—that must be added to transactions before they are reported. This enrichment process requires extracting data from a further range of sources and collating it with the trading data. Details of this enrichment process must be externalized from individual applications to provide the flexibility to make changes without disrupting other workflows.
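
The sketch below illustrates the idea of externalized enrichment under those assumptions: centrally maintained lookup tables supply the regulatory identifiers, and the field names (such as the investment decision maker) are simplified stand-ins for the actual reporting fields.

# Sketch of externalized enrichment: referential fields (e.g. who made the
# investment decision) are added to the canonical record from lookup tables
# maintained outside the trading applications. Names are illustrative.

ENRICHMENT_SOURCES = {
    # trader/desk identifiers -> regulatory identifiers, held centrally
    "decision_maker_by_account": {"ACC-42": "NATID-GB-1234567"},
    "algo_id_by_strategy": {"VWAP-EU": "ALGO-0007"},
}


def enrich(record):
    """Return a copy of the record with reporting fields attached."""
    enriched = dict(record)
    decision_makers = ENRICHMENT_SOURCES["decision_maker_by_account"]
    algos = ENRICHMENT_SOURCES["algo_id_by_strategy"]
    enriched["investment_decision_maker"] = decision_makers.get(record["account"])
    enriched["execution_algo_id"] = algos.get(record.get("strategy"))
    return enriched


print(enrich({"account": "ACC-42", "strategy": "VWAP-EU",
              "isin": "GB00B03MLX29"}))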

Under the new regime, failure to report exactly as required will result in heavy penalties. The “if in doubt, report everything” approach will no longer work, as overreporting will be considered as bad as underreporting. Validation of data will therefore be essential for reporting, and the inexorable drive toward real-time reporting will demand automation of this process. The externalization and enforcement of data validation rules is a key requirement for the connectivity hub.
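
A minimal sketch of such externalized validation is shown below: each rule is a named predicate held as data, so rules can be added or amended centrally without touching the systems that feed the hub. The rules themselves are illustrative, not a statement of the actual regulatory checks.

# Sketch of externalized, automated validation. A record that fails any rule
# is held back for repair; a record out of scope is withheld entirely, since
# overreporting is treated as seriously as underreporting.

VALIDATION_RULES = [
    ("ISIN present", lambda r: bool(r.get("isin"))),
    ("Quantity positive", lambda r: r.get("quantity", 0) > 0),
    ("Price positive", lambda r: r.get("price", 0) > 0),
    ("Reportable instrument", lambda r: r.get("in_scope") is True),
]


def validate(record):
    """Return the names of the rules the record fails; empty means submit."""
    return [name for name, check in VALIDATION_RULES if not check(record)]


failures = validate({"isin": "", "quantity": 100, "price": 10.5,
                     "in_scope": False})
print(failures)  # ['ISIN present', 'Reportable instrument']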

It is not just external connectivity that must be considered. Mifid II requires firms to capture and log all reportable events over the lifetime of a client order. Receiving an order, splitting it into child orders, sending the child orders to an algorithmic trading box and undertaking pre-trade risk analysis are all reportable events—and they happen across a multiplicity of different systems. Pulling all relevant events together to create a chronological biography of a client order requires internal connectivity, with the normalization, enrichment and validation associated with the different systems involved. 
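
As an illustration of that collation step, the sketch below merges events from hypothetical order management, algo and risk systems by client order ID and sorts them by timestamp to form the order's biography; the event shapes are invented.

# Sketch of collating lifecycle events from several internal systems into a
# chronological "biography" of each client order.

from collections import defaultdict


def build_order_histories(event_streams):
    """Merge events from many systems, keyed by client order ID, in time order."""
    histories = defaultdict(list)
    for system_name, events in event_streams.items():
        for event in events:
            histories[event["client_order_id"]].append(
                {"system": system_name, **event})
    for order_id in histories:
        histories[order_id].sort(key=lambda e: e["timestamp"])
    return histories


streams = {
    "OMS": [{"client_order_id": "C1", "timestamp": 1, "event": "order_received"}],
    "ALGO": [{"client_order_id": "C1", "timestamp": 3, "event": "child_order_sent"}],
    "RISK": [{"client_order_id": "C1", "timestamp": 2, "event": "pre_trade_check_passed"}],
}
for event in build_order_histories(streams)["C1"]:
    print(event["timestamp"], event["system"], event["event"])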

Once data normalization, enrichment and validation are centrally managed, huge benefits accrue to this approach—the ability to create filtered subsets of the data, possibly in different formats, and route them to different consumers, for example. One subset may be required for trade reporting, another for end-of-day transaction reporting, another for best‑execution analysis and another for trade surveillance. The ability to take a unified set of collated data and generate multiple feeds for various applications and workflows is key to the industrialization of data management.
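
The fan-out itself can be expressed very simply, as in the sketch below, where each consumer is defined by a filter over the collated records; the consumers and filters shown are illustrative only.

# Sketch of fan-out from one collated data set to several consumers, each
# with its own filter and, potentially, its own output format.

CONSUMERS = {
    "trade_reporting":       lambda r: r["in_scope"] and r["near_real_time"],
    "transaction_reporting": lambda r: r["in_scope"],
    "best_execution":        lambda r: True,              # everything, for TCA
    "surveillance":          lambda r: r["quantity"] >= 1000,
}


def route(records):
    """Return, per consumer, the subset of records that consumer should see."""
    return {name: [r for r in records if keep(r)]
            for name, keep in CONSUMERS.items()}


records = [
    {"id": "T1", "in_scope": True, "near_real_time": True, "quantity": 2000},
    {"id": "T2", "in_scope": False, "near_real_time": False, "quantity": 50},
]
for consumer, subset in route(records).items():
    print(consumer, [r["id"] for r in subset])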

Beyond these functional capabilities, an external and centralized connectivity hub needs a number of additional qualities. High availability and fault tolerance are essential—if a firm has 15 seconds to report trades, it cannot afford to have its hub out of action for 30 minutes. Performance at scale is also key—Mifid II mandates near-real-time reporting and entails huge volumes of data.

Flexibility is essential. The ongoing evolution of market infrastructure means firms will need to modify or create data mappings and validation rules promptly, and connect additional sources and consumers without impacting overall operations. The months leading up to the Mifid II go-live saw a high cadence of API updates from venues and APAs alike. In a hub architecture of many-to-many connectivity, these processes must be independent so they can be adapted quickly and in isolation.


Conclusion

The data management and connectivity challenges of Mifid II mean firms must think about lifting these functions out of their individual systems and externalizing them in a centralized data management and connectivity hub: one that can scale to meet demand and remains flexible and adaptable to ongoing market and regulatory evolution. This is key to achieving the industrialization of compliance.
