Reporting requirements under Europe's EMIR rules are spurring service provider offerings and raising concerns about legal entity identifier issuance. Wikis, meanwhile, present a data sourcing choice for the industry.
As this issue of Inside Reference Data reaches you, a key provision of the European Market Infrastructure Regulation (EMIR) requiring identifiers for trade reporting will be taking effect (on February 12, to be exact). In my nearly three years as editor, I cannot recall a single development generating as many stories in one issue's news pages as EMIR has in this one.
We see the EMIR deadline already creating a crunch in pre-legal entity identifier (pre-LEI) registrations, raising concerns that waiting times for issuing these codes will lengthen as requests bottleneck. Pre-local operating units (pre-LOUs) in the US, UK and Germany, including the Depository Trust & Clearing Corporation (DTCC), Swift, WM Datenservice and the London Stock Exchange, all say increased turnaround times for issuing pre-LEIs will not be a problem, with WM's Uwe Meyer telling us that issuance times will actually shorten significantly as a result of work its utility is doing.
That remains to be seen. The publication date for this issue precludes us from capturing any immediate fallout from the EMIR deadline, but the industry is undoubtedly monitoring the issue.
Several service providers, namely Bloomberg, Swift and Calypso Technology, have added or updated offerings to handle EMIR reporting. Bloomberg's service channels data directly to the DTCC or REGIS-TR repositories. Swift has adapted its FileAct messaging service to send data to the DTCC, UnaVista and REGIS-TR in similar fashion, and is also offering to channel data from its existing Swift FIN service to REGIS-TR. Lastly, Calypso, a smaller firm, has reworked the Global Trade Repository it originally built for Dodd-Frank Act requirements to address EMIR needs. Clearly, the EMIR deadline is making itself felt in these companies' services, as well as in LEI-related utilities.
Elsewhere, Nicholas Hamilton explores a new area of interest for data management professionals: the use of wikis, the collaborative content engines exemplified by Wikipedia, for financial reference data purposes. Given the sensitivity of proprietary information and the need to keep market and reference data close to the vest for competitive advantage, it is no surprise that the financial industry has been slow to embrace wiki technology. The progress recalls the industry's slow, wary take-up of social media, where it is not prohibited outright.
Like many choices, using wikis is a risk-reward proposition: the risk of sharing information is weighed against the reward of obtaining otherwise unavailable data. As GoldenSource's Steve Engdahl points out, one should be careful about plugging wiki data into critical business systems. Wikis can “self-enforce” data quality, Engdahl notes, but the number of wiki users in the financial services community may never reach the critical mass for quality assurance that exists in other industries. If wikis can gain a foothold anywhere in financial services, it is most likely in operations and control, says UBS's Uday Odedra. That, however, is predicated on management overall becoming more data-centric.