Hedge funds enjoy the ability to be more inventive with data management operations, since they are not bound by many of the legacies large firms carry. Michael asks if the major players can possibly adapt hedge funds’ methods to work within their data systems.
Over the past month, I’ve been thinking about how data quality informs and motivates the other data management trends we cover in Inside Reference Data, such as data governance, dealing with unstructured data and getting transparency into data sourcing.
We mostly think about these trends, and about data quality itself, in terms of how they affect the largest financial services firms. Should the industry, particularly those large firms, be thinking about these data management issues in the same way that hedge funds, with much smaller and leaner operations, think about them?
Marshall Saffer, chief operating officer of MIK Fund Solutions, a New York-based data management software provider with 65 client funds, sees an advantage for his clients over the big firms because they have no legacy systems. They do not need to rip out old technologies that weren’t designed for new data management challenges. “You can help them reinvent the entire idea of data management processes properly,” Saffer says.
MIK applies some inventive reporting and sourcing philosophies through its data services. It remains to be seen whether such approaches can be migrated into the major firms and still be effective.
Broader Data Quality Pursuits
Meanwhile, firms are trying to coordinate data governance plans with their data quality goals, calibrate the right level of transparency and link their data.
Northern Trust’s Brian Sobolak advocates using data governance plans to direct resources in pursuit of data quality, and GE Capital’s Roberto Maranca says that data governance plans should point data consumers to “where to find data in the best, most trustable and highest-quality format.”
Along with governance, the provenance and sourcing of data can affect data quality. Scott Preiss of CUSIP Global Services cites knowledge of where data originated, and links to primary sources, as key. A single data system may collect data from many sources, as Thomson Reuters has done, yet stay consistent by managing the information around a core set of entities that are universal for all content.
In our coverage of transparency in price discovery for benchmark data, Jim Binder of MarketAxess, a fixed-income trading and reporting provider, raises concerns that MiFID II regulation in Europe, by requiring more disclosure, could harm price formation. Karl Kutschke of Charles River Development says the requirement for increased transparency may be disruptive at first, but is useful in the long run. Increased transparency rules did not hurt US fixed-income markets, adds JR Rieger of S&P Dow Jones Indices. In this case, the first impulse that transparency rules are too strict appears to give way to the realization that these rules are a means to achieve higher data quality.
With efforts to link and evaluate data using APIs and semantics, it’s also apparent that improving data quality is the goal. State Street’s David Blaszkowsky relates that the industry is getting higher quality data because it is moving toward “semantically rich and systematically consistent information.”
All Solutions Great and Small
What could happen if large firms tried to apply new approaches like those put forward by providers such as MIK Fund Solutions, for example analysis based on securities positions rather than transactions? Given that reporting and analysis methods shape the quality of data, would changes in those methods work within the governance plans employed by large firms? That would require more than business-as-usual thinking. The new methods being used by hedge funds ought to serve as a challenge to large firms’ data management executives—to make it possible to plug such innovations into large operations and change the direction of those operations.