Anthony Malakian: Getting Schooled in Data Quality
The coolest thing about this job is that I get to learn new things every day. Prior to joining the world of financial technology, I was a sports journalist. I don’t feel that I’m blowing my own horn when I say that I am a certified genius and that my opinions are infallible when it comes to sports—you know, just like every other sports fan alive.
But I have to do my homework to do this job. And the best part about that is that I get to have true geniuses in the field school me. Some people scoff when they hear that I left the world of sports for this. What they don’t understand is that it’s far more stimulating for me to interview very bright people and take away a new perspective.
For instance, this month I wrote a feature on data quality (see page 18). While I do miss chatting with professional athletes, talking to C-level execs from RBS, Fannie Mae and Rohatyn Group requires that I bring my A-game every time, lest I sound stupid—which has probably happened more times than I’d like to acknowledge.
I’m two-and-a-half years into my career covering Wall Street, and there are many things I feel I have a strong grasp of. Data quality management was not one of them, as we tend to leave that subject to the pros over at Inside Market Data and Inside Reference Data. But after speaking with some truly intelligent people, not just from the banks and hedge funds but from the vendor and analyst communities as well, I have some thoughts on why data quality remains an elusive end-state.
This is truly an issue that starts at the top. As Rohatyn Group COO Lee Bocker told me, “When technology is developed and imposed upon business it doesn’t work—it’s destined to fail.”
Furthermore, the non-IT C-level decision-makers have to follow through and keep on monitoring the situation to keep the ship on course. It is all too tempting to want to throw a project at IT and say, “Fix it,” but without a clearly stated direction, the project will fail. And something like data quality management isn’t a quick fix; it’s a long—at times painful—process that requires constant investment and manpower.
These aren’t my words; these are the words of many men and women who are far smarter than I am.
Standardization is another important issue affecting this industry, but I keep hearing contrasting views on this point. The industry wants standardization—but what one wants and what one is willing to actually fight for can often be at odds. I want an Apple iPad—it would be good for me both professionally and personally—but I am not willing to patiently save for such an expense.
The Legal Entity Identifier (LEI) is a great push that will help. And as Scott Marcar, RBS’ head of risk and finance technology, notes, rules around central counterparty clearing will help to improve the system as a whole, as well. But even Marcar says the industry is “a million, million miles away” from adopting a common data architecture and lingua franca.
Efforts toward standardization have been excruciatingly slow, despite the light that was shone on the issue after the financial collapse that took hold in 2007 and 2008. Firms seem to know this is a problem, but are still struggling with how to address it.
According to a recent SimCorp survey, which polled nearly 100 buy-side firms, many are still struggling to improve their data systems. The survey found that 40 percent did not have confidence that the data they receive from disparate systems is “consistent and of high quality.”
The upside of this is that two-thirds of respondents said there was a “significant effort” under way to “reconcile data between disparate systems and sources.” But this is tempered somewhat by the fact that one-third called efforts at their firms “minimal.”
The scary thing about that one-third describing their efforts as “minimal” is this: I fully believe that the firms truly able to reconcile their data efficiently are among the two-thirds reporting a significant effort; it’s the laggards, or the arrogant and delusional ones, that likely make up the remaining third.
Copyright Infopro Digital Limited. All rights reserved.