To a hammer, every challenge resembles a nail. And to an industry excited by the promise of nontraditional, “alternative” data, every new dataset represents an untapped source of alpha. But while it’s easy to swing a hammer, exploiting the value hidden within alternative datasets can be hard work—made even harder by a lack of standards that could make alternative data more transparent and expedite the processes of onboarding and utilizing new datasets.
Specifically, even as demand grows among financial firms to incorporate alternative data into their investment strategies, frustration is mounting over the effort required to identify, acquire, and integrate that data before it can begin having a positive impact on trading.
“In every conversation I have with buy-side firms now, the issue is that they may have to wait until resources become available at the client firm to be able to run a test,” says Bruce Fador, managing director of Fador Global Consulting, which works with new data companies on strategic positioning and go-to-market strategy. That wait can squander precious time, negating any potential advantage, since each new dataset faces inevitable alpha decay as more firms begin using it.
To address the issue and make it easier for firms to work with alternative data, at the end of last year, FISD, the Financial Information Services Division of the Software and Information Industry Association, created a working group specifically to create standards around alternative datasets that will make them easier for users to consume.
Buy Side, Take the Wheel
The project was initiated by asset managers frustrated by the barriers to adopting alternative datasets—specifically the process of finding, assessing, and incorporating certain datasets before they can even start using them.
“We’re buyers of alternative datasets. Last year, we did an internal data hackathon, where we had to onboard a whole bunch of datasets within one week, and we realized there were some things vendors could do to make the process easier,” says the chief data scientist of a large asset manager, who also was instrumental in kick-starting the project. “So I started talking to people at other investment management firms, and found that everyone was experiencing the same issues … and that [alternative data] vendors were complaining, too, because each client wanted them to prioritize different things.”
In the first half of 2018, FISD’s executive committee performed a review of industry activities, and found that engaging with alternative data was one of the key topics that members wanted the association to focus on.
“FISD has been aware that alternative data is becoming more important to member firms, so we held alternative data forums in New York and London, and during the cocktail hour following one of these events, a member mentioned that there is a need for alternative data standards, and that FISD might be best placed to develop these,” says Tracey Shumpert, director of member services for FISD.
After holding its first conference call in December, the group has assembled a number of large buy-side firms and defined some high-level goals, with plans to have concrete standards in place and ready for use by year-end.
The working group will initially focus on two key areas that are bottlenecks to adoption and use of alternative data—technical issues and procurement issues—and has divided its efforts into two corresponding streams.
The technical stream has created a summary of the pain points surrounding alternative data that it wants to address, covering areas such as delivery mechanisms, file formats, and metadata, starting with the creation of a standardized vendor “tear sheet” that outlines the attributes of a dataset without consumers needing to pore over different 50-page PowerPoint presentations for each vendor. Meanwhile, the procurement stream focuses on legal and compliance issues, such as the creation of standardized trials and due diligence questionnaires, and is in the process of obtaining feedback from participants on early drafts of documentation to govern these issues.
The resulting standards definitions will be publicly available on the FISD.net website and on Github, and will contribute to broader enforcement of best practices and guidelines. For example, if a new vendor approaches a firm with a PowerPoint deck, the firm can direct the vendor to the standards, so it can resubmit the information in the standardized format, making it easier for the potential client to understand what they’re signing up for, and to compare it in a like-for-like manner with offerings from other providers.
The tear sheet, expected to be the first deliverable, would provide a clear description of the dataset and its coverage, as well as details such as how certain data elements are calculated, how the provider addresses issues such as look-ahead bias, and whether the data is delivered as a CSV or a pipe-delimited file, or in a format that might be hard for firms to import into their databases.
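To make the idea concrete, here is a minimal sketch of what a machine-readable tear sheet might look like and how a consumer could use it to parse a delivered file. The field names and sample values are purely illustrative assumptions, not the FISD working group's actual schema:

```python
import csv
import io

# Hypothetical tear-sheet fields for one dataset. These names are
# illustrative only; the FISD standard may define different attributes.
TEAR_SHEET = {
    "dataset_name": "Retail Foot Traffic (sample)",
    "coverage": "US-listed retailers, 2015 to present",
    "delivery_mechanism": "SFTP, daily",
    "file_format": "pipe-delimited",  # alternative: "csv"
    "lookahead_bias_controls": "point-in-time timestamps on every record",
}


def load_records(raw: str, file_format: str) -> list[dict]:
    """Parse a delivered file using the delimiter the tear sheet declares."""
    delimiter = "|" if file_format == "pipe-delimited" else ","
    return list(csv.DictReader(io.StringIO(raw), delimiter=delimiter))


# A consumer can read the delivery without guessing the format by hand.
sample = "ticker|date|visits\nXYZ|2019-01-02|1042\n"
records = load_records(sample, TEAR_SHEET["file_format"])
```

The point of such a structure is that onboarding logic can be written once against the standard's declared attributes, rather than reverse-engineered from each vendor's slide deck.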
Transparency and Accountability
Fador says it’s important that any descriptive standards are transparent about what a dataset is and does, and what’s required to put it to use. “Is it unique? Is it foundational data, or does it have to be correlated with something else? How will buyers utilize this? How can they deploy it?” he says.
These terms and requirements may be well understood by existing data players. But some of the current disparities arise because the producers of alternative data may have their origins in other industries, and not be familiar with how financial market participants consume data.
“Alternative data vendors are genuinely seeking guidance from the financial data community,” says FISD managing director Tom Davin. “Our members believe standards and best practices will have benefits for the entire industry.”
Response to the working group has been overwhelmingly positive, with numerous other market participants wanting a seat at the table. While the working group will only be open to the buy side initially, it may expand in the future to incorporate other interested parties, such as sell-side firms and vendors.
“I think it’s a wonderful thing. It’s time that someone put a set of standards in place to force providers to present their information in a standardized way,” says Erez Katz, CEO of analytics provider Lucena Research, which recently partnered with events data vendor Wall Street Horizon to create trading strategies that combine alternative data with Lucena’s predictive analytics.
Introducing standards would help eliminate some of the hype around new alternative datasets, and force vendors to present their wares in a way that is easy to understand and to demonstrate, Katz adds. “You hold an analyst accountable for the validity of their report … but there are no rules or accountability on the data side right now, so I think this would be a welcome change for the market as a whole.”
The Marketplace Model
While FISD’s standards should make it easier for alternative data providers to approach consumers directly, traditional data aggregators are also staking their claim to those customer relationships, by creating their own marketplaces of alternative data, standardized around their own existing data, and in some cases, with “sandbox” environments for testing the data before committing to it.
For example, FactSet Research Systems last year launched its Open:FactSet Marketplace of alternative data, and continues to add new datasets, most recently including retail spending data from Mastercard.
“The challenge of alternative data is that it is so new. We’ve probably talked to 600 potential data partners, and onboarded around 60,” says Rich Newman, senior vice president and global head of content and technology solutions at FactSet, adding that the vendor provides additional details for every dataset on the Open:FactSet Marketplace, similar to those being proposed as standards by FISD, including descriptions and details of coverage, as well as historical data.
Newman says the vendor’s decision to build the Marketplace was driven by the very same factors driving FISD’s efforts: “We saw firms were taking such a long time to integrate alternative datasets before they could even test or use them … that they didn’t know if they could get it running fast enough to see if it has alpha or can minimize risk,” he says.
Matthew Rawlings, chief data officer for enterprise data at Bloomberg, says the vendor’s own efforts to aggregate alternative datasets—and its recent move to make alternative data available via its Bloomberg Enterprise Access Point online data portal—were also driven by demand from buy-side and sell-side firms.
“From my experience on the investment side, by the time you identify, license, and onboard alternative data, your original investment idea would already have timed out,” says Rawlings, who spent 20 years in senior technology roles at UBS, Standard Bank, JP Morgan, Barclays Global Investors, and Barings, before joining the vendor in 2014. “Alternative data was suffering from the same fragmentation that reference data used to suffer from. Integrating all these alternative data sources together so that clients can get them all from one vendor reduces complexity and cost.”
Before onboarding any new dataset, Bloomberg performs a thorough series of checks to define the data, and confirm that the provider has the legal right to collect and distribute it, as well as other assessments, such as whether the dataset itself is ethical—arguably pre-screening for some of the factors that FISD’s proposed standards would make transparent to consumers.
A major advantage of sourcing alternative data via aggregators is that it is delivered alongside—and pre-integrated with—vendor data that is already widely used by end-user firms, and the aggregator has already done the hard work of identifying, quality checking, onboarding, and integrating new datasets. Or, as Deirdre Sullivan, market development advisor at outsourced sales agency USAM Group, puts it: “Buy-side firms don’t want to hunt and cook their dinner—they just want it on their plate.”
Making the data simpler to acquire also broadens the interest in alternative data and opens it up to firms that may not have had the time or resources to incorporate it into their strategies previously, Bloomberg’s Rawlings adds. “For even the resource-rich hedge funds, this is a big efficiency gain,” he says.
However, not everyone is thrilled about—as they see it—the major vendors controlling access to yet another dataset.
“People are tired of being locked into closed solutions … and this could increase your exposure to them. I think this is better managed in a community-driven, dispersed framework, rather than centralized [among big vendors],” the chief data scientist says, adding that the industry-driven environment arguably fosters greater openness and collaboration among natural competitors than might be achieved under the auspices of a prominent vendor. “Initially, I was worried that people would be secretive about what they are doing … but FISD provides a neutral place where people in the industry can come together,” he adds.
FactSet’s Newman echoes Bloomberg’s Rawlings on this point: the aggregators are not looking to create a chokepoint for alternative data, and their marketplaces will help buy-side firms that are new to the alt data space get up and running.
“Every client would probably say they would rather keep us out of it and do the integration themselves, if they could. … But one of our values is the ability to look across datasets,” he says. “There are early adopters of alternative data who have a deep understanding of the subject, but for the other 99%, there is still a lot of educating to do.”
Historically, while vendors have pursued their own development strategies for interoperability, once a clear need for standards arises within a specific segment of the industry, the work of defining, codifying, enforcing, and educating the industry at large about those standards has fallen to specialist communities and trade bodies, such as the FIX Trading Community or the International Organization for Standardization, that operate independently of any individual vendor’s agenda. The key here is that any vendor can carry the data, but its basics should be presented in the same standardized manner, no matter where consumers source it.
“Bloomberg has supported standards for data when customers have asked us to in the past … and so if we were asked to support alternative data standards, we would look at it very closely,” says Bloomberg’s Rawlings.
In addition, the vendor invests a lot of time in ensuring that alternative data providers’ offerings meet its own criteria for data quality, regardless of any official standards. “We coach and partner with data suppliers to make their data clean and complete, and to link it to other identifiers, datasets, and definitions. Bloomberg has a lot of experience in this area. We’re an issuer of Legal Entity Identifiers (LEIs), and also have Bloomberg’s Financial Instrument Global Identifier (FIGI) … and by being an organization that issues identifiers, we’re able to help companies that manufacture alternative data,” Rawlings adds.
How to Make an Entrance
Indeed, this is exactly how new vendors have been encouraged to standardize their offerings to date—either based on the specific requirements of potential customers, or by seeking advice from consultancies that help new entrants target clients in the financial markets.
Feargal O’Sullivan, CEO of USAM Group, which began life offering US-based sales resources to UK- and Europe-based companies looking to win US business without setting up an office and hiring dedicated employees, says many companies have approached USAM to help solve this problem, and that those seeking to enter the financial markets for the first time often fall short in how they approach sales.
“They might be technically competent because they’ve built systems to capture the data themselves, and they may also use it themselves. But often, they don’t understand how to sell data,” O’Sullivan says. “The most important thing is licensing the data; the second-most important thing is the structure of it, and making it technically easy to use; and the third-most important thing is being able to present use cases to buyers—it really helps to have that when you go to market.”
London-based market data consulting firm Cordatum Associates, which advises companies on go-to-market initiatives for new data products, has also seen demand from new entrants for advice on how to pitch to capital markets clients. Philip Winstone, director at Cordatum, concurs with O’Sullivan’s focus on licensing, but emphasizes that even before that, would-be providers must have a clear and accurate understanding of the worth of their potential dataset.
“New entrants into the financial markets and new entrants into the information side of financial markets—even existing financial market participants, in the latter case—do not understand the value of the data they hold to this specific market,” Winstone says. “It’s one thing to get your licensing right, but if you do not truly understand the value of your assets, then you cannot ensure your commercial terms are maximized right through from product-creation strategies, to pricing strategies, to distribution strategies, to licensing strategies. And that’s not just important for the creators of the data, but for their potential clients, too: If you do not know where your value lies, how can you expect procurers and/or distributors to maximize the benefits of having you as a supplier or partner?”
License to Kill?
But when it comes to licensing data, some sources sound a note of caution. According to one data licensing expert, who asked for anonymity in order to speak freely, broader standards may be required as the industry evolves and as privacy becomes an increasingly thorny issue.
“The big challenges in the long term for alt data providers are ensuring their customers have all the rights they need to use the data in the product … especially where the contributors are scraping the data without identifying and securing similar assurances from their sources—or aggregating data from individuals without getting the informed consent of these individuals,” says the source. “I’d be a bit skeptical about whether a one-size-fits-all set of policies will suit all alt data providers and customers, operating in all regulatory environments.”
Ultimately, standards help different participants to speak the same language, and to make informed decisions based on like-for-like comparisons.
The advantage is that everyone pursues the same deliverables, rather than investing in separate approaches designed to achieve the same thing. The disadvantage is that strategic, industry-wide standards can take longer to finalize than individual, tactical approaches, at a time when more and more datasets are hitting the market.
But in the long run, the industry should view standards as an opportunity to make it easier for all market participants—from consumers to vendors—to get more out of alternative data, faster. Failure to seize that opportunity will only make adoption harder and slower.
Copyright Infopro Digital Limited. All rights reserved.