There’s a lot to get to this week, but before I do, a question for you, dear reader: How can I improve this weekly round-up for your needs? Beyond putting stories in front of our paywall (which, sorry, won’t happen), I’d like to know a few ways that WatersTechnology can better serve our subscribers and those of you who are not subscribers, but might be thinking about joining our community. Drop me a line: [email protected].
A Heavy Snow
A couple of days before Snowflake announced it had filed to go public, Reb Natale sat down with the cloud-based data warehousing company for a demo of its service. In the article that followed, Reb explains how Snowflake is targeting asset management firms and exchanges as it looks to get a significant chunk of those Wall Street dollars to justify its $12 billion+ valuation.
Here’s the thing about Snowflake: I haven’t quite figured out how it fits into the capital markets ecosystem yet. Now, that definitely has to do with my intelligence (or lack thereof), but the company sounds like it wants to compete with the likes of Amazon Web Services, Google Cloud Platform, and Microsoft Azure, while also relying on them in a symbiotic kind of way.
Maybe with the decline of Apache Hadoop, Snowflake makes a good stand-in for data storage that’s cloud-native (where Hadoop wasn’t), but I’ve also heard it positioned as a budding rival against Google’s BigQuery and Amazon Redshift. But because Snowflake is an enterprise data layer that, in order to work, sits on top of those exact cloud providers, I have to wonder: is it possible to compete with a service that’s essential to your own business model? And in the more traditional data warehousing space, there’s already stiff competition from Oracle, Broadridge, SAP, IHS Markit, and other big firms (including Google and Amazon)—as well as numerous financial services-specific companies (IVP, SimCorp, and MIK Fund Solutions jump to mind, but I’m sure I’m missing others).
“Yes, we might have put a nail in [Hadoop’s] coffin, or maybe a few nails in the coffin, but I think that was more of a head fake than anything,” Matthew Glickman, vice president of customer and product strategy at Snowflake, told Reb. “Redshift is good for certain things, and BigQuery is good for other things—we’re good for those things and more.”
Here’s how someone described Snowflake’s offering, and I kind of like it: It’s basically as if AWS, GCP, and Azure, [and I’ll throw in IBM Cloud] have built high-end kitchens that you can use whenever you want. One option is you can do your own grocery shopping, bring the ingredients, measure them out, chop them all up, etcetera. The second option is you can buy the disparate ingredients already prepared by Snowflake. No matter what, though, you still need the kitchen to cook the actual meal.
So what happens when the owners of those kitchens decide that they don’t want to put up with someone else using their state-of-the-art facilities, while also trying to poach customers with their own pop-up kitchens? Or maybe, as usual, I’m missing something here. For the moment, anyway, money talks and everyone seems excited about the arrangement—financial firms, especially.
Snowflake’s roster of clients includes FactSet (click here for the full rundown on that partnership), the New York Stock Exchange (a deal Reb first reported), and MSCI; CapitalOne, iSTOX, and Crux Informatics are also users, according to various press releases. Additionally, in February, Snowflake announced a $479 million round of funding from the likes of Dragoneer Investment Group and Salesforce Ventures, as well as existing Snowflake investors, including Altimeter Capital, ICONIQ Capital, Madrona Venture Group, Redpoint Ventures, Sequoia, and Sutter Hill Ventures.
So maybe I’m just being cynical, but if I need to be schooled, hit me up at [email protected].
Here’s Another Acronym: FU
For those who don’t know, WatersTechnology (WT) comprises about seven older brands: Inside Market Data (IMD), Inside Reference Data (IRD), the ill-fated Inside Data Management (IDM) magazine, Waters magazine (now WatersTechnology magazine), Buy-Side Technology (BST), and Sell-Side Technology (SST), which was formerly known as Dealing With Technology (DWT). Don’t even get me started on that last one’s name.
It was very confusing for subscribers to understand why they could read about market data but weren’t allowed to read about reference data. As the capital markets have evolved over the last decade, the tech and data used across the sell and buy sides have morphed; reference data and market data (and now alternative data) are not so different in the eyes of end users, who just want all the data.
Also, it was a lot of fucking acronyms, and who needs that?
I bring this up because Jo Wright spoke with the Derivatives Service Bureau (DSB) about how they’re calling for applications from technologists with expertise in cloud-based technology and infrastructure, workflows, and cybersecurity to join its Technical Advisory Committee (TAC).
Now here’s my problem with the world of reference data: there are way too many acronyms. So much time is spent trying to bring standards to the world of reference data, but it’s almost impossible for outsiders to follow along or understand what is happening in this space, because every article written about reference data reads like a dense owner’s manual.
In that aforementioned article about the DSB, 10 different identifying acronyms were used. By my count, those 10 acronyms appeared a combined 89 times in a story of just over 1,100 words.
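As a rough illustration of just how dense that gets, here is how one might tally acronym mentions in a piece of copy. The sample text and acronym list below are made up for the example, not pulled from the actual DSB story:

```python
import re
from collections import Counter

def count_acronyms(text, acronyms):
    """Count how often each known acronym appears as a whole word."""
    counts = Counter()
    for acro in acronyms:
        counts[acro] = len(re.findall(rf"\b{re.escape(acro)}\b", text))
    return counts

sample = "The DSB manages the ISIN. The DSB's TAC advises the DSB."
print(count_acronyms(sample, ["DSB", "ISIN", "TAC"]))
# DSB alone accounts for 3 of the 5 acronym mentions in two sentences
```

Run that over a real reference-data story and the ratio of acronyms to plain words gets uncomfortable fast.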
Last year, we wrote a very deep dive into Bloomberg’s Financial Instrument Global Identifier (FIGI) and the company’s failed attempt to get it accredited by the International Organization for Standardization (ISO), and how that vote exposed the politics that surround standards. It also showed how firms’ unquenchable thirst for cost-cutting ultimately can result in diminished political pull when it comes to market-changing decisions.
It was a good, important story—a story that few, if any, outside of the trades are covering—but if you aren’t part of that world, it was likely a story that got very complex (and even confusing), very fast.
I felt like I should talk about this because while the challenges and debates unfolding in market data and alternative data are widely covered and relatively understood, it feels like reference data is its own little fiefdom where outsiders are not welcome—essentially, “Don’t worry, we know what’s best for everyone and we’ll figure it out.” But on the inside of this fiefdom, there are numerous political battles and, as a result, it feels like change happens in this realm painfully slowly.
I don’t know what a better way forward might look like, but until the jargon used in the reference data community is cleaned up and made more accessible to the rest of the capital markets, nothing is likely to change—and, maybe, that’s exactly the point.
Alt Data’s Emerging Data-Sourcing Economy
Whether it’s geolocation data or vaccine-tracking data or ESG data, investment managers are hungry to get their hands on informative new alternative datasets to better track the effects of the coronavirus pandemic. The alt data space was booming long before Covid-19 changed the world that we all live in, but the effects of the virus have caused buy-side firms to take further interest.
The problem, though, is that there’s a lot of wasted effort and lost investment when incorporating these packages into the investment process, and a lack of standards makes alt data less accessible for many on the buy side.
Where there are challenges, though, there are new opportunities to be found. This week, Max Bowie reported on a consultancy called EOSE Data in London, which has plans to begin offering custom data procurement services for clients in the coming months. EOSE advises brokerages on setting up and structuring data businesses, and acts as a data sales agent for brokers and niche data providers.
The company’s CEO, Suzanne Lock, describes the soon-to-be-launched service as a “data concierge” service that EOSE plans to offer free of charge to clients, and fund by charging vendors a “finder’s fee” if EOSE presents their data to clients.
“We want people to come to us with a wish list. It helps firms’ procurement teams, and it helps our niche provider clients to reach new audiences,” she told Max.
Firms might come to EOSE because they want to find a particular dataset that isn’t from their incumbent suppliers, or which might only be available when bundled as part of a more expensive data package by those suppliers. They might also be looking for a cheaper alternative, or to receive certain datasets through different delivery mechanisms.
As the field of alternative data providers continues to grow, as hedge funds need to find new sources of alpha in an ever-challenging marketplace, and as brokers want to differentiate themselves, this idea of data sourcing seems to be a more viable business strategy in today’s world.
And beyond EOSE, several others have already entered the field recently. White Rock Data Services, started by Crux Informatics co-founder Elizabeth Pritchard, launched last year. Chris Petrescu, a former data strategy executive at ExodusPoint and WorldQuant, started a company called CP Capital earlier this year. In May, AltHub and London-based alternative data specialist Alqami signed a global data distribution partnership that allows each company to distribute alt datasets sourced by the other. I feel like Neudata also falls into this space.
It would seem that there’s some heat around alt data sourcing, but it’s also important to remember that after the consultants and startups get into a space, it doesn’t take long for larger vendors to start sniffing around, too.
BNP & NLP
In last week’s column, I wrote about how advancements in the field of natural language processing (NLP) are leading banks, asset managers, and vendors to roll out interesting new data analytics services and platforms.
This week, Hamad Ali spoke with Raul Leote de Carvalho, deputy head of BNP Paribas Asset Management’s Quantitative Research Group, about how the firm is looking to go live in Q4 with a new NLP-based model that finds sentiment indicators in news reports to forecast company returns.
“We are not yet big users of text and NLP, but this is an area we have been developing recently, and where we expect to deploy some models,” he said.
What they’re working on is quite interesting, so I’ll leave the article to explain all of that, but here’s the key takeaway for this space: Leote de Carvalho’s quant team is building the machine-learning models to generate the sentiment signals, but the NLP work is being done by an (unnamed) vendor.
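To make that division of labor concrete, here is a deliberately toy sketch of a sentiment-to-signal pipeline. Everything in it is invented for illustration: the word lists stand in for the vendor’s (unnamed) NLP, the headlines are fabricated, and the averaging step is a crude stand-in for the quant team’s machine-learning models:

```python
# Toy lexicon-based sentiment pipeline. Purely illustrative: the word
# lists, headlines, and scoring scheme are invented, and stand in for
# the vendor NLP and quant models described in the article.
POSITIVE = {"beats", "strong", "growth", "upgrade", "record"}
NEGATIVE = {"misses", "weak", "loss", "downgrade", "lawsuit"}

def headline_sentiment(headline: str) -> float:
    """Score one headline in [-1, 1] from positive/negative word hits."""
    words = headline.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    hits = pos + neg
    return 0.0 if hits == 0 else (pos - neg) / hits

def company_signal(headlines) -> float:
    """Average the headline scores into a single per-company indicator."""
    if not headlines:
        return 0.0
    return sum(headline_sentiment(h) for h in headlines) / len(headlines)

news = [
    "Acme beats estimates on strong cloud growth",   # scores +1.0
    "Acme faces lawsuit over data practices",        # scores -1.0
]
print(company_signal(news))  # the two headlines average to 0.0
```

The real systems are vastly more sophisticated, of course, but the shape is the same: a text-scoring layer feeding a signal-construction layer, and the article’s point is that firms tend to buy the first and build the second.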
This is an important distinction: in the capital markets, while banks and buy-side firms are more than happy to hire intelligent engineers, data scientists, and quants with a wide array of skills, the NLP portion of these projects is still largely left to the vendor domain.
This is not to say that financial firms aren’t doing interesting things in-house with NLP (UBS Asset Management and Brown Brothers Harriman jump to mind), but for the general NLP tools starting to stream through the industry, the specialist firms are the ones to keep an eye on. I’m definitely missing some key vendors on this list, but here’s a small group of sophisticated NLP vendors: Causality Link is doing cool things in the ESG space; Liquidnet went out and bought OTAS Technologies because of how cutting-edge the company is in the field of NLP (and it’s now part of a new Liquidnet business unit called Investor Analytics); in the field of voice, GreenKey Technologies is top of class; and AlphaSense has won several of our AI awards thanks, in part, to its NLP prowess.
I am ALWAYS interested to hear about new, under-the-radar AI firms—as I’ve said before, fire me an email if there’s someone that WatersTechnology should know about.
That’s all for this week, see you next Sunday.
Copyright Infopro Digital Limited. All rights reserved.