Alternative and ESG datasets hold the promise of delivering better and more predictable returns for investors, but are some firms underestimating the amount of work required to integrate these into their strategies?
Investment firms are increasingly seeking out sources of ESG (environmental, social, and governance) and alternative data to deliver alpha, despite the associated costs and practical challenges of acquiring, managing and integrating the data into their investment strategies, according to a panel at the North American Buy-Side Technology Summit, hosted by WatersTechnology.
“By using alternative data, you can get so much more [insight]… than the limited information that a company makes available in its financial statements,” said Mike Chen, senior portfolio manager at PanAgora Asset Management. “If others are using it and you’re not, then… you’re being left behind.”
According to an audience poll conducted at the event, 50 percent of firms in attendance are already using alternative data to support investment strategies, and another 30 percent are planning to incorporate it into their strategies.
Portfolio managers are moving away from traditional styles of investing and are looking at how to “recapture that alpha,” said Lisa Conner, head of client services for North America at Rimes. “We’re seeing a lot of questions from our clients about ESG.”
This reflects a shift in the composition of the global economy, away from manufacturing stocks to a point where the largest US companies include Google, Facebook and Amazon, whose assets are more IP-based and less tangible than those of a manufacturer. To measure the performance of these companies, one must look at how efficiently they are run, and ESG factors such as pollution feed into that assessment, Chen said.
Other panelists noted that companies with better ESG practices tend to be less volatile than their rivals, while panel moderator Dessa Glasser, principal of the Financial Risk Group, added that ESG-focused companies also tend to manage their risk better and be better run overall.
However, Chen warned that firms shouldn’t try to make their strategy fit the latest cool datasets, but should take advantage of them only where they fit a specific investment need.
“For example, if we have a pharma strategy… we come at it as an investment question first, then look for datasets,” he said. “We try to think like an executive in an industry, and we ask ‘How would that help me?’—then we can see where that data would be useful to us.”
PanAgora then evaluates any ESG dataset in the same way it would treat any other alpha factor.
“It goes through rigorous testing, just like any other type of research. One factor we assess is ‘materiality’—whether an ESG factor is material to a company’s core operations. For example, carbon emissions for industrials or material manufacturers. And we find that if a company improves its carbon footprint, its overall performance tends to improve, too,” Chen said.
Refinitiv (the former Financial & Risk unit of Thomson Reuters), which aggregates sources of alternative data, also assesses each dataset. “We look at how strong the signal is, and what is the duration of the signal—for example, is it sentiment for day trading, or something that can give you insight over a calendar quarter, or longer term,” said Mahesh Narayan, head of portfolio management and research at Refinitiv.
But obtaining data in a form suitable to conduct that research and testing can be a challenge in itself, including the application of machine learning and artificial intelligence tools to ensure data is properly tagged so that it can be correlated to market data, or that transcripts are correctly parsed so that accurate sentiment scores can be applied, Narayan noted.
“Machine learning is the base layer of what we’re doing—it doesn’t matter to customers. But what does matter to customers is the accuracy of the system,” said Nathaniel Storch, CEO and co-founder of New York-based natural-language processing and sentiment analysis provider Amenity Analytics. “To extract data from human language, you need an enormous amount of compute power and pattern-matching capabilities to extract information. Some platforms top out at 70 percent accuracy. You wouldn’t hire an analyst who was only 70 percent accurate. So we’re geared to delivering human-like levels of accuracy.”
However, managing these data types with the same levels of precision as mainstream market data comes at a cost: “The cost of bringing in data has gone down, in line with the costs of technology, but the legal and contracting costs have not gone down,” Narayan said.
“Yes, the cost of ingesting data has gone down alongside the cost of technology, but… it’s a ‘heavy lift’ to hire the data scientists to handle this data,” Conner said.
Even “free” sources of publicly available data come with a cost, Chen noted. “We use a range of sources… Some we collect ourselves—we have very savvy people, and we can scrape sources from the web. But it takes time and is a lot of work.”