Apologies that this column is coming out a day late. As you all (hopefully!) know, last week was the beginning of our inaugural WatersTechnology Innovation Exchange, a three-week event that concludes on September 22. (If interested, you can register here, and it’s free for end-user firms. All panels are available on demand up until the 22nd.)
So this week on WatersTechnology.com, we’ll have a bunch of stories going up from the event, but first we need to get caught up on some news. Lots to get to, so let’s get to it.
Schroders Goes Negative
If you’re in the mood for an interesting case study, Josephine Gallagher spoke with Emmanuelle Mathey, Schroders’ head of credit risk, about how the asset manager is using vendor Owlin to identify negative news sentiment linked to credit risk.
Essentially, Mathey’s team was previously using a different vendor that would send the credit risk team a news dump via email every day. The six-person team is responsible for identifying risks across 420 parent companies and 850 underlying legal entities. Especially for the biggest banks and brokers, that’s a flood of news coming in on a daily basis. Owlin was able to compress that information and visualize it for Schroders, proving its worth after a three-month trial.
The highlight of the product, though, is that it can bring forth negative sentiment in an article, which is most important for determining credit risk, she says. The moral of the story is that when it comes to incorporating new alternative datasets or using analytics platforms, don’t be afraid to cut bait. If the service isn’t delivering what you need, move on.
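To make the idea concrete, here is a deliberately simple sketch of how a news dump might be compressed into a short list of credit-risk alerts. This is not Owlin’s actual method, which the story doesn’t detail; the lexicon, threshold, and function names are all illustrative assumptions, just showing how negative sentiment can be scored per article and surfaced per entity.

```python
# Hypothetical sketch only -- not Owlin's actual approach.
# Score each article by the share of words hitting a small negative
# lexicon, then keep only the items worth a human analyst's look.

NEGATIVE_TERMS = {"default", "downgrade", "lawsuit", "fraud", "insolvency", "writedown"}

def sentiment_score(text: str) -> float:
    """Fraction of words in `text` that appear in the negative lexicon."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,;:!?") in NEGATIVE_TERMS)
    return hits / len(words)

def flag_alerts(articles, threshold=0.05):
    """Return (entity, score, headline) tuples above the alert threshold."""
    alerts = []
    for entity, headline, body in articles:
        score = sentiment_score(body)
        if score >= threshold:
            alerts.append((entity, round(score, 3), headline))
    # Most negative items first, so the team triages worst news first.
    return sorted(alerts, key=lambda a: a[1], reverse=True)

articles = [
    ("Bank A", "Bank A posts record quarter", "profit rose and outlook improved"),
    ("Broker B", "Broker B faces lawsuit", "regulators allege fraud after downgrade"),
]
print(flag_alerts(articles))  # only Broker B is flagged
```

A real system would obviously use far richer NLP than a word list, but the shape of the workflow is the same: hundreds of entities in, a handful of ranked alerts out.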
Open Source Comes to the Forefront
The good thing about “moving on” from one vendor to another is that it has never been easier.
First, capital markets firms are increasingly embracing the major public cloud providers. Second, as tools and platforms move to the web, tech firms and end users alike are leveraging APIs to deliver their services. Combined, these two movements are helping to push along the desktop application interoperability movement in the capital markets. And, finally, while banks had previously bristled at the idea of using open-source tools, much less contributing to those communities, that ethos is quickly changing.
Additional proof of this movement came this week: As WatersTechnology first reported, Morgan Stanley is contributing Morphir to the Finos community. Morphir allows software developers to achieve more functionality while writing less code. The code it creates can be implemented within another platform, regardless of the codebase it’s written in, Andrew Robinson, head of funding and finance technology at Morgan Stanley, told Reb Natale. In addition to Morgan Stanley’s contribution, EPAM Systems is contributing GLUE, a data model that is part of its Wave ecosystem for asset and wealth managers, to the community.
The Morgan Stanley contribution shows how the biggest investment banks are not just “taking” open-source tools, but also making them. Again, that was not always the case. And the EPAM piece shows how Finos is trying to expand its reach into the buy side, which has proven difficult for the foundation.
As Finos has built its own foundation of members from the capital markets, the organization is now looking to expand following its induction to the Linux Foundation earlier this year. As we previously reported, Finos sees a natural transition to the retail banking space—since it is already having success at bringing on the investment banks—and it is carving out a niche in the regtech world, as well.
The beauty of open source, though, is its collaborative nature. At the turn of the century, banks and asset managers were all about the “proprietary” lifestyle, adhering to the secretive mores of Wall Street. The Financial Crisis then pushed firms to meet newly mandated requirements and incorporate new regulatory reporting and data cleansing/governance tools. Suddenly, everyone was dealing with the same problems at the same time, and, to some extent, proprietary projects took a backseat—while cloud, APIs, interop, and open source made vendor partnerships easier and more palatable.
As a result, what you see today is companies like Schroders using vendor technology, but then customizing the product in-house to make it suit the specific needs of the credit risk unit. Or, as we reported recently, BNP Paribas Asset Management is partnering with a specialist in the field of natural language processing (NLP) while weaving in its own machine-learning expertise to find sentiment indicators in news reports to forecast company returns.
And then jumping back to Finos, the foundation, too, wants to ally itself with a wider array of firms. Analytics tools that are useful to capital markets firms, for example, can also be useful for companies in healthcare, transportation, real estate, or government—and vice versa. While speaking with Reb, Gab Columbro, Finos founder and executive director, indicated that the foundation is clearly thinking about ways to expand beyond the walls of Wall Street.
“While I’m not ready to make any formal statement as to where Finos is going to be, we certainly are seeing that the journey and the successes that we’ve experienced can be applied to other adjacent fields for sure,” he told Reb.
Finos’ flirtations with other industries remind me of another company that knows Finos well—Symphony. A few weeks ago, Reb broke the news that Symphony was looking to expand into the KYC/AML space as part of an ongoing rebrand.
The move was surprising to some, as a spokesperson at Refinitiv, when asked about Symphony developing a potentially competitive product to its own KYC/AML managed service, said, “I think this is a smaller AI-focused provider of AML services, so we wouldn’t comment on their launch. Don’t think it’s the messaging service [provider] moving into AML.”
I speculated that this move into KYC is part of a wider strategy for the collaboration-tools provider to expand beyond the capital markets, as there have been rumors (that I’ve heard, anyway) that the company would like to start cashing in on the healthcare market with its comms-and-collab suite of services.
That’s what the democratization of technology does—it opens up the field of potential clients for a vendor or, conversely, it opens up the universe of vendors that can be brought in house at a bank or asset manager.
The only thing for sure is that lines are going to blur when it comes to “fintech.”
Trust Me, Blockchain is the Cure
When Max Bowie pitched the idea of writing a feature about zero-trust architecture, I said, “What the hell is that?” Well, a couple months—yes, months—later, Max has written the definitive piece about ZTA in the capital markets. You can read his opus here.
As a very quick explainer, zero-trust architecture offers firms the ability to lock down access to sensitive data and systems by assuming that no inherent trust exists between systems or users just because they are on the same network. As more and more individuals look to work remotely, ZTA is likely to take on greater importance, as Max explains. But in reading about these architectures, the word “blockchain” kept leaping to the front of my mind…and I’m never happy when I have to think about blockchain.
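The core principle is easy to sketch: instead of trusting anything that sits on the right network segment, every single request is authenticated and then checked against an explicit policy. The snippet below is a minimal illustration of that idea only; the token store, policy shape, and names are all hypothetical, not any vendor’s API.

```python
# Minimal zero-trust sketch (illustrative assumptions throughout):
# no request is trusted because of where it comes from -- identity and
# policy are re-verified on every access, even from "inside" the network.

from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    user: str
    token: str      # proof of identity, presented and checked every time
    resource: str
    action: str     # e.g. "read", "restart"

# Stand-ins for an identity provider and a policy engine.
VALID_TOKENS = {"alice": "tok-123", "bob": "tok-456"}
POLICY = {
    ("alice", "market-data-server", "read"),
    ("alice", "market-data-server", "restart"),
    ("bob",   "market-data-server", "read"),
}

def authorize(req: Request) -> bool:
    """Authenticate, then authorize -- on every single request."""
    if VALID_TOKENS.get(req.user) != req.token:               # who are you?
        return False
    return (req.user, req.resource, req.action) in POLICY     # may you do this?

print(authorize(Request("alice", "tok-123", "market-data-server", "restart")))  # True
print(authorize(Request("bob", "tok-456", "market-data-server", "restart")))    # False
```

That re-verification on every request is also exactly why the bank executive quoted later in this column finds ZTA cumbersome: the checks never get waived, even for the people who maintain the systems.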
The thing about ZTA is that it’s not a specific type of technology—it’s an architecture, and as such, firms must decide for themselves which tools will achieve their desired results. And you know what happens when there’s ambiguity? Blockchain steps in through the confusion. Mark Kovarski, co-founder and CTO of startup incubator and accelerator Alegious Innovative Partners, told Max that blockchain technology can be used to validate and authenticate users and endpoints trying to access systems and data. Also, blockchain’s whole sales pitch is an immutable distributed ledger that’s supposed to be tamper-proof.
I’m skeptical, as always, when blockchain is involved. But if the effort put into blockchain development can be used for something good like ZTA, I think it’s a premise worth exploring.
The real question is whether banks and asset managers will fully embrace ZTA. While the tech around ZTA is improving (and that’s totally exclusive of blockchain), it can still be cumbersome for those who want to move quickly and break things—or for those who simply need to put out fires.
As one market data technology executive at a major US bank told Max, their firm’s ZTA program forces staff to obtain multiple levels of permission each time they need to access servers running data systems—even if their job is maintaining those systems on a daily basis—which can result in routine tasks taking up to 20 times longer than before.
Right now the case for ZTA is strongest because firms will need to figure out how best to create a secure, remote workforce. Even if the ultimate plan is to have the majority of traders and portfolio managers back in the office, it’s unlikely that most will go back to pre-Covid work-from-the-office staffing levels. ZTA can be the answer—and I’m sure those in the blockchain space will be more than willing to assist.
Innovation Exchange Quote Check
In next week’s column I’ll look to connect the dots between the various panels that we’ve had at the Innovation Exchange. Unfortunately, I didn’t have time to listen to many panels, but I’ll get caught up over the next few days. So, for the time being, I’ll leave you all with some quotes from the event pertaining to the field of artificial intelligence. (Thank you to Wei-Shen Wong for transcribing these…a total lifesaver!)
Laura Hamilton, managing director, Bank of America Merrill Lynch
“All of these models are dependent on data, whether they’re using artificial intelligence, machine learning, deep learning—all of these require a good set of data, right? So, and quite frankly, all models, even if you’re writing your models and you’re being very deliberate through your model methodologies, like whether you’re using R, or Python or MATLAB, or any of these modeling languages. So the data is definitely going to be an impediment for any kind of model.”
Michael Natusch, global head of AI, Prudential Plc
“The truth is the data is always inconsistent, outdated, contradictory, dirty, and it’ll never be better. But the reality is also the humans are making decisions based on that data. So it’s 2020 now and we have a whole bunch of tools that allow us to make sense of that data as well. If we just think about how the human makes a decision, that’s a good guidance for us to at the very least start to be better in the same point.”
Eric Tham, senior lecturer and consultant, data science practice, at the National University of Singapore (NUS)
“To understand how AI works in finance, there’s a need to understand cross-disciplinary finance and how AI models work … Otherwise, AI will purely just be curve-fitting, and would just be trying to find relationships. That has been raised by many AI experts and the thought was that if it’s just curve-fitting, soon AI could enter another winter, unless [firms] infuse into AI models the understanding of financial theory, such as stochastic calculus and behavioral finance contextual memory. These are some of the fields in finance that could help explain how AI models work.”
Kirill Petropavlov, director for AI innovation at Bank of Singapore
“I think banks, actually, even more than any other organizations, are focusing on being able to explain things properly because we have to do that, right? … But I think it’s important to mention that, that’s why progress of AI in banks is actually much harder and takes much longer because, we cannot just go full-experiment mode and just let it run and use customers to experience this model, and just hope for the best, right? So we have to do the really hard work to bring it to the point where we can say, ‘Okay, I know what it’s doing; I know why it is doing [that]; and I know that it’s doing the right thing.’”
Emine Yilmaz, professor and Turing Fellow at the department of computer science at University College London
“Especially for the cases where AI is being used for critical decision making, I think development and explainability have to go hand-in-hand. I don’t think people will start relying on AI or trusting on AI in the long run, if we don’t give proper explanations to people.”
The image at the top of the page is Georges Seurat’s “Café-concert” courtesy of the Cleveland Museum of Art.
Copyright Infopro Digital Limited. All rights reserved.