Brad Peterson’s first brush with high finance was, by all accounts, an exercise in patience.
John Reed, who would become Citi’s storied CEO and chairman, had recruited Peterson’s father from the Ford Motor Company in 1970, the idea being that bringing people from outside financial services into the industry would inject fresh views on how to adapt to a changing world.
But for the younger Peterson, the questions of finance’s complex relationship with technology were far from his mind. “I remember being in Reed’s office, shooting paper clips into the ceiling when I was 12, thinking ‘why is my dad working on Saturday?’ Now, years later, I get it—working in tech is a round-the-clock job,” he recalls. Indeed, little did he know, then, how that scenario would come full circle for him later in life.
Californian by birth, the future CTO and CIO of Nasdaq had always been surrounded by technology, from when his father told him, in the 1960s, that computers would one day be in cars, through to his time at UCLA, where the first node of the internet was being created. Degrees in electrical engineering and postgraduate qualifications from MIT followed, but Peterson first went into the telecoms industry, where he worked on the 2G cellular standard.
Stints followed at Charles Schwab, where he built the company’s first WAP-enabled application; at Epoch Securities, as CIO, before its acquisition by Goldman Sachs; and at eBay, as CIO, shortly before its purchase of PayPal. He returned to Schwab as CIO in 2008, before Nasdaq came knocking, five-and-a-half years ago, with the offer of a job running its technology from New York.
A Technology Company
It would have been a hard job for any technologist to turn down—few names are as synonymous with technology in the capital markets as Nasdaq. Now the second-largest exchange in the world by market cap, it was founded in 1971 by the National Association of Securities Dealers, from which it draws its name, with the objective of being the first fully electronic stock exchange. Since then, it’s become the natural home for the tech industry, boasting by far the greatest number of listings for technology companies across a range of sectors.
Stock offerings are only one part of the company, however. It is one of the few exchanges that still truly straddles the divide between market provider and market technology provider, with both aspects being intrinsic parts of its core identity. “I think there are a lot of companies that are saying they’re technology companies, that are not,” says Peterson. “But I would say, for Nasdaq, we check the box. We’re a technology company.”
Exchanges, like banks, are rarely known for their simple corporate structures. Whether it’s the use of somewhat arcane executive ranks—see the London Stock Exchange’s chief of staff role, as an example—or their global distribution, it can be hard to get a sense of how an organization pursues the development and use of technology. Nasdaq is no different, but to understand how deep technology runs in the fabric of the institution, it’s important to understand how tech operates within the company.
As CIO and CTO, Peterson is responsible for Global Technology, and commands a staff of roughly 1,800, including contractors—or approximately 40 percent of Nasdaq’s workforce. He is fully responsible for delivering technology for the internal needs of Nasdaq’s operations, but also to the external-facing arm of the exchange group, known as Market Technology.
This division, acquired through the merger of Nasdaq with Sweden’s OMX Group in 2007, is led by Lars Ottersgård, head of Market Technology. It is responsible for seeing to the needs of all the exchange’s many external technology clients. Ottersgård himself is a study in contrasts to Peterson. Born in the Swedish town of Växjö, he grew up in Eskilstuna, located about 75 miles from the capital, Stockholm, and jokes that he is from an “uneventful background.”
He didn’t go down the traditional route of a college undergraduate degree in engineering or computer science, but had four years of technical education in high school. Instead, he joined IBM, where he spent nearly 20 years in a variety of roles, beginning as a technician, and later moving into sales. He moved to OMX Group in 2006 and has remained with the company ever since, eventually being promoted to his current role in 2014, running the exchange group’s vendor arm.
That business is thriving. It supplies banks, brokers, other exchanges, regulators and, increasingly, buy-side firms with a number of technologies, ranging from the matching engines through to post-trade software, and risk and surveillance products, including its popular Smarts platform.
That it has lasted as long as it has is something of an oddity in the world of modern securities exchanges. In the early 2000s, most venue operators, including the London Stock Exchange, the Chicago Mercantile Exchange, and Nasdaq’s then-arch-rival, NYSE Euronext, ran vendor arms; most have since pared back their offerings, internalized them, or eliminated them altogether.
Not so with Nasdaq. According to its 2017 full-year filings with the US Securities and Exchange Commission, Market Technology had an order intake of $292 million, growing from $276 million in 2016 and $271 million in 2015. In addition, the technology sold to clients is broadly the same as that used by the exchange itself, the origins of that decision being a key factor in explaining why it has continued to succeed in this dual role where its rivals have pulled back.
Back in the early 1990s, what was then the OM Group found that it had a problem—it wanted to build out an options trading venue for the Swedish market, but found that there was a distinct lack of suitable software for sale. The decision was taken to build internally instead, and almost immediately, inquiries from other exchanges began rolling in. “The first market that was using our own technology was the Wiener Börse,” says Ottersgård. “And so we built this technology to be deployed in a configurable manner to support many different market models and markets.”
Fast forward to the present day, and it has grown enormously from its relatively humble beginnings, through expansion, the merger with Nasdaq, further acquisition and development. Still, executives say, the focus on maintaining that approach—building in a way that can serve external clients while also solving for internal problems—is key. “That is in our genes, it’s the background for both [OMX and Nasdaq], and we see that building technology for this industry that has been a core competency and then we are really strong in deploying this technology to other markets, or using it for our own markets,” says Magnus Haglind, head of product management in Market Technology, who serves as Ottersgård’s right-hand man along with risk and surveillance head Valerie Bannert-Thurner, and head of sales Paul McKeown.
While Market Technology has its own technologists, it relies almost entirely on Global Technology and Peterson’s developers to fulfill its requirements, driven both by client demand and a need to remain ahead of the curve. Around 1,100 staff directly support Market Technology, he says, 880 of whom come from Global Technology.
That, in turn, has informed how decisions are taken to develop technology at Nasdaq as a whole. Whereas when he first joined, Peterson explains, most decisions to develop were driven by internal requirements, it is now a mix of that and requests from Market Technology that inform the direction of future efforts. “I would say it has changed over time,” he says. “So, it was predominantly internal-tech-team-focused and then delivered externally for Market Technology and its clients. Now our approach is bilateral in nature—it can come from both ways, which is great. And some of it is that we’re finding, because we’re a global company with clients and tech hubs worldwide, we can go where the regulators, clients and partners are ready to collaborate with us on new innovations. So we look for opportunities for being able to deploy new technologies both with our external Market Technology clients and our own markets.”
Despite this, the approach to how the exchange engages with these technologies is relatively arms-length in nature. “We operate in a pretty conservative industry, and are highly regulated, so from that standpoint there’s not an interest in rocking the boat too much. However, as a technology company that also runs our own markets, it’s on us to throw new concepts against the wall to see what may stick and evolve capital markets technology to its next phase,” says Haglind, a point echoed by Peterson, who says timing is often critical. “That’s what, most of the time, you get wrong. You have to understand that there’s that whole cycle of where are you on the early adopter curve versus being too late,” he says. “So, first of all, we make our list of things that we think are going to be most important for us. And it’s usually from an opportunity standpoint predominantly, and then we also look out for, if we’re too late, which one is going to likely disrupt us the most.”
Part of this slight reluctance to go the whole hog is due to regulatory attitudes, but also perhaps flavored by missteps from the past, as well. For instance, around the start of the decade, Nasdaq attempted to launch a cloud platform, FinQloud, for the financial industry. However, the key targets weren’t willing to commit at that point. Parts of the technology became other platforms within Nasdaq, and Amazon used some of it to build out its financial marketplace, but the risks of being too far ahead of the curve were evident from that experience. Other attempts to put mission-critical functions on the cloud also fell flat. “We were an early adopter of the cloud, and we were able to get an application for ourselves and our clients through our regulator to do archiving in the cloud. And then we tried to move risk products to the cloud for a couple of the big banks and they were not ready at that point,” Peterson says.
For Nasdaq, the current pipeline of emerging technology, in order of time to impact and maturity, looks something like cloud computing, followed by artificial intelligence (AI), then distributed-ledger technology (DLT), then quantum computing. It is examining all of them, even if, as Ottersgård says, some of the more radical promises of technologies like DLT seem like they won’t materialize any time soon. While it sees promise in DLT, he says, much of the hype surrounding it is “probably a bit exaggerated.”
Machine learning, however, is a different story. The exchange is already using it extensively in its internal processes, but its most visible efforts to date have been through its July 2017 acquisition of behavioral science specialists Sybenetix, and the deployment of machine learning for market surveillance on its Nordic exchanges. In April 2018, it announced, true to past form, that it would be licensing the same technology to Hong Kong Exchanges and Clearing.
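Nasdaq has not published the internals of its surveillance models, but the basic idea behind machine-driven market surveillance can be illustrated with a deliberately simple anomaly screen: score each trade by how far it deviates from recent behavior and flag outliers for a human analyst. The function name and threshold below are hypothetical illustration, not Nasdaq's actual technology, which uses far richer features and learned models.

```python
from statistics import mean, stdev

def flag_anomalous_trades(trade_sizes, threshold=2.5):
    """Flag trades whose size deviates from the sample mean by more
    than `threshold` standard deviations (a simple z-score screen).
    Purely illustrative; real surveillance systems learn behavioral
    baselines per trader and instrument."""
    mu = mean(trade_sizes)
    sigma = stdev(trade_sizes)
    if sigma == 0:
        return []  # no variation, nothing stands out
    return [i for i, size in enumerate(trade_sizes)
            if abs(size - mu) / sigma > threshold]

# Example: mostly routine order sizes, one suspiciously large print.
sizes = [100, 120, 95, 110, 105, 98, 5000, 102, 115, 99]
print(flag_anomalous_trades(sizes))  # → [6]
```

The value of machine learning over a fixed screen like this one is precisely that the "normal" baseline is learned per trader, instrument and time of day, rather than hard-coded.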
“I actually believe that AI will have a bigger impact on the industry maybe than DLT,” says Haglind. “We are really engaged with it and I think with stronger advances in machine power, like with quantum computing coming, it will have an even bigger impact as you move forward. They are becoming available as compute power gets stronger. So you will see the capability of AI increasing and we are working on it in several areas.”
This idea of various components working together, rather than being taken in isolation, with cloud as a foundational technology is a key part of Nasdaq’s thinking when it comes to its modern approach to development. In no area is this better expressed than with the Nasdaq Financial Framework (NFF).
The Financial Framework
Launched by then-COO, now CEO and president Adena Friedman in 2016, the NFF is a common platform on which all Nasdaq technology runs. Applications are built on top of the framework, with clients able to do the same, rather than building full-stack software packages in isolation. “I think it is a fundamental shift of what we’re doing,” says Ottersgård. “When we started this whole NFF journey, it was to put all of our capabilities on one common stack, to ease integration, to ease data management and so on. But we have, in the past couple of years, come to the conclusion that deployed software is not the future. So now NFF is becoming a platform which can be both deployed on-premises where needed, and as a platform-as-a-service to provide services where needed, whether that’s to our internal markets, or to the external clients we have.”
Indeed, the broader scope of what the NFF could offer is something that has technology executives at the firm palpably excited. Rather than just being a “point solution,” Haglind says, it will form “the backbone of a broader compute and data platform.” Peterson, for his part, describes the potential for NFF to become “a new architectural paradigm,” both within Nasdaq and for external clients.
Possibilities discussed for how this could play out include provisioning full exchange trading systems and data architecture without the need to build expensive physical datacenters to house the matching engines and order-entry systems, or providing new ways to take in and analyze data from trading activities in a range of mission-critical functions. “One of the reasons for this is that we believe in the connected world, where you build ecosystems that can share common platforms, common technologies. So you can do it locally, in your own microcosm, and you can do it regionally,” Ottersgård says. “The ultimate belief is that there’s a need to connect markets across the globe, over time. We see NFF being a foundation, not only to ease the implementation, integration and data management of our different applications, but actually a platform for managed services and solutions, as well as connecting ecosystems.”
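Nasdaq has not published the NFF's programming interfaces, but the layering the executives describe, applications plugged into a shared platform for integration and data management rather than isolated full stacks, can be sketched abstractly. Every name and interface below is hypothetical illustration and does not correspond to any actual NFF API.

```python
# Hypothetical sketch of a shared-platform architecture in the spirit
# of what Ottersgård describes: applications share one framework for
# integration and data distribution instead of each shipping a full stack.

class Framework:
    """Common services (registration, event bus) shared by all apps."""
    def __init__(self):
        self._apps = {}

    def register(self, app):
        # Apps plug into the platform instead of wiring up their own stack.
        self._apps[app.name] = app

    def publish(self, event):
        # Shared data bus: every registered application sees every event.
        for app in self._apps.values():
            app.on_event(event)

class App:
    """Base class: applications implement behavior, not plumbing."""
    name = "base"
    def on_event(self, event):
        pass

class SurveillanceApp(App):
    """Toy consumer that flags unusually large trades."""
    name = "surveillance"
    def __init__(self):
        self.alerts = []
    def on_event(self, event):
        if event.get("size", 0) > 1000:
            self.alerts.append(event)

fw = Framework()
surv = SurveillanceApp()
fw.register(surv)
fw.publish({"type": "trade", "size": 50})
fw.publish({"type": "trade", "size": 5000})
print(len(surv.alerts))  # → 1
```

The design choice being illustrated is that a matching engine, a surveillance module or a post-trade service would all consume the same bus, whether deployed on-premises or as a managed service.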
Such ambition, of course, requires talented people in order to accomplish—a problem that doesn’t seem to faze Peterson. At a time when banks, buy-side firms, brokers and nearly all other market participants are keen to bemoan the lack of talent available to them, Nasdaq seems remarkably unconcerned about any potential talent drain.
Peterson, for instance, has access to MIT’s externship job postings, which are only open to alumni. Interest in Nasdaq’s projects at the university, from which it recruits its machine intelligence team, has skyrocketed. “We participate in the January externship and summer internship program. We first started that five years ago,” he says. “We posted four jobs and received 40 resumes. The next year we posted 10 and received 80, and now we’ve posted 10 positions and we got 124 resumes. For the January externship program, we are one of the largest hirers, right alongside the big tech firms.”
Discussions about technology at stock exchanges often focus on the speed of matching engines, the cost of market data, or microsecond-level latency reductions, which, while important, can be a sideshow when considering the broader picture of technology development.
Looking ahead, Nasdaq continues to explore DLT, cloud, quantum computing and other emerging technologies, including the potential use of augmented reality for surveillance in the NOC, its market control center in midtown New York. At the same time, it has been keen to stay close to its historic core base, technology companies, to inform its future direction. It has also looked to other industries, as John Reed did all those years ago when he hired Peterson’s father from Ford, to help it understand where it needs to take action, and where it could improve in the modern era.
What all of this ultimately comes back to is determining Nasdaq’s future position in an industry undergoing enormous change, both due to market forces, but also due to technological advancement. As the company begins to hit its stride in its dual identity as a vendor and as a venue, Ottersgård says, it also sees itself as being a part of that change.
“It’s like the old saying from Henry Ford, where he said if he asked the people what they wanted, they would have said faster horses,” he says. “We can’t just solve for short-term problems, we have to be the figureheads and the thought leaders for what the industry is going to need in the future.”