Opening Cross: Get Ready for the New Age of 'Analatencytics'

I shouldn’t be surprised that low latency—which at the end of last year looked ready for a Lance Armstrong-like fall from grace, after trading firms began publicly questioning its rising cost and ever-decreasing benefits—has so quickly returned to the agenda in 2013, judging by the content of this week’s issue.
Illustrating that firms still place great value on being able to quantify latency and how it relates to execution performance, Corvil will this week announce another deployment of its latency monitoring technology. In the same vein, San Francisco-based developer 4th Story has built a tool that analyzes algorithmic execution performance against trade data benchmarks, and Wilmington, NC-based appliance vendor Cape City Command has released a free Latency Evaluator tool to identify the routing improvements clients could achieve by using its full-function product.
Meanwhile, data and trading infrastructure supplier Fixnetix has partnered with Netherlands-based network specialist Custom Connect to offer connectivity between NYSE Euronext and Eurex, using a microwave network that slashes latency by up to 40 percent over fiber. This kind of latency reduction is still so cutting-edge that Olav van Doorn, co-founder of Custom Connect, says he only knows of two other microwave networks serving this route—both built by latency-sensitive trading firms exclusively for their own use.
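For the back-of-the-envelope-inclined, the physics makes that figure plausible: light travels through optical fiber at roughly two-thirds of its speed through air, and fiber routes meander where microwave towers can follow a near-straight line. The sketch below makes the point with an illustrative route distance and fiber-path overhead; both numbers are assumptions, not figures from Fixnetix or Custom Connect.

```python
# Back-of-the-envelope check on the claimed fiber-to-microwave saving.
# The route distance and fiber path overhead are illustrative assumptions,
# not figures reported in this column.

C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47        # typical refractive index of optical fiber
ROUTE_KM = 400            # hypothetical straight-line route distance
FIBER_OVERHEAD = 1.15     # fiber rarely follows the straight line

def one_way_ms(distance_km: float, speed_km_s: float) -> float:
    """One-way propagation delay in milliseconds."""
    return distance_km / speed_km_s * 1_000

fiber_ms = one_way_ms(ROUTE_KM * FIBER_OVERHEAD, C_VACUUM_KM_S / FIBER_INDEX)
microwave_ms = one_way_ms(ROUTE_KM, C_VACUUM_KM_S)

print(f"fiber:     {fiber_ms:.3f} ms")      # ~2.256 ms
print(f"microwave: {microwave_ms:.3f} ms")  # ~1.334 ms
print(f"reduction: {1 - microwave_ms / fiber_ms:.0%}")  # ~41%
```

Propagation alone accounts for most of the saving; a straighter tower path does the rest, which is why "up to 40 percent" is credible before any equipment even enters the picture.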
And it’s this exclusivity to the wealthiest players that has led regulators to scrutinize latency and its enabling technologies following high-profile events such as the Flash Crash. A colleague asked me last week whether introducing a “minimum latency” level—i.e., holding up fast orders so as not to discriminate against slower traders—would be a feasible way to level the playing field. The concept of holding orders and executing everybody’s trades at set time intervals isn’t new (see the Open Platform from the May 11, 2009 issue of IMD, where Michael Mainelli argues that periodic auctions would reduce latency issues—albeit in the interests of lower power consumption and environmental awareness, since datacenter power consumption now accounts for two percent of global carbon emissions, and many datacenters waste 90 percent of the power they draw from the grid).
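To make the idea concrete, here is a minimal sketch of what such interval-based execution might look like: orders queue as they arrive and are matched only when the clock ticks, so shaving microseconds off your order path gains nothing within an interval. The interval length, order format and midpoint-pricing convention are all hypothetical, not drawn from any specific regulatory proposal.

```python
from collections import deque

INTERVAL_S = 0.5   # hypothetical batching interval; real proposals vary
pending = deque()  # orders received since the last auction

def submit(order):
    """On arrival, orders are queued rather than executed immediately."""
    pending.append(order)

def run_auction():
    """At each clock tick, match the best buys against the best sells.
    Quantities and partial fills are omitted for brevity."""
    buys = sorted((o for o in pending if o["side"] == "buy"),
                  key=lambda o: -o["price"])
    sells = sorted((o for o in pending if o["side"] == "sell"),
                   key=lambda o: o["price"])
    pending.clear()
    fills = []
    while buys and sells and buys[0]["price"] >= sells[0]["price"]:
        buy, sell = buys.pop(0), sells.pop(0)
        # midpoint pricing is one of several possible conventions
        fills.append((buy["id"], sell["id"],
                      (buy["price"] + sell["price"]) / 2))
    return fills

# A venue would call run_auction() every INTERVAL_S seconds; between
# ticks, being faster than rivals buys you nothing.
submit({"id": 1, "side": "buy", "price": 10.02})
submit({"id": 2, "side": "sell", "price": 10.00})
print(run_auction())  # [(1, 2, 10.01)]
```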
What piqued my interest, though, was how this would affect technology innovation, much of which has been driven by the need for low latency in recent years. After all, if you set an arbitrary latency “standard,” what incentive is there for firms to beat that time? And from vendors’ point of view, what incentive is there to invest large amounts of capital in research and development to produce faster systems?
Actually, there should still be plenty of incentive, though the focus will shift: instead of trying to be fastest because speed alone will win the race, firms will compete to be faster so that more tasks can be performed within any hypothetical “minimum latency” window—because the more analysis and risk checks you can perform within that prescribed time, the better placed you will be not only to execute a trade, but to execute the right trade.
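To illustrate, suppose a hypothetical one-millisecond floor were imposed. Since an order cannot execute before the floor expires anyway, every risk check squeezed into that window is effectively free, and the edge goes to whoever completes the most useful work before the deadline. A minimal sketch, with the budget, the checks and the order format all assumed for illustration:

```python
import time

BUDGET_S = 0.001    # hypothetical "minimum latency" floor: 1 ms
RESTRICTED = set()  # placeholder restricted-symbol list

# Pre-trade checks in descending order of importance; each returns
# True if the order may proceed. All three are hypothetical placeholders.
CHECKS = [
    lambda o: o["qty"] > 0,                       # basic sanity
    lambda o: o["qty"] * o["price"] < 1_000_000,  # exposure limit
    lambda o: o["symbol"] not in RESTRICTED,      # restricted list
]

def vet_within_budget(order):
    """Run as many checks as the floor allows, most important first.
    Work done inside the window costs nothing extra, because the order
    cannot execute before the floor expires anyway."""
    deadline = time.perf_counter() + BUDGET_S
    completed = 0
    for check in CHECKS:
        if time.perf_counter() >= deadline:
            break                    # budget exhausted; stop checking
        if not check(order):
            return False, completed  # order rejected
        completed += 1
    return True, completed

ok, n = vet_within_budget({"qty": 100, "price": 10.0, "symbol": "ABC"})
print(ok, n)  # True 3 — all checks fit comfortably inside 1 ms
```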
In short, the competitive edge will shift from simply achieving the best price and hoping to make a quick buck from short-term price movements, to incorporating more fundamental analysis into trading decisions, to capitalize on what promise to be the most profitable and least risky short-term movements. So potentially, your competitiveness will no longer hinge only on latency (whether imposed by regulators, by marketplaces, or by components of the data and trading infrastructure that have hit the physical limits of what they can achieve) but, crucially, on what you do during that latency: how many other functions you can perform, how fast you can perform them, and how you combine latency and analytics.
See, Lance, it’s not just about being fast—it’s what you do while being fast.