Max Bowie: The Next ‘Big’ Thing: Big Service

Max Bowie, editor, Inside Market Data

In Setting the Table, Danny Meyer, owner of Union Square Café in New York, describes an early example of big data in action, and its evolution. He outlines his father's business running tours of Europe, and how he gained a vast knowledge of people and places, food and drink to satisfy the most discerning tourist's palate, and built a mental cross-referencing database of flavors to pair the most complex and diverse plates and wines, all of which formed the basis for creating one of the city's most consistently highly rated restaurants.

But, just as it takes more than a premise and a price feed to make profitable trades, it took more than that to make and keep the restaurant successful. For example, just as a trading strategy in the financial markets evolves by taking note of changing market conditions, Meyer's staff constantly collects and records data about customers and their preferences: through feedback cards accompanying the check, by engaging customers in conversation when they make a reservation, or from their interactions with the wait staff. By knowing whether a diner is a new customer or a regular, what table they prefer, their taste in wine, and what kind of experience they had on previous visits, the restaurant can tailor its service to provide the best experience for each, making them more loyal customers and maximizing those relationships.

Though big data is a recent catchphrase in financial markets for collecting, processing and analyzing enormous quantities of information that are not officially “market data,” but may nevertheless impact prices and markets, the concept of incorporating a wide variety of disparate yet impactful datasets isn't new. Even before the latest generation of news and social media sentiment analysis, a class of traders grew up trading on news and their predictions of how the market would react. Before electronic news feeds, speculators and news barons used the telegraph, carrier pigeons and semaphore towers to communicate quickly across distances so they could take advantage of market-moving news in markets relying on slower methods of communication.

Besides news about commodities or companies, traders realized that macroeconomic and geopolitical events also impact markets in general and specific sub-sets of securities or derivatives. This became so competitive that in the US, the release of market-moving government figures and reports is strictly controlled, with rules governing the precise time at which reporters can transmit stories from so-called “lockup” rooms.

Then, over time, the types of news that could affect a company's price expanded, and hence so did the sources from which traders needed to capture data. The impact of weather on crop yields made weather data invaluable to commodities traders. A gaffe by a CEO in the society pages—or nowadays, a careless blog post or tweet—could spell disaster for a company's stock price. And so, not only did traders and aggregators need to capture and process unstructured data like news stories, but they also began to trawl the web for sources that others either hadn't yet discovered or hadn't learned how to decipher and understand meaningfully.


Indicators
Other new data sources include analysis of public sentiment expressed via Twitter as a leading indicator of price movement, and behavioral analysis of historical investor activity in response to price movements, such as that provided by the now-defunct Titan Trading Analytics—itself a lesson that technical innovation must always be pursued with customer service in mind.

Capturing, formatting, storing and retrieving—then actually analyzing—this data has created a new cottage industry of high-performance database tools, even as others still argue about the merits of big data architectures and whether they can deliver the results they promise for traders. After all, firms may ultimately get the most value not from trying to apply big data to split-second deals, but from using it in the way it was originally intended: to understand that split-second activity and for whom existing inputs are most valuable, and ultimately to provide better customer service, so that clients turn to a broker or venue not because it is the fastest or cheapest, but because it is the best.
