Security, Accessibility and 'Holding a Mirror to It'

Reflecting on one of the year's biggest questions, technical and political.

Tim Bourgaize Murray

In the wake of recent events in Paris and elsewhere in the world, Tim takes a look at the technological side of an eminently contemporary problem: in the markets, just as with public safety, should we design data hubs and communications platforms with a 'back door' in mind?

Forewarning: this will be a bit more of a think piece than usual. The past few weeks' events — in Paris, in Beirut and in the skies over the Sinai desert — deserve a week's departure from the norm.

Of course, the sickening brutality we've seen in these places compares in no way to a financial technology problem; it never could. By coincidence, Waters profiled three Paris-based executives consecutively earlier this year — from Euronext, Tobam Asset Management, and Credit Agricole. We regularly dial +33 to speak with them and many other sources for our stories, and we wish them, their colleagues and everyone affected by this global series of attacks nothing but safety and recovery.

The Tech Vector...

One of the things I've heard analyzed repeatedly — though not nearly enough, given that the unfortunate rhetoric in the United States has focused instead on the challenge of accepting Syrian refugees — is how such events could be organized and coordinated without authorities knowing. The phrase that's been used is "going dark."

The dark web — part of the deep web, which simply describes the corners of the internet that aren't indexed by search engines — is nothing new. But it has grown significantly since the Sept. 11, 2001 attacks, both in its depth and, indeed, its accessibility.

You could perhaps even count mobile chat applications that scramble messages in transit, delete them after only a few seconds, or both, as part of the dark web's outgrowth. One of the platforms we've seen identified as a problem in the Paris attacks is Sony's PlayStation console ... proof enough that we're not talking about sophisticated criminals using non-commercially available technology here.

To this point, authorities are beginning to ask why the manufacturers of these devices and platforms — Apple, Google and Sony, among other technology giants — can't 'bug' their products for surveillance purposes as a matter of design.

It's a question financial regulators have asked throughout 2015, too. Witness the still-ongoing Libor-rigging scandal in London, much of it built on chat messaging; Senator Elizabeth Warren's queries about Symphony's archiving capabilities; questions about blockchain's mutability; and all sorts of cyber considerations around intrusiveness, training and insider threats.

One could easily argue this is a defining question for technology development of all kinds in 2015, and certainly it is in the markets: Forget about the user interface, scalability, or functionality. What about a back door?

The collateral damage of recent events is proving that "just keep a record for us in case we need it later" is simply insufficient. But, just as in society at large, the question of how to do better is harder.

Catching Up

A double feature I published this week about the syndicated loans market touched on both sides of the problem. This is one space where technology and its wherewithal are still being shaped, and which therefore lays bare some of the psychology at work.

Loans are a trillion-dollar market with no genuine, unified electronic record of the trading activity occurring within it at any given time ... hard though that may be to believe. There's no back door, because there isn't even a house in this case. It's information security taken to an extreme: there is simply nothing to break into.

While many participants argue that automation and transparency should be brought to bear on different parts of the loans workflow to speed things up, many also admitted to me that they like this particular aspect — the market's relative opacity — just the way it is.

Some players argue that "holding up a mirror" to loans trading, as one source put it, isn't merely difficult because of the relative paucity of tech currently in place; it's also unpalatable from a business perspective because there are so few counterparties to work with, and so few ways out of a position if you're discovered and exposed. Therefore, hold your cards close to the chest.

That's a contributing reason why one such "mirror", an STP-enabled settlement mechanism called Markit Clear, is still being debated rather than fully operational.

Indicative Signals

Of course, this isn't a problem exclusive to loans. Far from it. Swap data repositories (SDRs) are facing similar issues. And really, the same list of crucial questions applies to every new trading venue and any centralized data hub or ledger today: Who is permissioned, and who gets to see what? Which participants are identified? What built-in oversight is given to regulators, and how clear (or not) are the securities laws governing trading? How secure is it?

Now, perhaps, we should add: how secure should it be?

These are all fair concerns to balance. Data — like a chat message — is an indicative signal of intention and, many times, of action. Acceding to having data (or personal communications) stored and available in one place is clearly a risk. But the considerations around that risk are also constantly evolving; there comes a point when old paradigms should take a back seat.

If we're going to have an age of surveillance, and we already have one to a large extent, let's make sure it's working for us as well as possible.

Whether in financial markets or as part of broader society (and obviously one is more important than the other), that means defining and confronting these core questions head on, rather than ignoring them.
