Ready to Comply?

Strapline: Golden Copy

Michael Shashoua, Waters

A common thread, whether financial firms are ready to meet the challenges before them, runs through current issues including Solvency II, risk mitigation through data management, and the handling of big data.

A common thread emerges throughout several issues covered in this month's Inside Reference Data, namely the Solvency II rules, using data management to mitigate risk, and the concept of big data: whether securities industry participants are ready, or will be ready, either to comply with regulation or to implement the initiatives now being discussed.

In Nicholas Hamilton’s feature on Solvency II, HSBC Securities Services’ Chris Johnson observes that firms still need to figure out “joined-up data consistency,” meaning they cannot cling to competitive secrecy, at least when it comes to providing information in a consistent format. Firms could each try to solve the regulatory reporting challenges of Solvency II on their own, without collaboration, but they will inevitably produce different and incompatible answers. In the same story, SAS UK consultant Simon Kirby notes that many of these firms are operating in silos, which can also produce incompatible data sets.

Our report on an October 27 webcast on managing risk included a poll asking attendees whether their systems are adequate to meet regulatory requirements. Twenty-four percent said they will not be able to meet requirements with their current systems. While 64% said they have systems ready and adopted, that figure really ought to be much higher: perfection may not be possible, but this is the kind of measure that should exceed 95%. If nothing else, this state of affairs may help win support for necessary data management projects, as Nick Helton, global reference data manager at Northern Trust, observed during the webcast.

In this month’s ‘Interview With’, Daryan Dehghanpisheh of Intel details ways the chip-maker is trying to help financial industry firms address big data issues, particularly those caused by new forms of relevant information, such as Twitter feeds and Google searches, if one counts those within the range of information defined as “big data.” Here, it isn’t a matter of firms being unaware or unprepared; rather, Dehghanpisheh says, firms are trying to build systems with the scalability to keep up.

Inevitably, when the industry faces new regulatory mandates or operational developments such as these, polls and surveys at industry conferences, along with remarks on panels and in trade media like ours, reveal that a sizable portion of firms and practitioners believe they are unprepared, or unaware of how to prepare, to meet the new needs. Does this phenomenon depend on what the task or requirement is, or is it inherent to the industry no matter what the initiative?

Finally, don’t forget to visit Inside Reference Data’s LinkedIn discussion group. We will periodically ask for your reactions to these opinions there. If you have an interesting or compelling viewpoint—especially if you disagree—you may have it distributed more widely in future columns and reporting.
