Data Standards Competition Heats Up

Last week saw a great deal of activity in the data standards modeling field, or at least a lot of publicity from competing model providers. Open Data Model, a newer organization than the well-established EDM Council, has set out classifications based on asset-class distinctions as the basis for its model. The model is built on the ISO 10962 standard and organized on the same principles as Wikipedia, according to Rodger Nixon, chief executive and founder of Open Data Model.
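To make the asset-class classification idea concrete, here is a minimal Python sketch along ISO 10962 (CFI code) lines. The category map is partial and purely illustrative; it is not drawn from Open Data Model's actual model, and the full standard defines six-character codes with a category, a group and four attribute characters.

```python
# A minimal sketch of asset-class classification along ISO 10962 (CFI) lines.
# The category map below is partial and illustrative only.

CFI_CATEGORIES = {
    "E": "Equities",
    "D": "Debt instruments",
    "C": "Collective investment vehicles",
    "O": "Listed options",
    "F": "Futures",
}

def classify(cfi_code: str) -> str:
    """Return a coarse asset-class label from a CFI code's first character."""
    if len(cfi_code) != 6:
        raise ValueError("CFI codes are six characters long")
    return CFI_CATEGORIES.get(cfi_code[0].upper(), "Unknown/other")

if __name__ == "__main__":
    print(classify("ESVUFR"))  # common shares -> "Equities"
    print(classify("OCASPS"))  # call option   -> "Listed options"
```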
Open Data Model's membership grew steadily in the latter part of 2011, according to Nixon, who contends that his organization's offering and methods are more effective than the EDM Council's. "There are standardizations of the messages and transactions, but our field isn't transactions, it's reference data," he says. "It's not a relational model."
Multiple classifications provide a basis for better dimensional models for data analytics, says Nixon. "You can see the instruments underlying the derivatives," he says. "You're trying to see the underlying instruments in derivatives and 'explode' them out [for analysis]. What's your exposure? What are the simplest components, the ones that have an effect?"
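A rough sketch of what "exploding" derivatives into their underlying exposures might look like is shown below; the positions, weights and asset-class labels are hypothetical, and nothing here reflects Open Data Model's specification.

```python
# A minimal sketch of "exploding" derivative positions into underlying
# exposures, grouped by asset class. All data is hypothetical.

from collections import defaultdict

# Each position references a derivative and its underlying components,
# with an illustrative delta-style weight applied to the notional.
positions = [
    {"instrument": "SPX call option", "notional": 1_000_000,
     "underlyings": [{"name": "S&P 500 index", "asset_class": "Equity", "weight": 0.55}]},
    {"instrument": "EUR/USD forward", "notional": 2_000_000,
     "underlyings": [{"name": "EUR/USD spot", "asset_class": "FX", "weight": 1.0}]},
]

def explode_exposures(positions):
    """Aggregate exposure to each underlying asset class across all positions."""
    exposure = defaultdict(float)
    for pos in positions:
        for u in pos["underlyings"]:
            exposure[u["asset_class"]] += pos["notional"] * u["weight"]
    return dict(exposure)

print(explode_exposures(positions))
# {'Equity': 550000.0, 'FX': 2000000.0}
```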
Meanwhile, in a webcast hosted by the EDM Council, managing director Mike Atkin emphasized the uses of its Financial Industry Business Ontology (FIBO), which, as he describes it, is not strictly a data model. "It's a formal and factual definition of reality, done by subject matter experts and validated by the industry," says Atkin. FIBO will have three versions: one for business entities, one for instruments and one for loans. Beyond the ontology itself, however, Atkin says semantic processing will be the future of data management, because it can bridge the gap between simple data dictionaries and ontologies.
The EDM Council worked with the standards body Object Management Group on FIBO, and has done a proof of concept for semantic processing using derivatives and FIBO, says Atkin. The effort is aimed at supporting compliance with regulatory demands: taking in FpML, pulling data from relational databases, analyzing and classifying that data, and analyzing links to counterparties, he explains. The council's semantic processing plans show promise for pulling together different data standards; the specifics include an implementation in XML format and, possibly, metadata annotations for derivatives, Atkin notes.
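As a rough illustration of the kind of semantic processing Atkin describes, the sketch below expresses instrument and counterparty facts as triples and queries them with SPARQL, using the rdflib library. The namespace and class names are hypothetical stand-ins rather than actual FIBO URIs, and no FpML parsing or relational-database extraction is shown.

```python
# A minimal sketch of ontology-style classification and counterparty linking.
# The "fibo-like" namespace and class names are hypothetical, not real FIBO URIs.

from rdflib import Graph, Namespace, RDF, Literal

EX = Namespace("http://example.org/fibo-like/")  # hypothetical namespace

g = Graph()
g.add((EX.swap123, RDF.type, EX.InterestRateSwap))    # classify the instrument
g.add((EX.swap123, EX.hasNotional, Literal(10_000_000)))
g.add((EX.swap123, EX.hasCounterparty, EX.bankA))     # link to a counterparty
g.add((EX.bankA, RDF.type, EX.LegalEntity))

# Find every instrument's counterparty links.
results = g.query("""
    PREFIX ex: <http://example.org/fibo-like/>
    SELECT ?instrument ?counterparty WHERE {
        ?instrument ex:hasCounterparty ?counterparty .
    }
""")
for instrument, counterparty in results:
    print(instrument, "->", counterparty)
```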
The question raised by the positioning of Open Data Model and the EDM Council is which approach is the right way to go for data standards modeling. Is it organization and classification by asset class? Or is it linking the proverbial alphabet soup of acronym-named formats together? I'd like to hear your thoughts. Please post replies here, or respond via Inside Reference Data's LinkedIn discussion group, where this column will also appear.