Data Standards Competition Heats Up

Last week saw a great deal of activity in the data standards modeling field, or at least a lot of publicity from competing model providers. Open Data Model, a newer organization than the well-established EDM Council, has made classification by asset class the basis of its model. Open Data's model is built on the ISO 10962 standard and on the same organizing principles as Wikipedia, according to Rodger Nixon, chief executive and founder of Open Data Model.
Open Data Model's membership grew steadily in the latter part of 2011, according to Nixon, who contends that his organization's offering and methods are more effective than those of the EDM Council. "There are standardizations of the messages and transactions, but our field isn't transactions, it's reference data," he says. "It's not a relational model."
Multiple classifications provide a basis for better dimensional models for data analytics, states Nixon. "You can see the instruments underlying the derivatives," he says. "You're trying to see the underlying instruments in derivatives and 'explode' them out [for analysis]. What's your exposure? What are the simplest components, the ones that have an effect?"
Meanwhile, in a webcast by the EDM Council, managing director Mike Atkin emphasized the uses of its Financial Industry Business Ontology (FIBO), which, as he describes it, is not strictly a data model. "It's a formal and factual definition of reality, done by subject matter experts and validated by the industry," says Atkin. FIBO will have three versions: for business entities, for instruments and for loans. Beyond ontology, however, says Atkin, semantic processing will be the future of data management, because it can bridge the gap between simple data dictionaries and ontologies.
The EDM Council worked with the standards body Object Management Group on FIBO, and has done a proof of concept for semantic processing using derivatives and FIBO, says Atkin. The effort is aimed at meeting regulatory demands by taking in FpML, pulling data from relational databases, analyzing and classifying that data, and analyzing links to counterparties, he explains. The council's semantic processing plans show promise for pulling together different data standards; specifics include implementation in XML format and possibly metadata annotations for derivatives, Atkin notes.
The question raised by the positioning of Open Data Model and the EDM Council is which approach is the right way forward for data standards modeling. Is it emphasizing organization or classification by asset class? Or is it linking together the proverbial alphabet soup of acronym-named formats? I'd like to hear your thoughts. Please post replies here, or respond via Inside Reference Data's LinkedIn discussion group, where this column will also appear.