Supply Chain Fundamentals

Sometimes, in analyzing and seeking to pull insights out of what's being said about various data management issues, I look for innovative developments and how they are being handled, such as cloud computing resources (see ‘Cloud Choices’) or big data (see ‘Quest For Data Quality’).
Sometimes, however, one has to look at the more prosaic or basic parts of reference data management to shed light on their importance in improving the functioning of data operations. The data supply chain used by firms is exactly such a topic.
A whole host of data issues come back to the data supply chain. Any consideration of whether federated or consolidated data models serve a firm better is affected by how data is being sourced.
Where it used to be possible to obtain golden copy data from a single, reliable source, now golden copy is typically an amalgamation of multiple sources. So, again, the data supply chain has to be considered.
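To make that concrete, here is a minimal sketch of how a golden copy record might be assembled from several feeds using a per-field source precedence rule. The vendor names, fields, and precedence order are hypothetical illustrations, not any particular firm's model.

```python
# Minimal sketch: composing a golden copy record from multiple feeds.
# Vendor names, fields, and precedence order are hypothetical.

SOURCE_PRECEDENCE = ["vendor_a", "vendor_b", "internal"]  # most trusted first

def build_golden_copy(records: dict[str, dict]) -> dict:
    """Merge per-source records into one golden copy, field by field.

    For each field, the value from the most trusted source that
    actually supplies it wins.
    """
    all_fields = {field for rec in records.values() for field in rec}
    golden = {}
    for field in sorted(all_fields):
        for source in SOURCE_PRECEDENCE:
            value = records.get(source, {}).get(field)
            if value is not None:
                golden[field] = value
                break
    return golden

# Two feeds disagree on the name; vendor_a wins by precedence.
feeds = {
    "vendor_a": {"isin": "US0378331005", "name": "Apple Inc"},
    "vendor_b": {"isin": "US0378331005", "name": "Apple Inc.", "currency": "USD"},
}
print(build_golden_copy(feeds))
# {'currency': 'USD', 'isin': 'US0378331005', 'name': 'Apple Inc'}
```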
The term "data supply chain" itself is often thrown around casually, without clear definition, so it may mean one thing to a professional using it and another to the colleague who hears it. "Data supply chain" can evoke thoughts of several data suppliers being used, but that isn't really it. As John Bottega, the experienced chief data officer who is now a senior advisor and consultant at the EDM Council, says, the steps in the data supply chain really are acquisition, process cleansing, maintenance, distribution and consumption.
Thinking about the data supply chain this way, the processing steps that data goes through affect how data can be handled in federated or consolidated fashion, as well as how multiple sources of data can even be tied together and distributed accurately, most likely internally within a firm.
If data is being processed differently, or coming through inconsistent supply chains, that can produce discrepancies right from the start. Putting flawed data into cloud or big data resources can drag down the aspirational promise of those solutions for data operations.
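One way such discrepancies can be caught before flawed data reaches a cloud or big data platform is a basic reconciliation check between two chains' views of the same records. The following is a hypothetical sketch; the identifiers, fields, and comparison logic are illustrative only.

```python
# Minimal sketch: flag discrepancies between two supply chains' views
# of the same instruments before loading downstream. Fields hypothetical.

def find_discrepancies(chain_a: dict[str, dict], chain_b: dict[str, dict]) -> list[str]:
    """Compare records keyed by identifier; report mismatched fields."""
    issues = []
    for key in chain_a.keys() & chain_b.keys():
        for field in chain_a[key].keys() & chain_b[key].keys():
            if chain_a[key][field] != chain_b[key][field]:
                issues.append(
                    f"{key}.{field}: {chain_a[key][field]!r} != {chain_b[key][field]!r}"
                )
    return issues

a = {"US0378331005": {"name": "Apple Inc", "currency": "USD"}}
b = {"US0378331005": {"name": "Apple Inc.", "currency": "USD"}}
print(find_discrepancies(a, b))
# ["US0378331005.name: 'Apple Inc' != 'Apple Inc.'"]
```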