Regulations: Constraining or Stimulating?

Have rules proved too restrictive, or have they yielded positive benefits?


Do you consider the broad scope of new regulations too restrictive? Or any specific provisions or directives? If so, how have these restricted data operations?

Dilip Krishna, director, Deloitte: The scope of regulation is broad. In several cases, however, regulations overlap significantly in terms of data. As an example, several Dodd-Frank provisions have a heavy overlap with BCBS 239 requirements, including the Volcker Rule, stress testing/CCAR [Comprehensive Capital Analysis and Review] and Enhanced Prudential Supervision standards. This has offered both challenges and opportunities to banks.

Where banks have aligned their approaches and solutions with a broader risk and finance infrastructure strategy, significant synergies have been realized. However, where this is not possible—due to timing and other challenges—the overlap can present significant complications.

Alan Morley, regulatory compliance and surveillance practice lead, GFT USA: It really depends on your perspective. From the business side, these new regulations are very restrictive; they've put a stranglehold on banks' capital—dampening their ability to take on risks that are essential to increasing shareholder value. Understanding and managing the downside for taking these risks is an essential element of prudent capital planning. They also give firms a false sense of security that they have "things" under control because stress scenarios and results have become relatively stable over time. While we have a robust structure in place—better than what we had pre-crisis—I am not sure the new regulations will stop a future meltdown. The stress scenarios have become too predictable and lack flexibility.

We have the ability to collect a ton of data, and we've learned how to massage it using internally built scenarios that mimic the scenarios developed by the Federal Reserve. Firms are trying to avoid getting an objection to their capital plans by accumulating enough capital to stay out of the news. While all this is happening, the risk-takers are operating under tight constraints because of the hoarding of capital for any trading activity that is less than investment grade. So, for the business guys, is it restrictive? Yes. That's the short answer.

However, we have an interesting opportunity in the data operations department. These regulations are forcing data operations to do more in controlling their data. There will be added pressure to ensure accuracy and completeness of data, develop golden sources of information, reconcile and attest to the integrity of the data, and ensure there is a robust governance and control structure in place to supply the reporting required by the Fed. The focus on data integrity has put data operations on notice.
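To make the reconciliation and golden-source point concrete, here is a minimal Python sketch. The record fields, keys and tolerance are purely illustrative assumptions, not drawn from any specific regulatory report; it simply checks a reporting extract for completeness and accuracy against a golden source.

```python
from dataclasses import dataclass

# Hypothetical golden-source/report record; field names are illustrative only.
@dataclass
class PositionRecord:
    account_id: str
    instrument_id: str
    notional: float

def reconcile(golden: list[PositionRecord],
              report: list[PositionRecord],
              tolerance: float = 0.01) -> list[str]:
    """Return human-readable breaks between the golden source and a report extract."""
    golden_by_key = {(p.account_id, p.instrument_id): p for p in golden}
    report_by_key = {(p.account_id, p.instrument_id): p for p in report}
    breaks = []

    # Completeness: every golden-source position must appear in the report.
    for key in golden_by_key.keys() - report_by_key.keys():
        breaks.append(f"missing from report: {key}")

    # Accuracy: notionals must agree within the tolerance.
    for key in golden_by_key.keys() & report_by_key.keys():
        diff = abs(golden_by_key[key].notional - report_by_key[key].notional)
        if diff > tolerance:
            breaks.append(f"notional mismatch for {key}: off by {diff:,.2f}")

    return breaks
```

In practice the list of breaks would feed an attestation and sign-off workflow; the sketch only shows the comparison step.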

Jacob Gertel, senior project manager, legal and compliance data, SIX Financial Information: From a data vendor perspective, the scope of the existing and new regulatory requirements actually brings opportunities: from investor protection with respect to too-big-to-fail institutions to infrastructure directives and tax transparency laws and regulations, a data vendor supplies the data and value-added services that help customers be compliant wherever they need to be.

What we see is that all of our data is being used to comply with some kind of regulatory requirement, whether it is reference or market data.

Implementing regulatory data and value-added services is very challenging due to the current environment, where numerous regulations have to be implemented simultaneously. This is compounded by the fact that regulators are publishing regulatory requirements but leaving little time for implementation. From a vendor perspective, we have to make sure we start analyzing the draft requirements as soon as they are published—something that will allow us to plan the necessary resources for implementation in close cooperation with our customers. The new regulatory landscape is just a situation that the financial industry has to live with.

Alain Robert-Dautun, head of risk management, Sycomore Asset Management: Since the crisis of 2008, the financial services industry has experienced more regulatory pressure than ever before. Regulators from all corners of the globe have clearly stated their preference for stability and for reinforcing prudential measures across financial services. It is in the name of financial stability that they have established new mandatory standards and multiplied provisions to regulate all financial activity more closely.

However, there are negative impacts from this regulation. First, there seems to be a trade-off between stability and growth, with maybe too much importance given to stability. We now suffer the consequences. Another negative impact is that tighter financial regulation prevents investment banks from being liquidity providers in periods of stress. This would not help stability during the next crisis.

What differences do you see between global regions, when looking at the impact of regulation on data operations?

Krishna: One of the key issues banks are forced to contend with is the increased focus on regulating operations within the host country, such as the foreign bank-focused rules of the Enhanced Prudential Supervision standards in the US.

For large international banking organizations with integrated operations across multiple countries, it is a significant challenge to carve out their operations in individual countries for the purpose of reporting, for example.

In addition, there are some impactful differences in the supervisory approach regulators take, which can lead to different areas of emphasis that banks have to address in each jurisdiction.

Morley: We see countries now exercising a stronger-than-ever extraterritorial reach and rights over customer information, and more governments are blocking data from leaving the country. More countries are becoming like Singapore rather than Switzerland: data-restrictive, setting their own standards and rules.

Additionally, when you look at data in a global firm, you'll see different reference masters that can exist between regions. Ultimately, these reference masters need to be brought together to perform enterprise functions. It gets even more difficult when you consider that global masters consist of multiple domains of reference data.

A global asset master that links or consolidates the individual security masters that may exist across various regions is required. That asset master needs to contain the different identifiers that are specific to certain markets.

All of this becomes a challenge when you consider that the theme of regulation is being able to come up with a harmonized view globally, which requires all these things to be expressed in a lingua franca—a common tongue. These issues highlight the need to manage data operations on an enterprise scale, rather than in silos, as in the past.
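As a rough illustration of what consolidating regional security masters into a global asset master might look like, here is a hypothetical Python sketch. The identifier types, cross-reference table and record fields are assumptions made for the example, not any vendor's actual schema.

```python
from collections import defaultdict

# Hypothetical regional security-master rows; real masters carry many more attributes.
us_master = [{"cusip": "037833100", "name": "Apple Inc"}]
eu_master = [{"isin": "US0378331005", "name": "Apple Inc"}]

# Cross-reference mapping local identifiers to a firm-wide global asset ID (illustrative).
xref = {("cusip", "037833100"): "AAPL-GLOBAL",
        ("isin", "US0378331005"): "AAPL-GLOBAL"}

def consolidate(regional_masters, cross_reference):
    """Merge regional records into one global record per asset, keeping every local identifier."""
    global_master = defaultdict(lambda: {"identifiers": {}, "regions": set()})
    for region, rows in regional_masters.items():
        for row in rows:
            # Use whichever local identifier the cross-reference recognizes.
            asset_id = next(cross_reference[(id_type, row[id_type])]
                            for id_type in ("cusip", "isin", "sedol")
                            if id_type in row and (id_type, row[id_type]) in cross_reference)
            record = global_master[asset_id]
            record["identifiers"].update({k: v for k, v in row.items()
                                          if k in ("cusip", "isin", "sedol")})
            record["name"] = row["name"]
            record["regions"].add(region)
    return dict(global_master)

merged = consolidate({"US": us_master, "EU": eu_master}, xref)
print(merged["AAPL-GLOBAL"]["identifiers"])  # {'cusip': '037833100', 'isin': 'US0378331005'}
```

The design point is the cross-reference: once every local identifier resolves to a single global asset ID, enterprise functions can work off one record while regional systems keep their market-specific codes.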

Gertel: SIX Financial Information provides regulatory data to satisfy many regulatory requirements worldwide. A key characteristic of the regulatory landscape is that many requirements are cross-border in scope, like Fatca, FTT, AIFMD and MiFID. Global regulations like these allow SIX to provide a global offering to its customers.

Meanwhile, other regulatory requirements are far more localized in scope, for example local tax regimes like the stamp duty tax in the UK. Data offerings of this type require local expertise for us to be able to provide the necessary level of service. It is very important that the implementation process includes an ongoing dialogue between us, our customers and local regulators.

Finally, data operations must be aware of differences between regulatory bodies that could have an impact on their work, such as deadlines, support, transparency and communication channels. However, this is the challenging reality the industry has to live with.

Robert-Dautun: Regulatory rules are by nature thorough. Even if all regulations point in the same direction—toward transparency, better governance and higher quality—they have not yet converged on the same standards. This poses a conundrum for companies, as different regulations require different types of data, and all of it must be stored. Today, it is impossible to meet all the criteria without foundations capable of providing data checks; I believe this is the only way to deal with multiple regulations.

Terry Roche, head of financial technology practice, Tabb Group: Global trading organizations must structure their compliance efforts to meet all of the regulatory demands of all the markets they trade. Therefore, the more markets one trades, the more obligations one must satisfy. As many, if not most, organizations trade markets in multiple geographies, the compliance demand becomes more significant.

Are data management or data governance changes or trends being influenced more by regulation, or by other factors such as cost or efficiency?

Krishna: For the larger banks, regulations have had an overriding impact on data management and governance. A host of important provisions in Dodd-Frank depend on good data at their core (like CCAR and the Volcker Rule, among others).

Regulators forcefully express their view that good data is paramount for compliance with these regulations. While it is possible, and often necessary, to fix data in a tactical manner (this is especially true for historical data), sustainable change to data quality can only be achieved via proper data governance, including the introduction of data management tooling and changes to the underlying infrastructure.

Of course, introduction of these underlying changes leads to an improved cost profile over time as well.

Michael Engelman, data science practice lead, GFT USA: Before the financial crisis, we saw a lot more firms driving decisions based on overall strategy and enterprise-wide goals. Many of these goals centered, of course, around cost and efficiency. Unfortunately, that meant high-quality, time-intensive data governance programs were often put on the back burner in favor of the faster, cheaper solution.

Now, regulations have mandated implementation and forced businesses to make data governance a priority. However, this doesn't mean cost and efficiency cease to be important—if anything, the extra cost of these data programs means companies need to find ways to use that data to gain new value in order to remain profitable. The most successful companies are those that are utilizing the stricter requirements around data to find ways to better serve their customers.

Gertel: All the factors above influence data management and governance. Regulatory requirements are one of the main reasons why firms invest in data management or data governance. Firms have to make sure their data has the necessary coverage and quality for them to be compliant with the requirements, and this increases the cost of data management.

However, regulatory requirements also act as a trigger for firms to improve their data management and governance: improvements in data integrity and quality, monitoring processes and reporting create opportunities to deliver better customer service.

Regulatory compliance can therefore make a positive contribution to business strategy goals by forcing firms to find more efficient ways to approach data management and data governance.

Robert-Dautun: Both Basel III and Solvency II regimes have included governance and supervision processes guidance within pillar 2 of their respective regulations. The Basel Committee on Banking Supervision also included references to data aggregation as part of its guidance on corporate governance.

This clearly shows the way. We now recognize the benefits of data governance, as it improves our capabilities as well as our data quality. However, internal solutions are not the only way to comply with a specific regulation. We always have to balance the cost of data maintenance against outsourcing. Sometimes it is not worth building internal data management.

Roche: Regulations have been a tremendous driver of data governance and, by extension, cost. Compliance with regulation is being achieved through investments in collecting all the data needed from the many legacy technology silos within trading firms; using technologies such as natural language processing to review unstructured data found in email, chat conversations and voice logs; acquiring and/or developing compliance algorithms to interrogate the data, along with machine learning to improve those algorithms; and the ability to fully audit the data.
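As a simple illustration of the surveillance idea Roche describes, the Python sketch below flags chat messages against watch phrases. It is deliberately a toy keyword screen—production surveillance relies on natural language processing and machine learning models—and the phrases and field names are invented for the example.

```python
import re

# Illustrative watch phrases only; real systems use NLP/ML models, not a fixed keyword list.
SUSPICIOUS_PATTERNS = [
    r"\bkeep (this|it) between us\b",
    r"\bbefore the announcement\b",
    r"\bguaranteed? (a )?profit\b",
]

def flag_messages(messages):
    """Return (message_id, matched_pattern) pairs for chat messages that hit a watch phrase."""
    hits = []
    for msg in messages:
        for pattern in SUSPICIOUS_PATTERNS:
            if re.search(pattern, msg["text"], flags=re.IGNORECASE):
                hits.append((msg["id"], pattern))
    return hits

sample = [{"id": "m1", "text": "Let's keep this between us until the close."}]
print(flag_messages(sample))  # [('m1', '\\bkeep (this|it) between us\\b')]
```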

How much activity and energy is being devoted to changing data operations to comply with regulations? Have efforts been adequate to ensure readiness to comply?

Krishna: Banks have clearly spent a lot of effort and investment in addressing any challenges and inefficiencies with data operations. These efforts span not only central shared services such as finance and risk, but extend to other areas including the front office and operations.

This kind of extensive transformation can take a long time and affect areas including organizational structure and technology, so banks have planned multi-year projects. In the meantime, however, banks have taken tactical steps to achieve compliance by the required regulatory dates. Over time, banks expect to improve the efficiency of these tactical operations by replacing them with strategic solutions.

Engelman: Activity around data operations is being driven by approaching regulatory deadlines; companies are pouring an unprecedented amount of resources and capital into compliance. This is to be expected. The difficult thing is being able to say whether efforts have been adequate—nobody really knows, yet.

Firms are collecting reports, they're ensuring compliance internally, but it's hard to know what the regulators will say. We had an institutional client who was examined by a regulatory agency that found the quality of their data 'sufficient'—that's as much guidance as anyone has gotten.

At this stage, until more reports are audited and full examinations performed, it's impossible to say if firms' efforts have been sufficient. However, we've seen priorities and behaviors around data governance mature over the last several years; the industry is certainly on the right path toward higher-quality data operations.

Gertel: Both from a client and a vendor perspective, we see a significant amount of effort being put into regulatory compliance. At SIX, a large pool of employees—ranging from compliance specialists and product managers to business analysts and IT specialists—is dedicated to meeting regulatory requirements. This enables SIX to offer its customers high-quality data and value-added solutions. Furthermore, the entire regulatory team constantly monitors developments in the international and national regulatory landscapes, not only for new requirements but also for amendments to existing regulations.

To ensure they are ready to comply with regulations, firms should have adequate resources, procedures and expertise in place to deal with the requirements, together with IT systems designed in a way that facilitates the efficient implementation of regulatory amendments.

Robert-Dautun: We started to change data management in 2008. This was not driven by regulation but by the need to have a data warehouse that feeds all our calculation engines. This helps us today to comply with regulations, as the work that has been done is at the heart of data aggregation and reporting. We also upgraded our ETLs and calculation engines to limit manual inputs and operations.

What has been the most notable data operations or governance advance devised as a result of a regulatory compliance need? What benefits may that effort also produce?

Krishna: There are at least two different areas where significant change is being achieved—some might even say it's revolutionary. One trend is in data governance, including the appointment of chief data officers, who are increasingly seen as broadly responsible for data quality efforts. To be properly implemented, data governance requires identification and appointment of executives across the organization whose focus is largely—if not wholly—on data. A number of banks are also making another major change, to their data integration efforts, by implementing data warehouses, enterprise-wide master data and metadata solutions.

Morley: This is the good news—the benefits have been many. From a regulatory perspective, the concept of an enterprise-wide activity risk assessment is hugely valuable to firms. Companies are now able to evaluate their risk by customer, by product, and by line of business.

We're seeing a lot of firms turning to user experience experts to create new dashboards and tools to read and manage data, perform data validation, and unify the language of data across the enterprise. For example, GFT's SuperNova solution creates user-friendly risk visualizations, which tap into enterprise-wide data to create intuitive, real-time representations of company risk.

In addition, as I mentioned before, firms are learning more about their customers, and becoming able to create new opportunities based on the knowledge gained. Moreover, as data quality improves, they are more certain of the reliability and validity of their reports, and can make more informed decisions. In these ways, companies are finding new competitive advantages arising out of their data governance practices.

Gertel: Investment that results from regulatory compliance requirements can be used by firms to deepen their knowledge of their customers. For example, the KYC information gathered under Fatca and the OECD CRS (Common Reporting Standard), together with investor protection requirements, generates additional data about clients. This allows firms to provide their customers with a more tailored offering, such as investment products that match their risk tolerance. Increasingly, regulatory requirements also force firms to improve their IT infrastructure. These measures improve the data quality as well as the monitoring and reporting mechanisms within firms, which will help them not only to be compliant, but also to attain their business objectives.

Robert-Dautun: With increasing compliance burdens from the ever-growing number of regulations, it became essential to develop a data management strategy that addresses data quality, data governance and data storage without limiting the flexibility needed to meet all current and future regulations. The audit we ran in 2008 led us to build a data warehouse that can run data through flexible calculation engines in various domains. This single data hub now feeds data across the company, from portfolio managers to clients and compliance. This has hugely improved data quality and analytics.

Roche: The leading firms have recognized the collection of all the data associated with trading will not only enable a compliance program, but enable those firms to find insight from the data. Risk management, predictive modeling and other benefits can be obtained from the data, plus there's an opportunity for the most advanced firms to find ways to monetize the data itself.

Another example here is some of the initiatives taking place at the enterprise level to streamline the collection of KYC (Know Your Customer) data, a function that has tended to be duplicative and challenging for business units selling different products. Doing this once will have a huge impact on the time spent by various teams collecting the same data.

This house-cleaning activity at the firm level will also be a first step to connecting to the various AML/KYC industry initiatives, which will add even more efficiencies to a usually difficult process.
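The "collect KYC once, reuse everywhere" idea can be sketched as a shared store keyed by a firm-wide client identifier. The Python below is a hypothetical illustration: the field names, refresh window and the collect_fn helper are assumptions, not any industry utility's actual interface.

```python
from datetime import date

# Hypothetical shared KYC store keyed by a firm-wide client identifier (e.g. an LEI);
# field names and the refresh window are illustrative.
kyc_store = {}

def get_or_collect_kyc(client_id, collect_fn, max_age_days=365):
    """Return a fresh existing KYC record, or collect it once and share it firm-wide."""
    record = kyc_store.get(client_id)
    if record and (date.today() - record["collected_on"]).days <= max_age_days:
        return record  # reuse, instead of each business unit re-collecting the same documents
    record = {"client_id": client_id,
              "documents": collect_fn(client_id),  # collect_fn stands in for the real onboarding process
              "collected_on": date.today()}
    kyc_store[client_id] = record
    return record
```

The same single record, once cleaned at the firm level, is also the natural connection point to the industry AML/KYC utilities mentioned above.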
