The Devil in the Data of Financial Reform
November 12, 2010, by Allan D. Grody and Dr. Robert M. Mark
New requirements in the Dodd-Frank Wall Street Reform and Consumer Protection Act have significant consequences for controlling systemic risk. The legislation empowers the Office of Financial Research (OFR) to conduct systemic risk analysis for the new Financial Stability Oversight Council (FSOC). The research office is given primary responsibility to set data standards and then to collect the resulting data on transactions and valuations to carry out this mission. Unfortunately, the office cannot fully deliver on its promise of monitoring and responding to systemic threats.
The OFR is mandated to build a reference data facility and a data repository, at an estimated cost of $1 billion, to be populated by transaction and position data that systemically important financial institutions will be required to report.
A large portion of this information is already sent at the required granular level to financial market utilities (such as clearing and depository agencies), and more will find its way into central trade warehouses, swaps execution facilities and central counterparties for OTC derivatives. The legislation contains no provision to source this data directly from existing financial market utilities. However, under the Act’s Mitigation of Report Burden obligation, the FSOC, acting through the OFR, shall rely on information available from other agencies before requiring the submission of data. This should result in the Data Center obtaining, from other Council members, aggregated position and transaction data already sent to depositories and clearing houses. This approach would be more efficient than obtaining the same information from financial institutions, but it will require immediate coordination with yet-to-be-enabled agencies which, at their own independent determination, would have to request this data.
A Data Deluge
The OFR also faces an enormous reconciliation task to correct errors in the data and will need to check with the originating sources periodically to ensure conformity and accuracy. It will require data to be sent in a standardized format (with yet-to-be-determined identifiers) for instruments, counterparties and other attributes needed for valuations and risk assessments. One glaring omission is that non-financial business entities, along with their internal hierarchies of business ownership that are parties to, or reference entities for, financial transactions, were not included in the final legislation. The OFR will thus have incomplete data sets unless the legislation is amended.
Still, the OFR will be tasked with trying to fix data from multiple firms, sent at approximately the same granular position-and-transaction level at which the firms hold it in their own databases. Such reconciliation is what each financial firm’s operating plants are geared to do now, at enormous individual expense. The data deluge from these new requirements could be immense and costly, and may overwhelm this new government agency.
Pulling harmonized position and market data together across one global organization to get a single, timely, firm-wide view of risk around a standard risk measure such as VaR (Value at Risk) has been a challenge. Firms typically aggregate each business segment’s VaR numbers across many risk types (e.g., market risk, credit risk and operational risk) for a firm-wide risk view. Regulators accept many rules of thumb for this aggregation across risk types in the face of limited reasonable alternatives to the data aggregation problem. Still, VaR’s validity is often challenged because of the lack of data and the associated model risk. For example, members of the securitized mortgage group of Goldman Sachs testified at a recent congressional hearing that they had disregarded their internal standard VaR benchmark, believing it was not an accurate indicator of the true risk of running their business.
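To make the "rules of thumb" concrete, the sketch below contrasts two common aggregation conventions: simple summation (which implicitly assumes perfect correlation across risk types, the most conservative choice) and variance-covariance aggregation under an assumed correlation matrix. All VaR figures and correlations here are hypothetical illustrations, not data from any firm.

```python
import math

# Hypothetical stand-alone VaR figures (in $ millions) for one firm's
# three risk types -- illustrative numbers only.
var = {"market": 120.0, "credit": 90.0, "operational": 40.0}

# Rule of thumb 1: simple summation. Equivalent to assuming the risk
# types are perfectly correlated, so it gives the largest aggregate.
var_sum = sum(var.values())

# Rule of thumb 2: variance-covariance aggregation under an assumed
# correlation matrix (row/column order: market, credit, operational).
rho = [
    [1.0, 0.5, 0.2],
    [0.5, 1.0, 0.2],
    [0.2, 0.2, 1.0],
]
v = list(var.values())
var_corr = math.sqrt(
    sum(v[i] * v[j] * rho[i][j] for i in range(3) for j in range(3))
)

print(f"simple sum:           {var_sum:.1f}")   # 250.0
print(f"correlation-adjusted: {var_corr:.1f}")  # 195.6
```

The gap between the two numbers is exactly the diversification benefit the correlation assumptions create; since those assumptions are rarely backed by clean cross-silo data, the choice of rule of thumb materially changes the firm-wide risk picture.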
Systemic risk cannot be dealt with from regulatory silos. A single regulator (no matter how much of the market it oversees) cannot compel other sovereign jurisdictions to comply. A system-wide financial crisis may be caused by the contemporaneous failure of a substantial number of financial institutions, financial markets, or both. Systemic threats cannot be comprehensively detected without a global view.
The G-20 has given the global responsibility for systemic risk analysis to its own Financial Stability Board. The OFR portion of the legislation makes no mention of building upon this mandate, although the broader legislation does suggest cooperation with it, with other governments’ efforts, or with other like-minded regulatory cooperatives through international policy coordination by the President. It would not be difficult to incorporate a global data standard within the President’s international policy coordination mandate. This would go far in providing the aggregation mechanism across firms for global systemic risk analysis.
Systemic risk is a global phenomenon and it needs to be measured across multiple global regulators. The Basel capital standard administered at a global level (no matter its flaws) is a governance model we need to emulate for global data standards. It’s proven a reasonable model for transcending regulatory silos. We also need to ensure that the data-collection effort does not compound and reinforce the business silos that dominate our largest financial institutions. This silo governance structure has often proved inadequate to prevent systemic risk and will most likely thwart the effort to integrate position data needed for optimal risk analysis, whether across a single financial institution, or across multiple firms, or across regulatory silos.
In Search of Standards
We should encourage a search for a less costly and more effective way to obtain systemic-risk data that will serve both the industry’s and the regulators’ long-term interests. We should explore, for example, whether it is better to have governments compel financial institutions first to standardize their data across the many silos of their own businesses, rather than to ask them first to send data to one of what will become many government data centers. This would drive the industry toward agreeing on universal standards in the financial supply chain, just as most other segments of the global economy have already done in applying universal product, business and location standards to the global trade supply chain.
Thereafter, regulators could require industry members to store data in their own facilities (perhaps overseen by their external auditors), using these standardized data sets for the accumulated position and transaction information needed for observing systemic risk exposure. There will be real benefits both to financial institutions, in improving their own efficiencies and risk practices, and to governments, both of which will be able to reduce their chances of being blinded by some newly emerging risk.
In any event, it would be practical to coordinate data that financial market utilities already have so that we don’t duplicate data that exist in aggregated form elsewhere. Regulators could then use Google-like search engines to access the data automatically, making use of data tags, common identifiers and specialized computer software to aggregate these data across firms. Presentation-type software could also be used to visualize the data and ease many of the challenges in reconciliation.
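The payoff of common identifiers is that cross-firm aggregation collapses into a simple group-by; without them, every record first requires costly instrument- and entity-matching. A minimal sketch, in which the firm names, identifier strings and notional amounts are all hypothetical:

```python
from collections import defaultdict

# Hypothetical position reports as three firms might submit them once
# a shared instrument identifier ("instr_id") is in place. Every name
# and amount below is illustrative.
reports = [
    {"firm": "Bank A", "instr_id": "XYZ-BOND-2015", "notional": 500},
    {"firm": "Bank B", "instr_id": "XYZ-BOND-2015", "notional": -200},
    {"firm": "Bank C", "instr_id": "ABC-CDS-2013", "notional": 350},
    {"firm": "Bank A", "instr_id": "ABC-CDS-2013", "notional": 150},
]

# With a common identifier, system-wide exposure per instrument is a
# one-pass group-by over the submitted records.
exposure = defaultdict(int)
for r in reports:
    exposure[r["instr_id"]] += r["notional"]

for instr_id, total in sorted(exposure.items()):
    print(instr_id, total)
```

The same group-by could key on a standardized counterparty identifier instead, yielding system-wide exposure to a single entity across every reporting firm, which is precisely the view a systemic-risk monitor needs.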
The data can be made available in redacted form so academics and practitioners across the globe can use it to develop models for analyzing systemic risk. In other words, this information can give universities and institutes worldwide, as well as financial institutions, think tanks and the financial research units of various government entities, the tools they need to get a better handle on the financial risks facing the world.
It won’t be easy, but this approach is doable and has the benefit of being acceptable to industry members.