Much of the discussion around CCAR has focused on the scenario modeling required by regulators. While I agree scenarios are a major part of the tests, they represent only half of the picture. The other half, of course, is data.
Scenario modeling is a complex process and requires banks to assemble teams of consultants, data scientists, PhDs, and economists to debate and develop various aspects of the regulatory requirements. While this is not easy, most organizations are able to quickly build or buy the resources needed to get it done.
Quality Data + Quality Models = Accurate Scenario modeling
So how does one source quality data from within the organization? The question is often asked, and it rarely has an easy answer. The ability of a bank holding company (BHC) to source high-quality data depends on sound Data Governance practices. Like most other businesses, most CCAR banks have spent the past two decades defining and improving Critical Business Processes (CBPs). These efforts have yielded significant efficiencies in day-to-day operations and have enabled advanced, complex products and services for clients and customers.
As banks are brought into the CCAR fold, and regulators become more sophisticated about Data Governance (DG), most banks are scrambling to establish robust DG programs. But unlike models, the effort required to implement a DG program is many times larger, more complex, and demands massive organizational collaboration.
For the most part, defining Critical Data Elements (CDEs) and creating governance mechanisms around that data set has been an afterthought, if not neglected entirely by some.
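In practice, governance around CDEs usually begins with automated data-quality rules, such as completeness and validity checks. Here is a minimal sketch in Python; the field names, rating scale, and records are purely illustrative assumptions, not drawn from any specific bank's CDE inventory:

```python
# Illustrative rule-based quality checks for hypothetical CDEs.
# Field names ("exposure", "rating") and the rating scale are assumptions.

records = [
    {"loan_id": "L001", "exposure": 1_200_000.0, "rating": "BB"},
    {"loan_id": "L002", "exposure": None,        "rating": "A"},   # missing exposure
    {"loan_id": "L003", "exposure": 350_000.0,   "rating": "ZZ"},  # invalid rating
]

VALID_RATINGS = {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}

def check_cdes(rows):
    """Count completeness and validity failures for each monitored CDE."""
    issues = {"exposure_missing": 0, "rating_invalid": 0}
    for row in rows:
        if row["exposure"] is None:          # completeness check
            issues["exposure_missing"] += 1
        if row["rating"] not in VALID_RATINGS:  # validity check
            issues["rating_invalid"] += 1
    return issues

print(check_cdes(records))  # {'exposure_missing': 1, 'rating_invalid': 1}
```

Checks like these are only the mechanical layer; the harder governance work is agreeing, across the organization, on which elements are critical and who owns each rule.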