The financial crisis of 2008 demonstrated clearly that a lack of pertinent information can lead to severely adverse outcomes. Firms operating in financial services, their customers, and particularly regulators quickly came to understand that information was key to risk reduction, and we are now witnessing ongoing efforts by regulators to ensure that financial services firms generate transparent, well-managed information. On the face of it the demands of regulators look burdensome. In reality they offer firms an opportunity to improve strategic and operational activities through risk-reducing, relevant, well-managed information.
As is often the case, managers face two conflicting demands. On the one hand they need to employ ever more powerful analytical techniques to remain competitive, while on the other hand the models they use must be transparent and relatively easy to explain. Both the Basel accords and Fed/OCC guidelines require rigorous documentation across the model lifecycle, and particularly of ongoing model performance in production. The Basel requirements are enforced by regional regulators, who consistently demand more discipline around model management. Similar regulation for insurers comes in the form of Solvency II, and stress testing is demanded in all regions.
More generally, banking regulators are focused on the soundness of decision making, capital adequacy, and the prevention of unlawful or unfair treatment of customers. They are scrutinizing how firms use analytics to manage risk, measure capital reserve requirements, make customer decisions and ensure consistency across operations.
Because the use of analytical models, and particularly predictive models, has become ubiquitous in larger firms, regulators require firms to implement processes that address several key issues. These include regular credit risk policy reviews, creation of suitable samples for model creation and testing, segmentation transparency, use of appropriate model types, validation of model effectiveness, performance tracking, monitoring of overrides, decision strategy transparency and, last but certainly not least, thorough documentation.
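To make "performance tracking" concrete: one widely used check is the Population Stability Index (PSI), which compares the distribution of scores a model produces today against the distribution it was built on. The sketch below is illustrative only; the function name, bin count and the conventional 0.25 alert threshold are assumptions, not a regulatory prescription.

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between development-sample scores
    (expected) and current production scores (actual). Values above
    roughly 0.25 are commonly read as a shift warranting model review."""
    # Bin edges taken from the development distribution's deciles
    edges = np.percentile(expected, np.linspace(0, 100, bins + 1))
    # Clamp production scores into range so nothing falls outside the bins
    actual = np.clip(actual, edges[0], edges[-1])
    e_pct = np.histogram(expected, edges)[0] / len(expected)
    a_pct = np.histogram(actual, edges)[0] / len(actual)
    # Floor the proportions to avoid log(0) on empty bins
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))
```

Run monthly against each production model, a metric like this gives the documented, repeatable evidence of ongoing performance that the guidelines call for.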
Clearly this is not business as usual. Where advanced analytical techniques are used, there is now a substantial obligation to manage the whole process of originating and using the resulting models. It is no longer enough to create a model, deploy it into production and leave it running with the occasional check. A set of capabilities and processes is required to ensure that every aspect of model creation, deployment and performance is well understood, managed and documented. This implies additional technology infrastructure and methods, since in larger firms the number of models in use may run into the thousands. It represents a significant shift to much greater sophistication, and without it managers will find themselves floundering in the complexity that has silently grown around them. For example, the modelling staff in one major US bank now spend 80% of their time meeting regulatory requirements, detracting from much-needed new model development. Worse still, some banks simply do not know how many models are actually deployed.
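The minimum capability implied here is a model inventory: a single register of every deployed model, its owner, and when it was last validated. The sketch below is a hypothetical illustration (the record fields and the assumed annual review cycle are not taken from any particular regulation) of how such a register can answer the basic question of which models are overdue for review.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ModelRecord:
    model_id: str
    owner: str
    status: str            # e.g. "development", "production", "retired"
    last_validated: date

def overdue_for_validation(inventory, as_of, max_age_days=365):
    """Return production models whose last validation is older than
    the firm's review cycle (assumed annual here)."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [m for m in inventory
            if m.status == "production" and m.last_validated < cutoff]
```

Even a register this simple answers the two questions some banks reportedly cannot: how many models are deployed, and which ones are overdue for validation.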
It should also be noted that regulatory requirements affect most aspects of operational activity. Conduct risk, for example, requires that customers are protected from rogue selling activities, and to this end models are being implemented which analyse sales and other data with the intent of identifying inappropriate behaviour. Regulators will want transparency and understanding here, as with all other predictive models in use.
Everyone understands that the availability of relevant information reduces risk. The production of such information is, in a nutshell, the purpose of all this regulation, and meeting the growing requirement presents a significant opportunity to reduce risk within the business itself.
The next article in this series is Predictive Models – Risk and Benefits.