Bank Failure Prediction Models: Less Is More?

Thursday, 2 July 2015: 2:15 PM-3:45 PM
TW1.2.03 (Tower One)
Manuel Merck, Universidad Castilla - La Mancha, Toledo, Spain
Agustin Alvarez-Herranz, Universidad de Castilla-La Mancha, Albacete, Spain
Alvaro Hidalgo-Vega, Castilla-La Mancha University, Toledo, Spain
After a period of relative stability during the 1990s, the wave of distortions in the financial markets that began in late 2007 and early 2008 caused a renewed rise in the level of bank failures, which in turn led to new concerns among bank customers and investors and prompted particular attention to this issue in the specialized media. This process has renewed interest in identifying the banks closest to bankruptcy, or at least those banking institutions most likely to suffer financial imbalances in the near future.

This paper analyzes the key underlying factors behind the bankruptcies of U.S. banks during the 2008-2014 international financial crisis. The study seeks to identify to what extent the accounting structure of failed banks' balance sheets shows substantial differences from the average profile of the U.S. banking industry, comparing two approaches: a traditional early warning system (EWS) inspired by the standard CAMELS methodology, and a much less complex 'naive' or heuristic indicator, the Texas Ratio. The comparison is intended to test the rational expectations approach and the modern macroeconomic and finance framework, dominated by models of decision-making under risk, against the paradigm of optimal choice under uncertainty.
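
For reference, the Texas Ratio in its most widely used form is defined as below; the paper tests three versions of the indicator, so the exact composition of the numerator and denominator used in the analysis may differ from this baseline.

    Texas Ratio = (non-performing loans + other real estate owned)
                  / (tangible common equity + loan loss reserves)

A bank whose ratio approaches or exceeds 1 (i.e. 100%) is conventionally read as being at elevated risk of failure, which is what makes the indicator attractive as a 'naive' benchmark against fuller CAMELS-style specifications.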

The main findings are as follows. The econometric models based on CAMELS specifications achieve slightly better results, in terms of overall explanatory power for the endogenous variable (the event of bank failure), than the 'naive' and less complex models based on the three analyzed versions of the Texas Ratio. CAMELS models whose specification includes the Texas Ratio, replacing the capital adequacy and asset quality variables, show a clear improvement in explanatory and forecasting power, reflected in a lower percentage of Type I and Type II errors. By selecting an optimal cutoff point for the model that includes only the failures during 2008, based on the ratio of failed banks to the total number of banks in our sample (4,316), our model reaches a maximum combined level of sensitivity and specificity (the lowest levels of Type I and Type II errors). It is worth noting that, at least to the best of our knowledge, this model outperforms not only the forecasting power of the Texas Ratio but also the predictive capacity of the early warning systems applied by the federal supervisors in the United States, since our specification simultaneously achieves levels of sensitivity (true positive rate) and specificity (true negative rate) above 90%.
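
The cutoff-selection step can be illustrated with a minimal sketch, assuming a logit-type failure model, simulated placeholder data, and Youden's J statistic as the criterion for jointly maximizing sensitivity and specificity; the paper's actual CAMELS covariates, estimation details, and selection criterion are not reproduced here.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_curve

    # Hypothetical data: X stands in for CAMELS-style ratios (or the Texas Ratio),
    # y marks the failure event (1 = failed during the window, 0 = survived).
    rng = np.random.default_rng(0)
    n = 4316                      # sample size reported in the abstract
    X = rng.normal(size=(n, 3))   # placeholder covariates
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 2).astype(int)

    # Fit a logit-type failure model and obtain predicted failure probabilities.
    model = LogisticRegression().fit(X, y)
    p_hat = model.predict_proba(X)[:, 1]

    # Sweep candidate cutoffs and pick the one maximizing sensitivity + specificity
    # (equivalently, minimizing the combined Type I / Type II error rates).
    fpr, tpr, cutoffs = roc_curve(y, p_hat)
    j = tpr - fpr                 # Youden's J statistic
    best = np.argmax(j)
    print(f"optimal cutoff: {cutoffs[best]:.3f}")
    print(f"sensitivity:    {tpr[best]:.3f}")
    print(f"specificity:    {1 - fpr[best]:.3f}")

In this setup the cutoff is chosen on the estimation sample itself, mirroring the in-sample evaluation described above; an out-of-sample check would apply the same sweep to predictions for banks not used in estimation.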

The main policy recommendations arising from the positive results of this study are, first, to promote and encourage the development of simplified (but not simplistic) quantitative models, insofar as they are able to capture the main underlying drivers that shape expectations of survival in the banking sector; and second, to strengthen a dual supervisory approach in which off-site monitoring systems complement and reinforce on-site supervision, which remains the most efficient tool for assessing the reliability of bank accounting and financial statements, and the only way to understand how banking institutions identify, assess, control and mitigate their main risks.