Big Data

The Devil’s in the Data

Even as the new accounting standard for estimating the allowance was being announced in the summer of 2016, we knew the key issue – and the hardest part of transitioning to CECL – was going to be data. We understood from the beginning that complying with CECL would require a lot of data, and that the data had to be of good, consistent quality. That concern remains paramount; in fact, it has grown. We see it everywhere: in webinars, white papers, and conference presentations. The devil’s in the data. […]

September 14th, 2018 | Big Data, CECL

Confessions of a Data Analyst

The implementation of CECL has been called the biggest change in financial institution accounting . . . ever. Under current U.S. GAAP, financial institutions account for losses based on historical events – that is, losses already incurred. Beginning in the first quarter of 2020, they must estimate losses over the full lifetime of a loan, looking to the future as well as the past. […]
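CECL does not prescribe a single estimation method, but the flavor of the change is easy to sketch. Here is a minimal, illustrative Python calculation – not any particular regulator’s or vendor’s methodology – that assumes a simple discounted PD × LGD × EAD structure with made-up inputs, just to show what “lifetime” means in practice:

```python
# Illustrative sketch of a lifetime expected-loss calculation.
# CECL does not mandate this (or any single) method; the structure and
# all inputs below are hypothetical.

def lifetime_expected_loss(pd_by_period, lgd, ead_by_period, discount_rate):
    """Sum discounted expected losses over the remaining life of a loan.

    pd_by_period  -- marginal probability of default in each period
    lgd           -- loss given default, as a fraction of exposure
    ead_by_period -- exposure at default in each period
    discount_rate -- per-period rate used to discount expected losses
    """
    ecl = 0.0
    for t, (pd, ead) in enumerate(zip(pd_by_period, ead_by_period), start=1):
        ecl += pd * lgd * ead / (1 + discount_rate) ** t
    return ecl

# A 3-year loan with hypothetical marginal PDs and an amortizing balance.
print(lifetime_expected_loss(
    pd_by_period=[0.010, 0.015, 0.020],
    lgd=0.40,
    ead_by_period=[100_000, 70_000, 40_000],
    discount_rate=0.05,
))
```

The point of the sketch is the shape of the problem: every period of the loan’s remaining life needs a default probability and an exposure estimate, which is exactly why the data demands are so much heavier than under the incurred-loss model.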

It’s crunch time.

SEC filers will start estimating their allowances according to CECL as of the first quarter of 2020, just a little more than a year and a half from now. Considering they will want to run incurred-loss and CECL methodologies in parallel for several quarters – a full year is recommended – most lenders, including private companies, are knee-deep in preparations: setting up transition committees, gathering data, and studying methodologies. […]

Improving Data Quality: Process Mapping Is a Start

Data quality has become a regulatory flashpoint. Not exactly breaking news. Not exactly news at all for any lender paying attention. Even so, too few financial institutions are auditing the accuracy of the data they will need to estimate their reserves under the expected loss standard. […]
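As a rough illustration of where such an audit can start, here is a minimal Python sketch that screens loan records for missing or implausible values. The field names and checks are hypothetical – a real audit would be driven by the institution’s own data dictionary and the inputs its CECL methodology actually requires:

```python
# Illustrative data-quality screen over loan records.
# Field names and rules are hypothetical examples, not a standard.

REQUIRED_FIELDS = ["loan_id", "origination_date", "maturity_date",
                   "balance", "risk_rating"]

def audit_loans(loans):
    """Return (loan_id, issue) pairs for records that fail basic checks."""
    issues = []
    for loan in loans:
        loan_id = loan.get("loan_id", "<missing id>")
        # Completeness: every required field must be populated.
        for field in REQUIRED_FIELDS:
            if loan.get(field) in (None, ""):
                issues.append((loan_id, f"missing {field}"))
        # Plausibility: balances should not be negative.
        balance = loan.get("balance")
        if isinstance(balance, (int, float)) and balance < 0:
            issues.append((loan_id, "negative balance"))
    return issues

sample = [
    {"loan_id": "A-101", "origination_date": "2015-06-01",
     "maturity_date": "2025-06-01", "balance": 250_000, "risk_rating": "3"},
    {"loan_id": "A-102", "origination_date": "", "maturity_date": "2022-01-15",
     "balance": -500, "risk_rating": None},
]
for loan_id, issue in audit_loans(sample):
    print(loan_id, issue)
```

Checks like these catch completeness and plausibility problems, not accuracy in the full sense; that is where process mapping comes in, tracing each field back to the system and the hands that produced it.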

November 28th, 2016 | Big Data, CECL