Reviewing Analytics and Data - An Outline

Increasingly, companies and financial institutions base or augment their decision making on data and analytics. For audit to keep pace with expectations, more analytics and advanced technology have to be integrated into the review process. The sampling approach traditionally used in reviewing carries the risk of overlooking important review areas such as key transactions, models or business activities. Using advanced analytics in reviewing can therefore have a significant impact on quality and outcome. The review of analytical models is one area where the use of advanced analytics promises significant benefits.

 → READ MORE ...  Guide on the Risk-Based Approach: Financial Action Task Force (FATF)

Compliance is an area where various machine-learning (ML) approaches have become established in efforts to prevent money laundering and to establish surveillance over transactions. Within this context, ML methods are now commonly used for customer screening. Caveats occur, however, due to the large number of false positives, as operators have to process all false positives of a screening run. Clearly, a business objective in this regard is to minimize false positives and thus reduce the operational costs involved.
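
To make this objective concrete, the following is a minimal sketch of tuning the alert threshold of a screening classifier so that false positives fall while recall stays above a floor. The synthetic data, the 0.95 recall floor and the scikit-learn model choice are illustrative assumptions, not a prescribed setup.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import precision_recall_curve
    from sklearn.model_selection import train_test_split

    # Imbalanced toy data standing in for screening alerts (2% positives).
    X, y = make_classification(n_samples=5000, weights=[0.98], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]

    # Pick the highest threshold whose recall still meets the floor:
    # fewer alerts, hence fewer false positives for operators to clear.
    precision, recall, thresholds = precision_recall_curve(y_te, scores)
    floor = 0.95
    ok = recall[:-1] >= floor
    threshold = thresholds[ok][-1] if ok.any() else thresholds[0]
    false_positives = int(((scores >= threshold) & (y_te == 0)).sum())
    print(f"threshold={threshold:.3f}, false positives={false_positives}")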

From the example just sketched it becomes clear that selecting, training and evaluating an ML model is a laborious and complex task, with the risk of foregone opportunities for performance enhancements due to errors in the model development and validation processes. Auriscon can help identify inconsistencies in the development and validation approach and point to foregone opportunities to enhance model design and validation.

Data reviewing has become an ever more important part of technical auditing. As processes, models and transactions become more data intensive, the number of variables involved has multiplied. As a result of this trend, more data should be analyzed and more trends and patterns are identifiable as part of data auditing. Furthermore, IT tools have emerged to support the integration of data management and machine learning, thus making it easier for non-IT specialists to analyze and interpret data.

Regulatory expectations on data integrity and governance create demand for detailed metadata repositories, data lineage and automated data quality checks, implemented with sufficient granularity and currency. Industry standards are set out holistically in the BCBS 239 report, with internal reporting, regulatory reporting and management decision systems in scope.

Furthermore, institutions are expected to have data quality management and standards in place, with data quality indicators and tolerance levels attached.
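
As a minimal sketch of what such indicators with tolerance levels attached might look like in code, consider the following; the indicator names, measured values and tolerances are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class QualityIndicator:
        name: str
        measured: float   # observed value, e.g. share of missing records
        tolerance: float  # maximum tolerated value before the check breaches

        def breached(self) -> bool:
            return self.measured > self.tolerance

    indicators = [
        QualityIndicator("missing_customer_ids", measured=0.002, tolerance=0.01),
        QualityIndicator("stale_records_30d", measured=0.08, tolerance=0.05),
    ]
    for ind in indicators:
        status = "BREACH" if ind.breached() else "ok"
        print(f"{ind.name}: {ind.measured:.3f} (tolerance {ind.tolerance}) -> {status}")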

 

Consequently, technically oriented audit assignments should incorporate data quality and data governance aspects into the reviewing process. Typical challenges in implementing standards on data quality management are heterogeneous system landscapes, where multiple source systems from which data are obtained are used across businesses and functions. Auriscon can help identify the inconsistencies and effectiveness weaknesses that affect data quality measurement and reporting tools.

 → READ MORE ...  ECB Guide on effective risk data aggregation and risk reporting

• Thematic Drivers in Technical Auditing

Model Design

Methodology may have limitations or model design may not be comprehensive, leading to bias in model outputs.

 

Data Quality

Data quality may have weaknesses, leading to incomplete, inconsistent or inaccurate data with a negative impact on the accuracy of model outputs.

 

Documentation Quality

Poor documentation quality, with a lack of detail on important model and methodology assumptions, hinders effective validation and may lead to incorrect model use.

 

Governance Effectiveness

Ineffective governance of thematic aspects such as data, models, IT and risk may lead to breaches of regulatory requirements.

 

   

Reviewing Data and Identifying Limitations:

The review should include a description of the data used for the model build and for model operation; the latter references the data inputs.

Data validity and quality are key aspects the review should opine on. Aspects of data quality such as missing data or outdated data should receive sufficient attention to summarize data limitations under this section. Complications may arise in the context of feeder models and of data filtering and screening processes outside the actual model development or operation.

The testing of data quality for model build and model life would suitably be expanded into subtopics if data classifications suggest this. Testing of data quality should also consider data quality monitoring with respect to the relevant data quality dimensions, viz. completeness (no missing values), accuracy (no outliers), timeliness, controlled data sources and usage monitoring.
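
The following is a minimal sketch of automated checks for the dimensions just listed, completeness, accuracy and timeliness; the column names and the 30-day staleness cut-off are illustrative assumptions.

    import pandas as pd

    def completeness(df: pd.DataFrame) -> pd.Series:
        # Share of missing values per column (0.0 means fully complete).
        return df.isna().mean()

    def accuracy_outliers(s: pd.Series, k: float = 3.0) -> pd.Series:
        # Flag values further than k standard deviations from the mean.
        z = (s - s.mean()) / s.std()
        return z.abs() > k

    def timeliness(df: pd.DataFrame, ts_col: str, max_age_days: int = 30) -> float:
        # Share of records older than the permitted age.
        age = pd.Timestamp.now() - pd.to_datetime(df[ts_col])
        return float((age > pd.Timedelta(days=max_age_days)).mean())

    df = pd.DataFrame({
        "exposure": [100.0, 105.0, None, 5000.0],
        "updated_at": ["2024-06-01", "2024-06-02", "2024-06-02", "2023-01-15"],
    })
    print(completeness(df))
    print(accuracy_outliers(df["exposure"].dropna()))
    print(timeliness(df, "updated_at"))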

The review of suitable and controlled data sources links to data integrity requirements. Often a central data repository is used, and checks may cover the controls implemented to ensure data consistency across several IT systems. Governance aspects pertaining to the definition of Critical Data Elements may have to be considered depending on the scope and the scrutiny of the review.
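
As a minimal sketch of such a consistency check between a source system extract and a central repository, assuming hypothetical account and balance columns:

    import pandas as pd

    source = pd.DataFrame({"account_id": [1, 2, 3], "balance": [100.0, 250.0, 80.0]})
    repository = pd.DataFrame({"account_id": [1, 2, 3], "balance": [100.0, 250.0, 79.5]})

    # Join on the key and flag records where balances diverge beyond a cent.
    merged = source.merge(repository, on="account_id", suffixes=("_src", "_repo"))
    mismatches = merged[(merged["balance_src"] - merged["balance_repo"]).abs() > 0.01]
    print(f"records compared: {len(merged)}, mismatches: {len(mismatches)}")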

 

Reviewing Model Design:

Typically, the review in this section concerns the model assumptions and limitations alongside a description of the model. The theoretical and mathematical exposition should reference the literature and identify best practice whenever possible.

In addition, the calibration process and its details should be explained and assessed for potential gaps and weaknesses in relation to model use.
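
One way such an assessment can be made concrete is to check calibration stability across sample windows; the following is a minimal sketch with a toy return series and an assumed 260-trading-day year, both illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    returns = rng.normal(0.0, 0.02, size=500)  # toy daily returns

    def calibrate_vol(r: np.ndarray) -> float:
        # Annualized volatility from daily returns (260 trading days assumed).
        return r.std(ddof=1) * np.sqrt(260)

    vol_old = calibrate_vol(returns[:250])   # earlier calibration window
    vol_new = calibrate_vol(returns[250:])   # most recent window
    drift = abs(vol_new - vol_old) / vol_old
    print(f"old={vol_old:.3f}, new={vol_new:.3f}, relative drift={drift:.1%}")
    # Material drift between windows points to a calibration gap to assess.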

The audit testing of the adequacy of Model Design would suitably be expanded to cover comparisons with industry practice and model benchmarking.

The robustness of Model Design should stand the test of time and the test of stress. Risk drivers should be captured to an extent that permits model reactivity in times of economic stress.
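
The following is a minimal sketch of such a stress test of model reactivity; the toy default-probability model and the 30% shock are illustrative assumptions.

    import numpy as np

    def pd_model(income: np.ndarray, debt_ratio: np.ndarray) -> np.ndarray:
        # Toy default-probability model via a logistic link.
        z = -2.0 - 0.00005 * income + 3.0 * debt_ratio
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    income = rng.normal(50_000, 10_000, size=1_000)
    debt_ratio = rng.uniform(0.1, 0.6, size=1_000)

    base = pd_model(income, debt_ratio).mean()
    stressed = pd_model(income * 0.7, debt_ratio * 1.3).mean()  # stressed drivers
    print(f"mean PD base={base:.3%}, stressed={stressed:.3%}")
    # Outputs that barely move under stress suggest the model lacks
    # reactivity to its risk drivers.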

 

Reviewing Model Performance:

Reviewing the Performance of the Model is best executed against clear criteria.

A common approach is to set these criteria before model use and to establish documented performance monitoring, with thresholds for tolerated and alerted breaches of model performance.
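
A minimal sketch of such threshold-based monitoring, assuming an AUC performance metric and illustrative amber/red levels:

    THRESHOLDS = {"amber": 0.70, "red": 0.65}  # tolerated vs. alerted breach

    def monitor(metric_name: str, value: float) -> str:
        # Classify a performance reading against the pre-defined thresholds.
        if value < THRESHOLDS["red"]:
            return f"{metric_name}={value:.2f}: RED breach - escalate per model governance"
        if value < THRESHOLDS["amber"]:
            return f"{metric_name}={value:.2f}: AMBER breach - tolerated, document and monitor"
        return f"{metric_name}={value:.2f}: within tolerance"

    for auc in (0.78, 0.69, 0.62):  # e.g. successive out-of-sample AUC readings
        print(monitor("AUC", auc))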

Performance monitoring would capture model use and design issues, as well as issues pertaining to proxy data and model calibration.

The approach to escalation of breaches in model performance should be defined as part of Model Governance. A clear path for escalation reporting should be communicated and documented.