Auriscon

Sidebar

  • Home
  • Expertise
    • About Us
    • History
    • Technical Audit
    • Model Development
    • Business Processes
  • Insight Contents
    • IFRS 9 and Expected Credit Losses
    • Applications of Scenario Analysis in Finance
    • Analytics and Data in Auditing
    • Default Risk in the Trading Book
    • Methods of Machine Learning
    • System Dynamics Insights
  • Explore
  • Contact

Insights

System Dynamics Insights

Details
Insights
The System Dynamics approach was developed as a method for studying complex dynamical systems. A particular aim was to better understand how and when systems fail by producing an unexpected and undesirable outcome.
 
Disclaimer: Data, charts and commentary displayed herein are for information purposes only and do not provide any consulting advice. No information provided in this documentation shall give rise to any liability of Auriscon Ltd.

Content

  • The Applications
  • The Approach
  • The Model Building

The Applications

System Dynamics provides valuable insights connected to long-term goals by displaying patterns and trends of system and process behaviour. Applications of System Dynamics are found in multiple areas, including Process Change Management, Product Development Planning, and Project Management. Example applications are outlined below.

 

    1. Management of Supply Chains

      Supply Chain Management allows companies to react to a changing business environment by providing the right product at the right time in the right quantities to satisfy customer demand. Among other factors, the effectiveness of supply chains depends on demand uncertainty and short product life cycles. Reducing instability in the supply chain and sharing information in a timely manner with all partners involved in the chain is therefore crucial to maintain profitability. Continuous evaluation of policy is key for anticipating and solving inventory problems. Simulation based on System Dynamics makes it possible to test different scenarios for analyzing and improving inventory systems and to support decisions on policy updates.

    2. Production Processes

      Strategic Planning is a process applied in companies to adapt their decision making to future uncertainties. Strategic management therefore helps companies to form a vision and to select the right strategy. Different techniques are at managers' disposal to support their decision making. For example, System Dynamics can be used to simulate behavioural patterns in R&D and supply chain management. In addition, hybrid approaches enable the integration of multiple techniques, such as System Dynamics for simulating aggregate system behaviour, Discrete Event Simulation for simulating operational processes, and Agent Based Modelling for simulating how customers interact to make purchase decisions.
    3. Business Process Change

      Companies and their Business Processes continuously face changes over time. Competitive advantages must be defended against other innovative companies, and strategies must be developed to respond to drifts in customer demand. Improving organizational business processes to increase flexibility and enhance performance is therefore a key requirement for companies. System Dynamics is a capable method for Change Management simulation, not least by providing insights into the feedback processes that determine the behaviour of changed processes. The outcome of any successful Business Process Change is the accomplishment of the targeted improvement in process performance, if required facilitated by a drastic change of the underlying process.
    4. Project Management

      The management of projects with high complexity and uncertainty can benefit from the application of System Dynamics, which provides a framework for exploring scenarios of dynamic behaviour encountered over the life span of projects. Consideration given to re-work cycles can be integrated into the System Dynamics model to facilitate further insight into delays and project overruns. Root causes of overruns in schedule and projected costs compared to the original planning can be examined. Simulation by System Dynamics can make transparent the multiple feedback processes and non-linear relationships that are characteristic of complex projects. In Product Development, System Dynamics models for simulating product development processes are concerned with the allocation of resources to product development tasks and the effects of these allocations on quality, project time and development cost. From a managerial perspective, identifying the critical factors is increasingly challenging as markets change and competitiveness between companies rises. A useful System Dynamics model represents the product development process in terms of its stages, aligned with Quality Assurance frameworks such as the QS 9000 manual. As a result of analysing the project dynamics, various critical factors influencing the timing can be identified, e.g. experience of staff, time spent on re-work, and technologies used.

 

   

The Approach

The System Dynamics approach was developed in the mid 1950s as a method for analyzing complex dynamical systems, with the aim to better understand how and when systems and policies produce an unexpected and undesirable outcome. The usefulness of the System Dynamics approach lies in the conceptualization of a model that can be used for decision making in various areas, including business and policy simulation, project management, product development and risk management.

In System Dynamics, a system is modelled as a combination of stocks and flows, with flows representing the rates of change that lead to continuous state changes in the modelled system variables. The underlying equations used to solve the model are difference equations which represent the discrete-time dynamics of the underlying differential equations.

The stock variables are typically defined as a result of the qualitative modelling and the graphical design based on stock-flow diagrams. To model a system's behaviour, the flows in and out of the stocks require quantification. Quantification is achieved through the definition of rate changes linked together via differential equations. Integration over the rate changes is based on the defining differential equations and takes the form

\[\text{Stock Level}_{t} = \text{Stock Level}_{t-\Delta t} + \Delta t \cdot (\text{Inflow Rate}_{t-\Delta t} - \text{Outflow Rate}_{t-\Delta t})\]
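This stock update can be sketched in a few lines of code (a minimal illustration; function and variable names are ours, and constant rates are assumed):

```python
# Euler-style stock update following the difference equation above.
# Constant inflow/outflow rates are an illustrative simplification.

def simulate_stock(stock0, inflow_rate, outflow_rate, dt, steps):
    """Integrate one stock level over `steps` intervals of length `dt`."""
    stock = stock0
    levels = [stock]
    for _ in range(steps):
        stock = stock + dt * (inflow_rate - outflow_rate)
        levels.append(stock)
    return levels

# Net rate +4 per unit time over one unit of time (4 steps of dt = 0.25):
levels = simulate_stock(stock0=100.0, inflow_rate=12.0, outflow_rate=8.0, dt=0.25, steps=4)
# levels[-1] == 104.0
```

In a full model the rates would themselves depend on the stock levels, which is where the feedback behaviour discussed below enters.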

System Dynamics is successfully applied to solve problems in multiple areas. Typically, multiple sources of information are used, including numerical data and interviews, to elicit the core information required for modelling a complex system. The advantages of using System Dynamics in Project Management are demonstrated below, but the same beneficial features apply to uses in other areas as well.

 

System Dynamics models are capable of transparently exposing key behavioural characteristics of processes and systems.

    • Multiple dependencies among system components are well captured by System Dynamics models, promoting transparency when tracing the causal impact of changes throughout the system.
    • Systems exhibit different behaviour over time. Perturbations, for instance, cause short-run responses that converge to a long-run response once the impact of delays has passed.
    • Large numbers of balancing or reinforcing feedback relationships are a commonly shown characteristic. Tools such as Gantt charts will not resolve or even exhibit the impact caused by multiple feedback processes. The System Dynamics approach, by comparison, is conceptually capable of incorporating feedback loops and forecasting their impact.
    • Nonlinearity observed in large systems leads to non-proportional relationships between causes and effects. System Dynamics models incorporate and project non-linear behaviour in the model formulation.

 

 

Demonstration of the Logistic Growth Model

Ordinary Differential Equations (ODE) are useful in describing growth phenomena in multiple areas. The logistic equation describes a growth pattern that is commonly observed in a context where competing forces and saturation effects lead to limitations in growth.
To illustrate, consider the growth of company revenues from selling a product until market saturation takes place. For a variable of interest y, growth is limited by a carrying capacity. Specifically, growth is reduced at higher levels of y due to competing forces that pull the level towards the carrying capacity. As a result, growth is initially exponential at a growth rate r. However, with increasing levels of y the growth turns negative above the carrying capacity K. The logistic ODE describing this phenomenon is written as

\[\normalsize \frac{d\,y}{d\,t} = r \cdot y \cdot (1-\frac{y}{K})\]

Consider two solutions to the logistic equation, one for each initial condition, \(y(0)=2\) and \(y(0)=12\). Parameters are specified as \(r=1\) and \(K=10\). Displayed below are the two solutions, one matching each initial condition.
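The two solutions can be reproduced numerically with a simple Euler scheme (step size and horizon are our illustrative choices):

```python
# Euler integration of the logistic ODE dy/dt = r*y*(1 - y/K).
# Step size and horizon are illustrative choices.

def logistic_euler(y0, r, K, dt, steps):
    """Integrate the logistic equation from initial value y0."""
    y = y0
    path = [y]
    for _ in range(steps):
        y = y + dt * r * y * (1.0 - y / K)
        path.append(y)
    return path

# The two initial conditions from the text, with r = 1 and K = 10:
low = logistic_euler(y0=2.0, r=1.0, K=10.0, dt=0.05, steps=400)
high = logistic_euler(y0=12.0, r=1.0, K=10.0, dt=0.05, steps=400)
# Both trajectories converge to the carrying capacity K = 10,
# from below and from above respectively.
```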

 

Data Analytics in Auditing


Data Analytics - An Outline

Increasingly, companies and financial institutions base or augment their decision making on data and analytics. For audit to keep pace with expectations, more analytics and advanced technology have to be integrated into the review process. The sampling process traditionally used in reviews carries the risk of overlooking important review areas such as key transactions, models or business activities. Hence, using advanced analytics in reviews can have a significant impact on quality and outcome. The review of analytical models is an area where the use of advanced analytics promises significant benefits.

 → READ MORE ...  Guide on Risk-Based approach: Financial Action Task Force (FATF)

    


 

Machine Learning methods are now commonly used in the context of credit and fraud customer screening, marketing, and multiple other areas.

Interpretability of both model inputs and model predictions is important for obtaining acceptance from the business and user community. This applies especially to black-box models used in Machine Learning, such as boosted trees and similar model types. Interpretation should therefore be considered a key step before model deployment takes place, which is why a review should happen at this stage. Questions users and the business may ask include whether a particular feature is more important than others. Fortunately, verifying interpretability does not have to be model dependent. In fact, the model-agnostic nature of the methods used for interpretability checks underlines their usefulness for review and validation across multiple application areas and model types.

Data bias can have a significant negative impact on models, and degradation in model performance due to data bias is a key issue across areas of model development. One reason for data bias is that prediction models are typically built and calibrated on data sampled over a certain time period, so the sampled data are conditioned on internal and external events occurring over this period. Another reason for data bias can be the choice of a particular business strategy.

Compliance is an example of an area where attempts to prevent money laundering and to establish transaction surveillance use machine-learning (ML) oriented approaches, and ML methods are now commonly used in this context. Caveats can arise, however, from large numbers of false positives, all of which operators have to process after each run. A clear business objective in this regard is to minimize false positives and thus reduce the operational costs involved.

From the examples just sketched it becomes clear that evaluating ML models is a laborious and complex task, with the risk of foregone opportunities for performance enhancements due to errors in the model development and review processes.

 →   Auriscon can help to enhance data analytics used in review and validation    

Validity of data and data quality are therefore important aspects any review should opine on. Aspects of data quality such as missing data or outdated data should receive sufficient attention in order to summarize data limitations. Testing of data quality can also consider data quality monitoring with respect to the relevant data quality dimensions, viz. completeness (no missings), accuracy (no outliers), timeliness, controlled data sources, and usage monitoring. Review of suitable and controlled data sources links to data integrity requirements. More often than not a central data repository is used, and review checks may cover the controls implemented to ensure data consistency across several IT systems. Governance aspects pertaining to the definition of Critical Data Elements should be considered depending on the scope and scrutiny of the review.
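Two of these dimensions, completeness and accuracy, can be checked with a few lines of code (the field name and tolerance bounds are our illustrative assumptions):

```python
# Minimal data-quality check along two dimensions named above:
# completeness (no missings) and accuracy (values within a plausible range).

def quality_report(records, field, lower, upper):
    """Completeness (share non-missing) and accuracy (share within range)."""
    values = [rec.get(field) for rec in records]
    present = [v for v in values if v is not None]
    completeness = len(present) / len(values)
    in_range = [v for v in present if lower <= v <= upper]
    accuracy = len(in_range) / len(present) if present else 0.0
    return {"completeness": completeness, "accuracy": accuracy}

records = [
    {"exposure": 100.0}, {"exposure": 250.0}, {"exposure": None},
    {"exposure": 1e9},   {"exposure": 75.0},
]
report = quality_report(records, "exposure", lower=0.0, upper=1e6)
# completeness = 4/5 = 0.8; accuracy = 3/4 = 0.75 (the 1e9 entry is an outlier)
```

In practice such indicators would be tracked against the tolerance levels defined in the data quality standards discussed below.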


   

Regulatory expectations on data integrity and governance establish demand for detailed metadata repositories, data lineage and automated data quality checks, implemented with sufficient granularity and currency. Industry standards are set out holistically in the BCBS 239 report, with internal reporting, regulatory reporting and management decision systems in scope.

Furthermore, institutions are expected to have data quality management and standards in place, with data quality indicators and tolerance levels attached.

 

Consequently, technically oriented audit assignments should incorporate data quality and data governance aspects into the reviewing process. Typical challenges in implementing standards on data quality management are heterogeneous system landscapes, where multiple source systems from which data are obtained are used across businesses and functions. Auriscon can help in identifying the inconsistencies and effectiveness weaknesses that affect data quality measurement and reporting tools.


 

 →  READ MORE ...   ECB guide on effective risk data aggregation and risk reporting    

 

Thematic Drivers in Technical Auditing

 

 

 

 

 

 

Model Design

Methodology may have limitations, or model design may be incomplete, leading to bias in model outputs.

 

Data Quality

Data quality may have weaknesses leading to incomplete, inconsistent or inaccurate data with negative impact on accuracy of model outputs.

 

Documentation Quality

Poor documentation quality, lacking detail on important model and methodology assumptions, hinders effective validation and may lead to wrong model use.

 

Governance Effectiveness

Ineffective governance of thematic aspects such as data, models, IT and risk may lead to breaches of regulatory requirements.

 

   

Reviewing Model Design:

Typically the review in this section concerns the model assumptions and limitations alongside a description of the model. The theoretical and mathematical exposition should reference the literature and identify best practice whenever possible.

In addition, the calibration process and its details should be explained and assessed against potential gaps and weaknesses in relation to model uses.

Audit testing of the adequacy of Model Design is suitably expanded to cover comparisons to industry practice and model benchmarking.

Robustness of Model Design should stand the test of time and the test of stress. Risk drivers should be captured to an extent that permits model reactivity in times of economic stress.

 

Reviewing Model Performance:

Reviewing the Performance of the Model is best executed against clear criteria.

A common approach is to set these criteria before model use and to establish documented performance monitoring, with thresholds for tolerated and alerted breaches of model performance.

Performance monitoring should capture model use and design issues, as well as issues pertaining to proxy data and model calibration.

The approach to escalation of breaches in model performance should be defined as part of Model Governance. A clear path for escalation reporting should be communicated and documented.

 

 

  

  

 

Default Risk in the Trading Book


The Regulatory Requirements 

The Basel Committee on Banking Supervision (BCBS) replaced the previous Basel 2.5 standards with new standards on Minimum Capital Requirements for Market Risk. The revision, performed under the Fundamental Review of the Trading Book (FRTB), aims to correct biases in Basel 2.5. As before, a Standardized Approach (SA) and an Internal Model Approach (IMA) apply. However, this time regulatory approval for the IMA is specific and separate for each trading desk. Moreover, standardized capital has to be calculated on a shadow basis, involving back-testing and P&L attribution testing to ensure breaches of the application test criteria are detected.

The following is an outline of the Default Risk Charge (DRC), originally imposed by regulatory authorities as the Incremental Risk Charge (IRC) to provision for credit risk losses in the trading book. The DRC metric is defined as a 99.9% VaR over a 1-year horizon, accounting for losses in the trading book due to default and rating migration in credit-risky securities. The characteristic features of the DRC model have to comply with specific regulatory requirements:

  • Computed as a 1-year VaR.
  • Default modelled as a single jump-to-default event.
  • Credit-risky positions in the trading book are covered, including listed equities and derivatives.
  • Factor model with two types of systematic factors.
  • Liquidity horizons and constant positions over a 1-year horizon.

 

The Default Risk Charge Model 

The DRC VaR is calculated from the distribution of simulated portfolio value changes at the 1-year horizon based on the default / non-default mode. A factor model is used to simulate default events. In the factor model, an issuer default event at the horizon is detected by comparing the firm's asset return \(r_i\) at the horizon with a threshold \(c_i = N^{-1}(p_i)\), where \(N^{-1}\) is the inverse standard normal CDF and \(p_i\) the issuer's one-year probability of default.

A firm's asset return at the 1-year horizon is explained by a normally distributed composite factor \(\Psi_i\) and residual \(\epsilon_i\). The portion of the total variance of the standardized returns explained by the systematic factors is captured by the R-squared \(R_i^2\).


\[r_i = \sqrt{R_i^2}\cdot\Psi_i + \sqrt{1-R_i^2}\cdot\epsilon_i\]

\[\text{with}\quad r \sim N(\mu,\Sigma)\]

A decomposition of the composite factor into \(j=1,\dots,M\) systematic factors \(F_j\) is performed in similarity to Moody's GCORR factor model. Specifically, global factors \(F_{G,j}\) and sectorial factors \(F_{S,j}\) are used to represent the systematic factors. Sectorial factors are further partitioned into region, country and industry factors. In addition, company size and credit quality are considered as part of the sectorial dimension.

Orthogonality of the factors \(\tilde F_i\) is desirable and is obtained from an application of principal component analysis (PCA). Specifically, by means of PCA the initial correlation matrix \(C\) is decomposed into a diagonal matrix of ordered eigenvalues \(\Lambda\) and an orthogonal matrix of eigenvectors \(V\). The set of \(K\) orthogonal factors is obtained by selecting the eigenvectors associated with the largest \(K\) eigenvalues.


\[C \,=\, V\cdot\Lambda\cdot V^{T}\]
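The PCA step can be sketched with NumPy (the 3x3 correlation matrix is an illustrative example):

```python
import numpy as np

# Decompose a correlation matrix as C = V * Lambda * V^T and keep the
# eigenvectors of the K largest eigenvalues as orthogonal factors.
C = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.0]])

eigvals, eigvecs = np.linalg.eigh(C)   # eigh: ascending order for symmetric C
order = np.argsort(eigvals)[::-1]      # re-order eigenvalues descending
Lam = np.diag(eigvals[order])
V = eigvecs[:, order]

C_rec = V @ Lam @ V.T                  # reconstruction check: C_rec == C
K = 2
top_factors = V[:, :K]                 # factors for the K largest eigenvalues
```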

 


 

Global factors are defined by the set \(\{F_{G,i}\}_{i=1,\dots,N_G}\) and sectorial factors by the set \(\{F_{S,i}\}_{i=1,\dots,N_S}\). Furthermore, industry factors \(\{F_{I,i}\}_{i=1,\dots,N_I}\), regional factors \(\{F_{R,i}\}_{i=1,\dots,N_R}\) and the size factor \(F_{Size}\) are used, each defined as a subset of the sectorial factors. Factor weights are obtained from a linear regression of standardized returns on the systematic factors.

Measurement of correlations is based on weekly equity returns and weekly sovereign returns, with the latter derived from 5Y CDS spreads. The sample correlation matrix \(C\) is computed from the (N x T) data matrix \(X\) of N issuers, each with a time series of length T at weekly frequency. The parameters considered are the measurement interval T, the calibration period and the stressed period.


\[C = T^{-1}\cdot X\cdot X^{T}\]
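The sample correlation computation can be sketched as follows (synthetic returns; the values of N and T are illustrative):

```python
import numpy as np

# Sample correlation C = X X^T / T from an (N x T) matrix of weekly returns.
rng = np.random.default_rng(0)
N, T = 4, 260                          # e.g. 4 issuers, ~5 years of weekly data
X = rng.standard_normal((N, T))        # synthetic stand-in for return data

# Standardize each issuer's series to mean 0 and variance 1 ...
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
# ... so that the product below is the sample correlation matrix.
C = X @ X.T / T
```

With standardized rows the diagonal of \(C\) is exactly 1 and the off-diagonal entries are the pairwise sample correlations.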

 

Simulation and Valuation

As compared to a single-period model, a multi-period model incorporates the dynamics of default risk over time and addresses the DRC regulatory requirement on liquidity horizons. N.B. liquidity horizons for cash equities and derivatives can be shorter than the 1-year capital horizon. Additional consideration is given to the following:

  • Non-linear impact of derivatives w.r.t. loss from default.
  • Netting long/short positions on the same obligor has to account for the seniority of different instruments.
  • Basis risk between long/short positions on different obligors must be modelled explicitly.

Simulation method 1: The simulation of correlated default times \(\tau_i\) is performed using the distribution of default times \(F_\tau(t)\), as determined by the issuer's PD term structure \(p_i(t)\), and the returns \(r_i\) simulated from the factor model.
\[\tau_i = F_\tau^{-1}\big(N(r_{i,T})\big)\]
For each issuer \(i\), realizations are drawn from the composite factor \(\Psi_i\) and the residual \(\epsilon_i\), with \(\Psi_i, \epsilon_i \sim N(0,1)\), to obtain the simulated scenario returns \((\hat r_1,\dots,\hat r_n)_{j=1,\dots,K}\).
A vector of scenario default times \((\hat\tau_1,\dots,\hat\tau_n)_{j=1,\dots,K}\) is obtained by applying the standard normal CDF to the scenario returns and inverting the default time distribution.

With the assumption of a constant hazard rate \(\bar h_i\), exponential default times with mean default time \(1/\bar h_i\) are sampled.
\[\tau_i = \frac{\log\big(N(r_i)\big)}{\log\big(1-p_i(T)\big)}\]
The Gaussian copula representation \(C_{2,\Sigma}\) of joint default times is
\[P(\tau_1\le t_1,\ \tau_2\le t_2) = N_2\big(N^{-1}[F_{\tau_1}(t_1)],\ N^{-1}[F_{\tau_2}(t_2)]\big) = C_{2,\Sigma}\big(F_{\tau_1}(t_1),\ F_{\tau_2}(t_2)\big)\]
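Under the constant-hazard-rate assumption, the sampling of default times in method 1 can be sketched as follows (the PD, R-squared and scenario count are illustrative, and a single composite factor is assumed):

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def simulate_default_times(pd_1y, r_squared, n_scenarios, seed=1):
    """Draw tau = log(N(r)) / log(1 - PD) from a one-factor return model."""
    rng = random.Random(seed)
    taus = []
    for _ in range(n_scenarios):
        psi = rng.gauss(0.0, 1.0)                      # composite factor
        eps = rng.gauss(0.0, 1.0)                      # idiosyncratic residual
        r = math.sqrt(r_squared) * psi + math.sqrt(1.0 - r_squared) * eps
        taus.append(math.log(norm_cdf(r)) / math.log(1.0 - pd_1y))
    return taus

taus = simulate_default_times(pd_1y=0.02, r_squared=0.3, n_scenarios=10000)
defaults_1y = sum(t <= 1.0 for t in taus)
# The share of scenarios defaulting within 1 year is close to the 2% input PD.
```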

Simulation method 2: uses the Student-t copula \(C_{k,\Sigma,df}\) for \(k\) issuers with \(df\) degrees of freedom, with the same correlation matrix of returns \(\Sigma\) as used for the Gaussian copula. However, the horizon factor returns are modified using a chi-square distributed random variable \(X \sim \chi^2(df)\).
\[r_i = \sqrt{\frac{df}{X}}\cdot\Big[\sqrt{R_i^2}\cdot\sum_{j=1}^{M} w_{i,j}\cdot F_j + \sqrt{1-R_i^2}\cdot\epsilon_i\Big]\]
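The chi-square rescaling can be sketched for a single composite return (degrees of freedom and draw count are illustrative):

```python
import math
import random

def t_adjust(r_gauss, df, rng):
    """Scale a Gaussian return by sqrt(df / X), with X ~ chi-square(df)."""
    x = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))   # chi-square(df) draw
    return math.sqrt(df / x) * r_gauss

rng = random.Random(7)
r_t = [t_adjust(rng.gauss(0.0, 1.0), df=5, rng=rng) for _ in range(20000)]
# The rescaled returns follow a Student-t(5): same location, fatter tails,
# variance df/(df-2) = 5/3 instead of 1.
```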

 


Machine Learning Methods



Advances in Information Technology (IT) based on improvements in hardware and software enabled the success of Machine Learning (ML). Today, the application of robots, automation, and models with the capacity to learn from data is commonplace. This makes it possible to tailor applications that draw insights efficiently from complex and high-dimensional data sets.

The performance of newly developed models is evaluated on sample data not used for model training. Reasonable performance of the model on this validation sample serves as an indication of acceptable model performance on new data. Probing the model on validation data is a necessary step: high performance on training data may indicate model overfitting, leading to a model that does not perform on new data. Overfitting can be traced to noisy data and to circumstances where intensive model training leads to an adaptation to rare patterns present only in the training data set.

A consequence of model overfitting is poor performance on new data where the pattern the model has been fitted to is absent. 
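The holdout principle described above can be sketched in a few lines (the split ratio and data are illustrative):

```python
import random

def train_validation_split(data, val_fraction, seed=0):
    """Shuffle records and hold out a validation set not used for training."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_fraction)
    return shuffled[n_val:], shuffled[:n_val]

data = list(range(100))
train, val = train_validation_split(data, val_fraction=0.2)
# 80 training records, 20 held-out validation records, no overlap:
# performance is then compared between the two sets to detect overfitting.
```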

Read more: Machine Learning Methods

IFRS 9 and Expected Credit Losses



With IFRS 9, the IASB retired the Incurred Loss standard and replaced it by the Expected Credit Loss (ECL) standard. IFRS 9 accounting requires banks to hold provisions for credit losses that are expected to occur, and the amount of provisions is subject to increase for loans whose credit quality has substantially deteriorated.

The ECL is measured over the life of credit-risky securities in the banking book. The ECL standard is based on a forward-looking approach with a view on anticipated expected losses. Loss estimation is performed at stages conditional on the loan life cycle.
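A stylised ECL computation can illustrate the staging logic (this is not a prescribed IFRS 9 methodology; the marginal PDs, LGD, EAD and discount rate are all illustrative assumptions):

```python
# Stylised ECL: discounted sum of marginal PD x LGD x EAD per year.
# All inputs are illustrative, not a prescribed IFRS 9 calculation.

def expected_credit_loss(marginal_pds, lgd, ead, discount_rate):
    """Discounted expected credit loss over the remaining life."""
    ecl = 0.0
    for year, pd_t in enumerate(marginal_pds, start=1):
        ecl += pd_t * lgd * ead / (1.0 + discount_rate) ** year
    return ecl

# 3-year loan with per-year marginal default probabilities:
ecl_lifetime = expected_credit_loss([0.02, 0.015, 0.01], lgd=0.45, ead=1_000_000, discount_rate=0.03)
ecl_12m = expected_credit_loss([0.02], lgd=0.45, ead=1_000_000, discount_rate=0.03)
# Stage 1 provisions use the 12-month ECL; Stages 2 and 3 the lifetime ECL,
# so a stage transfer raises the provision amount.
```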

 

Read more: IFRS 9 and Expected Credit Losses

Applications of Scenario Analysis in Finance



An important modelling decision faced by risk management is the selection of plausible scenarios to support forward looking risk analysis and stress testing. For example, the computation of Expected Credit Losses (ECL) under IFRS 9 requires an accommodation of suitably devised long-range scenarios.

A common component of applications of Scenario Analysis in financial stress testing is the set-up and use of a framework that provides the linkage between macroeconomic scenarios and risk metrics, for the purpose of projecting financial metrics under multiple economic scenarios.

 

 

Vector Autoregression (VAR) is a useful method for analysing macroeconomic and financial data. Specifically, the VAR method permits analysing and accounting for the dependence and dynamics of the macro variables used for scenario building.
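As a sketch of the idea (the coefficient matrix and noise scale are illustrative), a two-variable VAR(1) can be simulated and then re-estimated by least squares:

```python
import numpy as np

# VAR(1): y_t = A y_{t-1} + e_t, with the eigenvalues of A inside the
# unit circle so that the simulated system is stationary.
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])

rng = np.random.default_rng(42)
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + 0.1 * rng.standard_normal(2)

# Least-squares recovery of the coefficient matrix from the sample:
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
```

The fitted coefficients can then be used to propagate shocks through the system of macro variables, which is the mechanism exploited in scenario building.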

 →  READ MORE ...

 

Scenario Narrative

Stress Testing is well established as a risk management tool across financial institutions. This holds particularly true in relation to Bank Capital, where regulatory requirements demand that banks adhere to regular internal and external stress testing exercises. A common question of interest to Portfolio and Risk Managers when evaluating the uncertainty of extreme market events is the following:

“How sensitive is my portfolio to losses as a result of macro and financial shocks?”

Macro Stress Testing aids in exploring how hypothetical future shocks deemed to affect the economic and financial environment transmit to portfolio losses and the income position of financial organisations. A major objective of any stress testing exercise is to examine the solvency and profitability of financial institutions in the event of a crisis. To this end, the design of the scenarios used in stress testing is required to adapt to the specific risk and portfolio perspective of each organisation, for example by incorporating suitable liquidity and funding shocks into the scenario design.

Scenario Design

The scenarios used for macro stress testing should be devised with a view on the risks that matter. Each financial institution has a specific risk profile dependent on the prevailing economic and financial conditions over the periods considered. Noteworthy: stress testing scenarios are not meant to be forecasts of future economic conditions!

Starting point for scenario design is the identification of macro and financial risks that could impact negatively on the financial or solvency position of the financial institution. Following the identification of key macro risks and variables with specification of shocks applied to these variables the portfolio impact can be analysed.

Market Risk Stress Testing

Despite the common standards in using Value at Risk (VaR) and Expected Shortfall (ES) metrics, significant residual risks remain due to extreme and random market moves in the risk drivers. Ideally, forward looking scenarios should be evaluated in Market Risk Stress Testing to identify plausible and extreme shocks. However, scenarios based on historical data are often preferred due to the traceability of evidence for identifying shocks.

After the design of suitable scenarios for the core assets, a scenario expansion needs to be applied to capture the responses of non-core assets to the shocks in the drivers. The modelling of the co-dependence between variables under stressed market conditions is essential too. Typically, the scenarios provided in regulatory or internal stress tests specify the time path of only some key variables. To exemplify the case for scenario expansion, consider a scenario where GDP growth and inflation changes for EU countries are available, but US macro and financial variables are left unspecified.


Consulting Service

  • Business Processes
  • Model Development
  • Technical Audit

Contact

AURISCON HK Ltd
Unit 908, Prosperity Millenia Plaza, 663 King's Road
Quarry Bay, Hong Kong Island
Hong Kong
 
  • +852 95403530

Company Details

Company No: 77564770

BR No: 77564770-000

Copyright © 2025 Auriscon HK Limited.
All rights reserved