
RISIKO MANAGER 10.2019



Fig. 03: Relevant sources of data – customers, facilities, utilization, collateral, incidents, losses and costs, other data.

Completeness, accuracy (data are substantively error-free), consistency (a given set of data can be matched across the bank's different data sources), timeliness, uniqueness (aggregate data are free from duplication caused by filters or other transformations of the source data), validity, availability/accessibility and traceability must be ensured and must be explainable. The data processing procedures (collection, storage, validation, migration, actualization, etc.) should be properly defined at the bank level. A minimal sketch of how some of these checks could be automated is given at the end of this subsection.

Data quality review

Activities to be performed

The technical assessment must differentiate between a high-intensity and a low-intensity review. In the low-intensity review, the tests are performed by the bank, and the assessment team mostly relies on the results and documentation presented by the bank for its final conclusions. The consultant must go through the technical implementation of the definition of default, for which the bank is expected to deconstruct the end-to-end process and provide the data and documentation on its policy for the internal definition of default. The consultant should identify the functional and technical documentation of the implementation of the definition of default in the bank's IT system. In the high-intensity review, the assessment team challenges the performance of the tests through replication and spot checks where possible, and selects additional optional tests where relevant. The following data quality framework describes a mix of questions and activities required by the regulator.

Analysis of IT infrastructure and architecture

For the analysis of the IT infrastructure, the consultant must pay specific attention to centralized versus decentralized infrastructure. Where a decentralized approach is used, the focus must be on the aggregation process:

» existence of a data warehouse (DWH);
» own-build versus vendor systems;
» risk versus finance systems;
» process upon data failures.

This includes the consultant's analysis of the main features and functionalities, the granularity of the information available in the different data systems, and the possibility of versioning data points. Errors and/or adjustments (e.g. overrides) must be logged, automatic and manual IT controls must be in place at the application/system level, and it must be clear which controls are available to ensure completeness (e.g. checks on failure, input versus output) and which functional/technical documentation is established. Particularly for the activities implemented for the processing of historical PD and historical LGD data, consistent data dictionaries and definitions should exist for all relevant databases and interfaces. The following elements must be prioritized:

» data treatment activities;
» data enrichment (e.g. default flagging);
» overrides (e.g. controls, traceability);
» use of a unique identifier throughout the process;
» golden source approach versus duplication of process steps;
» automated versus manual processes;
» triggers for the next process step (scheduling versus manual);
» logging of manual changes.

Which relevant processes of data extraction and transformation, and which criteria, are used in this regard? Besides the relevant functional and technical specifications of the databases and data models, the relevant workflows and procedures relating to data collection and data storage, as well as the definition and documentation of the approach and the tests, should be available.
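As an illustration of the completeness, uniqueness and consistency checks mentioned above, the following is a minimal sketch in Python/pandas, not a prescribed implementation. All table and column names (facility_id, ead) and the tolerance are hypothetical.

```python
import pandas as pd

def completeness(df: pd.DataFrame) -> pd.Series:
    """Share of non-missing values per column."""
    return df.notna().mean()

def uniqueness(df: pd.DataFrame, key: str) -> float:
    """Share of rows whose key is not duplicated (duplicates may
    indicate filter/transformation errors in the source data)."""
    return 1.0 - df[key].duplicated().mean()

def consistency(risk: pd.DataFrame, finance: pd.DataFrame,
                key: str, col: str, tol: float = 0.01) -> pd.DataFrame:
    """Match a column across two sources (e.g. risk vs. finance
    systems) and return the records deviating by more than `tol`."""
    merged = risk.merge(finance, on=key, suffixes=("_risk", "_fin"))
    diff = (merged[f"{col}_risk"] - merged[f"{col}_fin"]).abs()
    return merged[diff > tol]

# Hypothetical usage: exposures from the risk DWH vs. the finance system
risk_df = pd.DataFrame({"facility_id": [1, 2, 3], "ead": [100.0, 250.0, 80.0]})
fin_df = pd.DataFrame({"facility_id": [1, 2, 3], "ead": [100.0, 251.5, 80.0]})

print(completeness(risk_df))
print(uniqueness(risk_df, "facility_id"))
print(consistency(risk_df, fin_df, "facility_id", "ead"))
```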
Results must be formalized and responsibilities described. Minimum items have to be fully documented, and maps of the databases involved in the IRB model have to be drawn.

Internal assessment process

The assessment of the principles should be carried out by an independent unit, whose recommendations should be issued with an indication of their priority. All data quality issues identified should be recorded and monitored there. For each data quality issue, one owner responsible for resolving the issue should be appointed, and an action plan for dealing with the issue should be scheduled based on its materiality; a sketch of such an issue register is given after the lists below. Remediation timelines should depend on the severity and impact of the issue and on the implementation time required to resolve it. Banks must establish and implement an effective data quality framework. Elements for prioritization include:

» choice of methods for the criteria of transformation and the processes of data extraction;
» internal assessment processes;
» assessment of the principles implemented by an independent unit;
» appointment of a data owner responsible for resolving data quality issues;
» support for the indicators by effective data quality checks and controls, ranging from data entry to reporting;
» data quality reports for specific rating systems that include the scope of the report or the review process.

With regard to the infrastructure, the following minimum items should be documented:

» global map of the databases involved in the IRB model;
» relevant sources of data;
» relevant processes of data extraction and transformation, and the criteria used;
» relevant functional and technical specifications of databases and data models;
» relevant workflows and procedures relating to data collection and data storage.
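The following is a minimal sketch of such a data quality issue register: one owner per issue and a remediation deadline derived from severity. The severity levels and day counts are illustrative assumptions, not values from the article or the regulator.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum

class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Illustrative remediation windows; actual timelines depend on the
# issue's severity and impact and on the implementation effort required.
REMEDIATION_DAYS = {Severity.LOW: 180, Severity.MEDIUM: 90, Severity.HIGH: 30}

@dataclass
class DataQualityIssue:
    issue_id: str
    description: str
    owner: str                 # the one owner responsible for resolution
    severity: Severity
    raised_on: date = field(default_factory=date.today)

    @property
    def due_date(self) -> date:
        """Action-plan deadline derived from the issue's severity."""
        return self.raised_on + timedelta(days=REMEDIATION_DAYS[self.severity])

issue = DataQualityIssue(
    issue_id="DQ-2019-042",
    description="Duplicate facility IDs after DWH migration",
    owner="Credit Risk Data Office",
    severity=Severity.HIGH,
)
print(issue.owner, issue.due_date)
```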

Tab. 03: Overview of the onsite investigation assessment plan

Execution of the qualitative assessment:
» interviews and walkthroughs with representatives of the bank on infrastructure and architecture;
» walkthroughs on the technical implementation and computation of the definition of default.

Execution of the technical assessment:
» selection of additional optional ad-hoc tests and analyses for the technical assessment;
» review of the outcome and the processes of the checks performed by the bank for the selected PD/LGD variables;
» challenge, replication, and request and review of additional analyses and tests (particularly for the high-intensity review).

Final assessment report (to be integrated with the credit risk findings report):
» decisions on findings and conclusions for both the qualitative (compliance with expectations) and the technical (results from the tests) assessments;
» consistency check and approval of the final report.

Fig. 04: Data quality report
» scope of the project;
» findings and recommendations;
» addressing and implementation of the recommendations;
» coverage of all stages of the IRB life cycle;
» sent to the management body and committees on a regular basis.

A data quality report for a specific rating system should include the scope of the report or review and should provide an overview of the model's performance in terms of data quality, especially with regard to external data. The findings should include recommendations to address flaws, weaknesses and shortfalls, and adequate evidence should show that these recommendations have been addressed and properly implemented. The report should provide sufficient coverage of the quality of the data at all stages of the IRB life cycle, from data entry to reporting, and of both the current exposure and the calibration datasets. In his book Elegant Graphics for Data Analysis [Wickham (2009)], Wickham presents a range of ways to visualize such data. Knowledge graphs should start being implemented on a regular basis.

PD analysis

Models must perform adequately on economically significant sub-ranges of their application. There should be no overlaps in the ranges of application of different models, and obligors must be clearly assigned to rating systems. Meaningful risk differentiation should be ensured, taking into account (i) the distribution of obligors and exposures in the grades or pools, (ii) the tools and metrics used to assess risk differentiation (one such metric is sketched below), and (iii) the homogeneity of obligors or exposures assigned to the same grade or pool. The appropriateness of the philosophy underlying the grade or pool assignment, in terms of how banks assign exposures, obligors or facilities to 'risk buckets' according to appropriate risk drivers, must be analysed. The adequacy of the risk quantification method for the philosophy underlying the grade or pool assignment must be assessed.
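As a toy illustration of a risk differentiation metric, the sketch below computes the accuracy ratio (Gini coefficient) from the AUC of rating grades against observed defaults. The portfolio data are invented, and the Mann-Whitney formulation is one of several equivalent ways to compute the AUC.

```python
import numpy as np

def auc(scores: np.ndarray, defaults: np.ndarray) -> float:
    """Probability that a randomly chosen defaulter has a worse
    (higher) score than a randomly chosen non-defaulter, counting
    ties as one half (Mann-Whitney U formulation of the AUC)."""
    d = scores[defaults == 1]
    n = scores[defaults == 0]
    greater = (d[:, None] > n[None, :]).sum()
    ties = (d[:, None] == n[None, :]).sum()
    return (greater + 0.5 * ties) / (len(d) * len(n))

def accuracy_ratio(scores, defaults) -> float:
    """Accuracy ratio (Gini) = 2 * AUC - 1."""
    return 2.0 * auc(np.asarray(scores), np.asarray(defaults)) - 1.0

# Hypothetical portfolio: rating grades 1 (best) to 7 (worst)
grades = np.array([1, 2, 2, 3, 3, 4, 5, 5, 6, 7])
defaults = np.array([0, 0, 0, 0, 0, 0, 1, 0, 1, 1])
print(f"AR = {accuracy_ratio(grades, defaults):.2f}")
```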
Use of external data

It is preferable to use internal data for the estimation of risk parameters. However, if external data are used, the same requirements concerning representativeness apply with respect to the bank's portfolio. If the bank cannot sufficiently prove the representativeness of the external data, it should show, through quantitative analysis and qualitative argumentation, that the information gained from the use of the external data outweighs the drawbacks stemming from any deficiencies identified; a sketch of one such distribution comparison is given at the end of this section. The bank should provide evidence that the model's performance does not deteriorate when information derived from the external data set is included and that the estimates are not biased. The bank should conduct quantitative and qualitative validation analyses specifically designed for the regulator.

External bureau scores

Regarding the use of external credit bureau scores or ratings as input variables in the rating process, close attention should be paid to situations in which externally sourced scores are the main drivers of the overall internal rating. Credit bureau score data should be regularly updated or refreshed, especially where the credit bureau information is dynamic and is used not only for the application rating but also for the ongoing behavioural rating. Banks are expected to understand the structure and nature of the external scores and their key drivers.
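One way the representativeness argument could be supported quantitatively is a distribution comparison such as the population stability index (PSI) between the internal portfolio and the external sample. This is a sketch under assumed grade shares; the metric choice and the rule-of-thumb thresholds are illustrative, not drawn from the article.

```python
import numpy as np

def population_stability_index(internal: np.ndarray,
                               external: np.ndarray) -> float:
    """PSI between the internal portfolio and an external sample,
    given exposure shares per rating grade (each summing to 1)."""
    eps = 1e-6  # guard against empty grades
    p = np.clip(internal, eps, None)
    q = np.clip(external, eps, None)
    return float(np.sum((p - q) * np.log(p / q)))

# Hypothetical shares of exposure per rating grade
internal = np.array([0.10, 0.25, 0.30, 0.20, 0.15])
external = np.array([0.05, 0.20, 0.35, 0.25, 0.15])
psi = population_stability_index(internal, external)
# Common rule of thumb (an assumption, not a regulatory threshold):
# PSI < 0.10 similar, 0.10-0.25 moderate shift, > 0.25 material shift
print(f"PSI = {psi:.3f}")
```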
