

RISIKO MANAGER is the leading medium for all experts in financial risk management at banks, savings banks and insurance companies. Covering topics from the areas of credit risk, market risk, OpRisk, ERM and regulation, RISIKO MANAGER provides its readers with high-calibre assessments and comprehensive knowledge for advanced risk management.


RISIKO MANAGER 10|2019

Tab. 02: Questions to analyse the data quality framework

» Identify the data quality framework/policy approval body at the bank. Review evidence of first approval and subsequent amendments.
» Verify the existence of organisation-wide policies for data quality management and data handling. Request evidence on how the framework/policy is communicated and distributed throughout the organisation.
» Provide second-line-of-defence/third-line-of-defence reviews of the data quality framework/policy and evidence of a formal process for setting data requirements (e.g. a data dictionary).
» Has the bank implemented data quality objectives and/or data quality standards for the different data quality dimensions? Is there a person responsible for setting, approving and monitoring the requirements? Is the overall approach to monitoring compliant with the standards/objectives currently in place for historical PD/LGD data?
» Is there an effective and robust data control process? Which areas are covered by controls: technical checks (formatting, processing checks) and business checks (consistency and plausibility)? What types of controls are in place: reconciliations, variance analyses, predefined checks? In particular, how does the bank explain differences arising from reconciliation processes?
» How are controls performed: automated versus manual? What are the most important existing checks/controls from data entry through to the output of both PD and historical LGD data (including the data steps in between)?
» What process is in place for constantly assessing and improving data quality? Are there current procedures (workflows, intervening parties and decision-making bodies; issue identification, recommendation, mitigation measures and follow-up mechanisms) for the ongoing improvement of data quality (for both PD and historical LGD data)?
» What processes are in place for data quality reporting (for both PD and historical LGD data)?
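The distinction the questions draw between technical checks (formatting, range validation) and business checks (consistency, plausibility) can be made concrete with a small sketch. The field names (`pd`, `defaulted`, `default_date`) are illustrative assumptions, not taken from any bank's actual data model:

```python
# Minimal sketch of automated data quality controls on historical PD/LGD
# records. Field names are hypothetical; real IRB data sets are far richer.

def technical_checks(records):
    """Technical checks: missing values and range validation (a PD must lie in [0, 1])."""
    missing_pd = sum(1 for r in records if r.get("pd") is None)
    pd_out_of_range = sum(
        1 for r in records if r.get("pd") is not None and not 0.0 <= r["pd"] <= 1.0
    )
    return {"missing_pd": missing_pd, "pd_out_of_range": pd_out_of_range}

def business_checks(records):
    """Business checks: plausibility, e.g. a defaulted exposure needs a default date."""
    default_without_date = sum(
        1 for r in records if r.get("defaulted") and r.get("default_date") is None
    )
    return {"default_without_date": default_without_date}

sample = [
    {"pd": 0.02, "defaulted": False, "default_date": None},
    {"pd": 1.5,  "defaulted": True,  "default_date": "2019-03-01"},
    {"pd": None, "defaulted": True,  "default_date": None},
]

print(technical_checks(sample))  # {'missing_pd': 1, 'pd_out_of_range': 1}
print(business_checks(sample))   # {'default_without_date': 1}
```

In practice such checks would run automatically at each data processing step and feed the data quality reporting the questions above ask about; the manual/automated split is exactly what the inspector probes.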
In recent times, many articles have been published on the assessment of credit risk models (in response to the requirements of the ECB). [Bielecki and Rutkowski (2001)], Credit Risk: Modelling, Valuation, and Hedging, thoroughly covers the mathematical developments and presents the structural and reduced-form approaches to credit risk modelling. [Benzschawel (2012)], Credit Risk Modelling, helps to understand how credit risk is measured, priced and managed. Improving Risk Analysis [see Cox (2012)] shows how to better assess and manage uncertain risks when the consequences of alternative actions are unknown. [Moges et al. (2013)], A multidimensional analysis of data quality for credit risk management – New insights and challenges, addresses issues and suggests improvement actions in a credit risk assessment context. There are, however, very few books on inspection techniques for risk models. In this article, I will present some sophisticated methods to inspect and understand data quality management practices within a bank. [Longnecker (2010)], An Introduction to Statistical Methods and Data Analysis, teaches how to make decisions based on data in general settings and how to become a critical reader of statistical analyses in research papers and news reports. In The Data Warehouse Toolkit [see Kimball (2013)], the author describes the data warehousing technique he invented, dimensional modelling. For the assessment of data quality, readers can find many books, e.g. [Harrach (2010)], Risikoassessment für Datenqualität; [Held (2016)], Datenqualität für Testdaten – Eine Nutzbarkeitsanalyse für Testdatensammlungen; [Hildebrand et al. (2008)], Daten- und Informationsqualität – Auf dem Weg zur Information Excellence; and [Otto and Österle (2016)], Corporate Data Quality – Voraussetzung erfolgreicher Geschäftsmodelle. In [Batini et al. (2009)], Methodologies for data quality assessment and improvement, the authors describe methodologies and compare them along several dimensions, including the methodological phases and steps, the strategies and techniques, the data quality dimensions and the types of data. In [Pipino et al. (2002)], Data Quality Assessment, data quality is seen as a multidimensional concept. [Wang et al. (1992)], in Data Quality Requirements Analysis and Modeling, establish a set of premises and definitions for data quality management and develop a step-by-step methodology for documenting data quality parameters. Big data analysis has become more and more important, as declared in [McGovern (2015)], Big Data, 2014 edition, etc. Applications in the enterprise can be found in [Dorschel (2015)], Praxishandbuch Big Data – Wirtschaft – Recht – Technik, and in [Schön (2018)], Planung und Reporting – Grundlagen, Business Intelligence, Mobile BI und Big-Data-Analytics. Older research can also be found in the credit risk area: [Fiedler et al. (1971)], Measures of Credit Risk and Experience, describes the meaning and importance of credit risk and its impact on society.

Regulation

In recent years the ECB ("the regulator") has published different guidelines for assessing the various credit risk models in banks. The following table gives an overview of the relevant regulatory guidelines. ( Tab. 01)

The beginning of the inspection of the bank

The inspection of the bank proceeds through the following stages. ( Fig. 01) To better understand data quality management practices within the bank, the following information/documents must be provided by the bank at the outset:

» organisational chart;
» overview of committees and other decision-making bodies;
» policies and procedures for data and data quality management (i.e. formalising roles and responsibilities and governing/decision-making bodies, guidelines/principles and the metric/indicator approach for the management of data quality, procedures/workflows for improving data quality on an ongoing basis, data quality reporting procedures, etc.) and for data processing across the IRB approach; the group policies and procedures as well as the specific procedures for the IRB model under review are in the scope of this item;
» IT infrastructure and architecture, illustrating the IRB data flow from the source to the output;
» related functional and technical documentation;
» overview of the data sources feeding the databases used for the collection of historical data for modelling purposes;
» data dictionary for the IRB model under review;
» layouts for all relevant databases/tables employed in the IRB process;
» overview of the risk drivers;
» the policy containing the definition of default, together with functional and technical documentation on the implementation of the default definition within the bank's systems/applications;
» list of the main data quality controls performed (i.e. technical controls, business controls, formatting controls, etc.).
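To illustrate what the requested data dictionary might look like, here is a hypothetical sketch of a single entry and a check against it. Every field name, value and system name below is an assumption made for the example, not part of any real bank's dictionary:

```python
# Hypothetical data dictionary entry for one IRB risk driver. A real data
# dictionary would cover every attribute feeding the model, with lineage,
# owners and permitted values for each.
data_dictionary = {
    "days_past_due": {
        "description": "Number of days a payment obligation is past due",
        "type": "integer",
        "unit": "days",
        "allowed_range": (0, 3650),
        "source_system": "core_banking",     # assumed source system name
        "owner": "credit_risk_data_office",  # assumed responsible unit
        "used_in": ["definition_of_default", "PD_model"],
    }
}

def validate(name, value, dictionary):
    """Check a raw value against its data dictionary entry (type and range)."""
    entry = dictionary[name]
    lo, hi = entry["allowed_range"]
    return isinstance(value, int) and lo <= value <= hi

print(validate("days_past_due", 95, data_dictionary))  # True
print(validate("days_past_due", -4, data_dictionary))  # False
```

Keeping the permitted ranges in the dictionary itself is one way to ensure that the formal process for setting data requirements and the automated controls stay in sync.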
After these important sources have been provided, the inspection should assess the following milestones:

» the values are present in the attributes that require them ('completeness');
» the data must be substantively error-free ('accuracy');
» a given set of data can be matched across different data sources of the bank ('consistency');
» the data values are up to date ('timeliness');
» the aggregate data is free from any duplication given by filters or other transformations of source data ('uniqueness');
» the data is founded on an adequate system of classification, rigorous enough to compel acceptance ('validity');
» the history, processing and location of the data under consideration can be easily traced ('traceability').

Fig. 02: Data quality dimensions – Completeness, Accuracy, Consistency, Timeliness, Uniqueness, Availability, Traceability, Validity
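Several of these dimensions can be quantified directly as ratios over a data set. The sketch below measures completeness, uniqueness and cross-source consistency on a toy exposure table; the record layout and the `id`/`rating` fields are assumptions for illustration only:

```python
# Illustrative metrics for three of the listed data quality dimensions.

def completeness(records, field):
    """Share of records where the required attribute is present."""
    return sum(1 for r in records if r.get(field) is not None) / len(records)

def uniqueness(records, key):
    """Share of records remaining after removing duplicates on the key."""
    return len({r[key] for r in records}) / len(records)

def consistency(records_a, records_b, key):
    """Share of keys in source A that can be matched in source B."""
    keys_b = {r[key] for r in records_b}
    return sum(1 for r in records_a if r[key] in keys_b) / len(records_a)

loans = [
    {"id": "A1", "rating": "BB"},
    {"id": "A2", "rating": None},
    {"id": "A2", "rating": "B"},   # duplicate id -> uniqueness finding
]
collateral = [{"id": "A1"}, {"id": "A3"}]

print(round(completeness(loans, "rating"), 2))         # 0.67
print(round(uniqueness(loans, "id"), 2))               # 0.67
print(round(consistency(loans, collateral, "id"), 2))  # 0.33
```

Accuracy, timeliness, validity and traceability generally need reference data, timestamps, classification schemes and lineage metadata respectively, so they cannot be reduced to a single ratio as easily; the point of the sketch is only that each dimension should map to a measurable, reportable indicator.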



