MRM Compliance Assessment
MRM Score
Compliance Rating: Needs Improvement
Governance & Policies: 66.7%
Documentation & Data Quality: 66.7%
Model Development & Conceptual Soundness: 66.7%
Implementation & Validation: 66.7%
Monitoring & Change Management: 66.7%
Independent Validation: 66.7%
Benchmarks & KPIs
| Domain | Your Score (%) | Industry Median (%) | Top Quartile (%) |
| --- | --- | --- | --- |
| Governance & Policies | 66.7 | 61.7 | 71.7 |
| Documentation & Data Quality | 66.7 | 61.7 | 71.7 |
| Model Development & Conceptual Soundness | 66.7 | 61.7 | 71.7 |
| Implementation & Validation | 66.7 | 61.7 | 71.7 |
| Monitoring & Change Management | 66.7 | 61.7 | 71.7 |
| Independent Validation | 66.7 | 61.7 | 71.7 |
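The domain weights listed with each section below (20/15/15/20/15/15) suggest the overall figure is a weighted average of the domain scores. The snippet below is a minimal sketch of that aggregation; the rating band cut-offs are illustrative assumptions, not thresholds stated in this report.

```python
# Minimal sketch: weighted aggregation of domain scores into an overall MRM score.
# Domain weights are taken from the questionnaire below; the rating bands are
# illustrative assumptions, not prescribed thresholds.

DOMAIN_WEIGHTS = {
    "Governance & Policies": 0.20,
    "Documentation & Data Quality": 0.15,
    "Model Development & Conceptual Soundness": 0.15,
    "Implementation & Validation": 0.20,
    "Monitoring & Change Management": 0.15,
    "Independent Validation": 0.15,
}

domain_scores = {domain: 66.7 for domain in DOMAIN_WEIGHTS}  # scores from the table above

def overall_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weight-average the domain scores (weights are assumed to sum to 1.0)."""
    return sum(scores[d] * w for d, w in weights.items())

def rating(score: float) -> str:
    """Map a score to a rating band (cut-offs are assumed for illustration)."""
    if score >= 80.0:
        return "Strong"
    if score >= 70.0:
        return "Satisfactory"
    if score >= 50.0:
        return "Needs Improvement"
    return "Weak"

score = overall_score(domain_scores, DOMAIN_WEIGHTS)
print(f"Overall MRM score: {score:.1f}% -> {rating(score)}")  # 66.7% -> Needs Improvement
```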
Governance & Policies
Weight: 20%
Do the Board of Directors and senior management provide effective oversight of the institution's model risk, including the establishment of a formal MRM framework and a defined model risk appetite?
Has the institution established and maintained a comprehensive, centralized model inventory that serves as the primary record for all models in scope? (An illustrative inventory record is sketched after this question set.)
Are roles and responsibilities for model developers, model owners, model users, validators, and control functions clearly defined, documented, and communicated?
Does the institution have a comprehensive set of policies and procedures to formalize the MRM framework and guide the entire model lifecycle?
Does the institution's MRM framework explicitly define what constitutes a 'model' and provide a process for classifying quantitative methods that may not meet the full definition?
Does a Board-level IT Strategy Committee, with an independent director as chairperson, provide oversight of IT-related risks, including those posed by models?
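To make the model inventory question above more concrete, the snippet below is a minimal sketch of a single inventory record; the field names and example values are illustrative assumptions, not a prescribed schema.

```python
# Minimal sketch of a single model inventory record. The field list is an
# illustrative assumption; an institution's actual inventory schema will vary.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ModelInventoryRecord:
    model_id: str                      # unique identifier in the central inventory
    name: str
    owner: str                         # accountable model owner
    developer: str                     # development team or vendor
    tier: int                          # materiality / risk tier (e.g. 1 = highest)
    purpose: str                       # intended use of the model
    status: str                        # e.g. "in development", "approved", "retired"
    last_validation: date | None = None
    next_validation_due: date | None = None
    limitations: list[str] = field(default_factory=list)

# Example entry (hypothetical values for illustration only):
record = ModelInventoryRecord(
    model_id="MDL-0042",
    name="Retail PD scorecard",
    owner="Head of Retail Credit Risk",
    developer="Credit Modelling Team",
    tier=1,
    purpose="IFRS 9 / regulatory capital PD estimation",
    status="approved",
    last_validation=date(2024, 6, 30),
    next_validation_due=date(2025, 6, 30),
    limitations=["Limited low-default history for the SME segment"],
)
print(record.model_id, record.status)
```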
Documentation & Data Quality
Weight: 15%
Does the institution maintain comprehensive and up-to-date documentation for each model, covering its design, theory, assumptions, limitations, and intended use?
Does the institution's data quality management framework include formally documented policies, defined data ownership, quality metrics, and regular reporting to management?
Is the data used to build and calibrate models assessed for representativeness against the portfolio to which the model is applied?
Where external or pooled data is used, does the institution obtain sufficient information to assess its representativeness and quality?
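The representativeness questions above can be checked quantitatively. One common approach is a Population Stability Index (PSI) comparison between the development sample and the current application portfolio; the sketch below uses the widely cited 0.10 / 0.25 rules of thumb as assumed thresholds, and the binning scheme is an illustrative choice.

```python
# Minimal sketch: Population Stability Index (PSI) as one way to assess whether
# development data remain representative of the portfolio the model is applied to.
# The binning scheme and the 0.10 / 0.25 thresholds are illustrative assumptions.
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, n_bins: int = 10) -> float:
    """PSI of `actual` against `expected`, using quantile bins of `expected`."""
    edges = np.quantile(expected, np.linspace(0.0, 1.0, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf          # make the outer bins open-ended
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    exp_pct = np.clip(exp_pct, 1e-6, None)         # avoid log of / division by zero
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

rng = np.random.default_rng(0)
development = rng.normal(loc=0.0, scale=1.0, size=10_000)   # build/calibration sample
portfolio = rng.normal(loc=0.3, scale=1.1, size=10_000)     # current application portfolio

value = psi(development, portfolio)
status = "stable" if value < 0.10 else "monitor" if value < 0.25 else "significant shift"
print(f"PSI = {value:.3f} ({status})")
```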
Model Development & Conceptual Soundness
Weight: 15%
Is the model's design and methodology conceptually sound, fit for its intended purpose, and supported by developmental evidence?
Are model assumptions and limitations clearly identified, documented, and assessed to ensure they are reasonable and well-informed?
Does the model development process include robust testing of data, model construct, and model outcomes to identify and monitor limitations and weaknesses?
Implementation & Validation
Weight: 20%
Is the model implemented in an appropriate and well-controlled IT environment, with rigorous testing to ensure its integrity?
Does the validation function conduct a comprehensive assessment of the model's performance, including back-testing, accuracy of best estimate calibration, and other quantitative analyses? (A simple back-test is sketched after this question set.)
Does the validation process include an evaluation of the quality of the model's inputs and data, including their representativeness?
Does the institution have a formal process for vetting and validating third-party/vendor models to the same standard as in-house developed models?
Does the institution have a documented Business Continuity Plan (BCP) and Disaster Recovery (DR) policy for critical models, including source code escrow agreements for key third-party applications?
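The back-testing question above can be illustrated with a portfolio-level comparison of predicted and realized default counts. The sketch below assumes a normal approximation to the binomial test and a 5% two-sided tolerance; the test choice and the example figures are assumptions for illustration.

```python
# Minimal sketch: portfolio-level back-test of predicted default probabilities
# against realized outcomes using a normal approximation to the binomial test.
# The 1.96 critical value (about 5% two-sided) is an illustrative assumption.
import math

def backtest_default_rate(predicted_pds: list[float], defaults: list[int]) -> dict:
    """Compare realized defaults with the number implied by the predicted PDs."""
    expected = sum(predicted_pds)                       # expected number of defaults
    observed = sum(defaults)                            # realized number of defaults
    variance = sum(p * (1.0 - p) for p in predicted_pds)
    z = (observed - expected) / math.sqrt(variance)
    return {
        "expected_defaults": expected,
        "observed_defaults": observed,
        "z_statistic": z,
        "within_tolerance": abs(z) <= 1.96,             # illustrative 5% two-sided band
    }

# Hypothetical example: 1,000 obligors, each with a predicted PD of 2%, 31 realized defaults.
result = backtest_default_rate([0.02] * 1000, [1] * 31 + [0] * 969)
print(result)
```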
Monitoring & Change Management
Weight: 15%
Is there a formal, documented process for ongoing model performance monitoring to detect deterioration? (A threshold-based monitoring check is sketched after this question set.)
Is there a formal policy and end-to-end process for managing all changes to models, including classification, impact assessment, approval, and notification to regulators?
Is there a documented framework for the governance, validation, monitoring, and reporting of Post-Model Adjustments (PMAs)?
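The ongoing-monitoring question above can be operationalized as periodic metric checks against amber/red thresholds with escalation on breach. The metric names, threshold values, and escalation rule in the sketch below are assumptions for illustration, not standards.

```python
# Minimal sketch: checking periodic monitoring metrics against amber/red thresholds.
# The metric names, threshold values, and escalation rule are illustrative assumptions.

THRESHOLDS = {
    # metric: (amber_limit, red_limit, higher_is_better)
    "gini": (0.45, 0.40, True),                 # discriminatory power
    "psi": (0.10, 0.25, False),                 # population stability
    "calibration_ratio": (1.10, 1.25, False),   # observed / predicted default rate
}

def metric_status(metric: str, value: float) -> str:
    """Return a traffic-light status for one monitoring metric."""
    amber, red, higher_is_better = THRESHOLDS[metric]
    if higher_is_better:
        return "red" if value < red else "amber" if value < amber else "green"
    return "red" if value > red else "amber" if value > amber else "green"

# Hypothetical quarterly monitoring results:
latest = {"gini": 0.42, "psi": 0.18, "calibration_ratio": 1.05}

statuses = {m: metric_status(m, v) for m, v in latest.items()}
print(statuses)   # {'gini': 'amber', 'psi': 'amber', 'calibration_ratio': 'green'}
if "red" in statuses.values() or list(statuses.values()).count("amber") >= 2:
    print("Escalate: performance deterioration detected; trigger review per monitoring policy.")
```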
Independent Validation
Weight: 15%
Is the model validation function structurally independent from model development and use, with sufficient stature and authority to provide credible and effective challenge?
Does the staff conducting validation possess the necessary technical expertise, skills, and business knowledge to effectively challenge the models under review?
Does the institution's Internal Audit function, as the third line of defense, independently review and assess the effectiveness of the overall MRM framework and the validation function itself?