Data as the foundation: Why compliance and reporting will determine survival in 2026
In our series on internal governance, we have so far analysed the structures (Pillar 1), the suitability of leadership (Pillar 2) and the dynamics of board interaction (Pillar 3). Yet even the most competent supervisory board and the most courageous executive board are doomed to fail if the foundation of their decisions is fragile. This foundation consists of data.
In Pillar 4 of our governance matrix, we move away from the psychological level and turn our attention to the ‘hard’ infrastructure: the integrity of information and proactive management through compliance.
‘Excel hell’ vs. the single source of truth
Supervisory practice in recent years has made one thing clear: the era of manual stopgap solutions is over. What used to be considered acceptable under the guise of ‘proportionality’ – such as manually consolidating risk data in complex spreadsheets – is now regarded as a massive failure of governance.
The risk of manual data processing
Every manual intermediate step in a reporting chain is a potential source of error. When data from different silos (credit, back office, treasury) is aggregated by hand, ‘information artefacts’ arise. Ultimately, the board receives a report that, whilst visually professional, is based on data that is already out of date, or distorted by transmission errors, by the time it is presented.
BCBS 239 as a global benchmark
Although the Basel Committee’s principles for the aggregation of risk data (BCBS 239) were primarily designed for systemically important institutions, their spirit has long since found its way into general supervisory practice. The requirement for a single source of truth – a uniform, valid data source – is now a standard expectation of every institution. A bank that cannot consolidate its data automatically and consistently is considered unmanageable in the event of a crisis.
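What ‘automatic and consistent’ consolidation means in practice can be illustrated with a toy routine – a minimal sketch, not any specific product, with silo and deal names invented for the example. Unlike a manual spreadsheet merge, which tends to overwrite figures silently, it surfaces conflicts between silos instead of hiding them:

```python
def consolidate(silos):
    """Merge per-silo risk records into one consistent view.

    Conflicting figures for the same deal are reported, not silently
    overwritten -- the failure mode of a manual spreadsheet merge.
    """
    merged, conflicts = {}, []
    for silo_name, records in silos.items():
        for deal_id, value in records.items():
            if deal_id in merged and merged[deal_id] != value:
                conflicts.append((deal_id, merged[deal_id], value))
            else:
                merged[deal_id] = value
    return merged, conflicts

# Illustrative data: the treasury silo carries a different figure for D2.
silos = {
    "credit":   {"D1": 1.0e6, "D2": 2.0e6},
    "treasury": {"D2": 2.1e6, "D3": 0.5e6},
}
merged, conflicts = consolidate(silos)
print(conflicts)  # [('D2', 2000000.0, 2100000.0)]
```

The point is not the ten lines of code but the design choice: a single consolidated view with explicit conflict handling is auditable; a chain of copy-and-paste steps is not.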
Reporting: From looking back to real-time management
A classic mistake in banking governance is a reporting system that looks exclusively in the rear-view mirror. Quarterly reports presented weeks after the reporting date are of no use for agile management.
The need for ad-hoc capability
Supervision in 2026 will not only check whether reports are available, but also how quickly the system can react to stress. Governance excellence today means that an institution must be able to aggregate valid risk positions ad hoc within the shortest possible time – whether in the event of market disruptions or geopolitical shocks.
Modern reporting is characterised by three factors:
- Granularity: The ability to drill down from aggregated metrics to individual transactions.
- Validity: A system of automated plausibility checks that detects outliers before they reach the report.
- Relevance: Moving away from the ‘data graveyard mentality’ towards decision-oriented dashboards that highlight the truly critical thresholds for the board.
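The ‘validity’ factor in particular lends itself to automation. A minimal sketch in Python of an automated plausibility check, using a robust median-based outlier test (the deal IDs and figures are invented; a production system would of course apply far richer rules):

```python
from statistics import median

def plausibility_check(exposures, threshold=3.5):
    """Flag outlier exposure values before they reach the board report.

    Uses a median-absolute-deviation (MAD) score, which is more resistant
    to masking by the outlier itself than a mean/stdev z-score.
    """
    values = [v for _, v in exposures]
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        return []  # all values identical: nothing to flag
    return [(deal_id, v) for deal_id, v in exposures
            if 0.6745 * abs(v - med) / mad > threshold]

# Illustrative book: D6 carries a classic fat-finger error (extra zero).
book = [("D1", 1.02e6), ("D2", 0.98e6), ("D3", 1.05e6),
        ("D4", 0.97e6), ("D5", 1.00e6), ("D6", 9.9e6)]
print(plausibility_check(book))  # [('D6', 9900000.0)]
```

Checks of this kind sit between the data source and the report, so that an implausible figure triggers a query rather than a board decision.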
Compliance: From a hindrance to a strategic guide
Just as with data quality, our understanding of compliance has undergone a radical transformation. In a functioning Pillar 4 framework, compliance is no longer the department that pulls out the ‘stop’ stamp at the end of a process, but an integral part of strategic planning.
Proactive compliance culture
Governance failures in the area of compliance often result from a purely reactive approach. Organisations wait for new regulatory requirements and then attempt to squeeze them into existing processes with minimal effort. The result is a ‘patchwork solution’ that increases complexity and reduces efficiency.
A modern compliance function acts proactively:
- Early warning system: Identifying regulatory trends (e.g. ESG reporting, crypto regulation) before they are enshrined in law.
- NPP integration: Active participation in the new product process (NPP) to ensure that innovations are built on a stable regulatory foundation from the outset.
- Cultural ambassador: Compliance is not a technocratic set of rules, but a lived integrity that is embedded within the workforce through regular training and an open whistleblowing system.
Data integrity testing: the auditor’s perspective
For the statutory auditor, Pillar 4 is the ‘battleground of evidence’. Whilst Pillar 3 relies heavily on interviews and analysis of records, data quality testing is based on rigorous IT audits (for example, in accordance with IDW PS 880).
The auditor asks critical questions regarding data lineage:
- Where does the data point originally come from?
- What transformations has it undergone?
- Who is authorised to change this data?
An institution that cannot provide a complete trail of its ‘data lineage’ risks findings in the audit report that go far beyond IT deficiencies. These are assessed as shortcomings in the organisational structure and operational processes – a direct challenge to the assessment of management’s competence.
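The three lineage questions can be answered mechanically if every figure carries its own audit trail. A minimal illustration – the schema and field names are assumptions for this sketch, not taken from any audit standard or product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataPoint:
    """A risk figure that carries its own lineage trail (illustrative schema)."""
    value: float
    source: str                      # where does the data point come from?
    lineage: list = field(default_factory=list)

    def transform(self, new_value, step, user):
        # Record what transformation was applied, by whom, and when --
        # the auditor's second and third questions.
        self.lineage.append({
            "step": step,
            "user": user,
            "old": self.value,
            "new": new_value,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.value = new_value

ead = DataPoint(value=250_000.0, source="core_banking.loans")
ead.transform(245_000.0, step="apply_collateral_haircut", user="risk_engine")
print(ead.source, [entry["step"] for entry in ead.lineage])
```

With a trail like this attached to each figure, the auditor’s questions become queries rather than interviews.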
Economic relevance: Why poor data costs capital
One might assume that data quality is purely an IT issue. It is not: it is a hard business issue. As part of the SREP process, the regulator assesses the risk arising from internal governance.
SREP surcharges and P2R
Deficiencies in reporting and data quality almost inevitably lead to a lower SREP score. This results in a higher Pillar 2 Requirement (P2R). In concrete terms, this means that the bank must hold more capital in order to be permitted to conduct the same business as a competitor with excellent data governance.
Furthermore, poor compliance leads to an increased risk of fines and reputational damage, which in turn drives up refinancing costs in the market.
Governance excellence in Pillar 4 is therefore a direct driver of returns. An automated data highway often pays for itself within a few years through the savings on capital buffer surcharges alone.
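The capital effect can be made concrete with a back-of-the-envelope calculation. Assuming the 8% Pillar 1 minimum and two hypothetical P2R add-ons (all figures are illustrative; combined buffer requirements are ignored for brevity):

```python
def required_capital(rwa, p2r):
    """Own-funds requirement: 8% Pillar 1 minimum plus the institution-
    specific Pillar 2 Requirement (buffers omitted for simplicity)."""
    return rwa * (0.08 + p2r)

rwa = 5_000_000_000  # EUR 5bn risk-weighted assets (illustrative)

weak   = required_capital(rwa, p2r=0.030)  # weak data governance: 3.0% P2R
strong = required_capital(rwa, p2r=0.020)  # strong data governance: 2.0% P2R

print(f"extra capital tied up: EUR {weak - strong:,.0f}")
# prints "extra capital tied up: EUR 50,000,000"
```

On these assumed numbers, a single percentage point of P2R ties up EUR 50 million of capital that a better-governed competitor can deploy in the business – which is why an automated data infrastructure can pay for itself.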
Conclusion: The decision to manage
Pillar 4 marks the end of excuses. Anyone who claims to have their organisation under control but is unable to answer questions about the data sources for their risk reports is not practising governance, but merely managing hope.
The path to the ‘data highway’ is arduous and requires investment in modern IT architectures and highly qualified compliance teams. Yet the alternative is ‘governance blindness’. In a world where regulatory requirements (MaRisk, DORA, ESG) are rising exponentially, a resilient data and compliance structure is the only safeguard against losing control.
Only when the figures are accurate, the reports are sound and compliance is managed proactively can the ‘living dialogue’ under Pillar 3 actually lead to the right decisions.
Announcement
This in-depth exploration of the world of data and rules lays the groundwork for our final analysis.
In the next and final instalment of our series, we will focus on Pillar 5: Strategic Resilience & ESG – how to make your governance future-proof in the face of the major transformations of our time.