Your chatbot
answers.
Who governs it?
The EU AI Act is in force. Certain AI systems used in credit, insurance, and employment contexts may qualify as high-risk, triggering significant compliance obligations. Boards and senior management face governance and supervisory exposure if AI use is not properly structured. I help leadership teams build defensible AI governance — before enforcement begins.
Governance does not keep pace
with AI-driven decisions.
The figures below are drawn from the EU AI Act statutory text, DORA, and published research. They are provided for orientation, not as legal benchmarks applicable to any specific organisation.
All regulatory figures refer to statutory maximum ranges. Actual enforcement depends on the national supervisory authority, the facts of the case, and mitigating factors. Survey data (Kiteworks) reflects self-reported figures from a private study population and should be read as indicative, not as authoritative regulatory data.
What applies now.
What comes next.
Banks. Insurers.
Asset managers.
Governance exposure is real.
Certain AI systems deployed in financial services may qualify as high-risk under Annex III of the EU AI Act, and financial institutions are simultaneously subject to DORA. The interaction of both regimes requires coordinated governance — not sequential compliance.
Three ways toward
a defensible position.
- Identifies where governance accountability actually sits
- Assesses whether decisions can be assigned, traced, and challenged
- Maps structural exposure before board-level escalation
- Produces a written report for executive and board use
- Does not replace legal counsel — complements it
- EU AI Act scope and risk classification analysis (Annex III)
- DORA applicability and ICT risk integration check
- Audit trail and traceability assessment
- Human oversight mechanisms review
- Prioritised remediation plan with deadline mapping
- Tailored to supervisory boards, C-suite, and owners
- Based on the Control–Liability Paradox paper (2026)
- English and German
- In-house or conference format
No frameworks. No generic audits.
Focused, contextual analysis.
For leadership that does not want
to discover governance gaps
after the fact.
My work begins where accountability becomes personal.
Not at the system level. At the level of the person who approved it,
who signed off on it, who was responsible for overseeing it.
I do not work with organisations looking for a compliance checkbox.
I work with boards and executive teams who need to genuinely understand
whether their AI governance is structurally sound — and whether
the decisions attached to it can be defended if challenged.
Built the frameworks. Written the research. Defending the positions.
“AI governance began to matter when accountability crossed the threshold from systems to people. This threshold defines where my work begins.”
— Patrick Upmann
Assess your governance
position before
enforcement does.
The EU AI Act is in force. The high-risk deadline is 2 August 2026 — the operative statutory date. Enforcement follows. The question is whether your governance structure is stronger than the exposure attached to it.
A Board AI Clearance review typically involves an initial conversation followed by a structured assessment and written report. It does not constitute legal advice and does not replace qualified legal counsel in your jurisdiction.