ICO ADM Code: What UK SMEs Must Do Now
In March 2026 the ICO published an AI and biometrics strategy update confirming it is now developing a statutory code of practice on AI and Automated Decision-Making (ADM), with recruitment AI as the named first target. A public consultation on draft ADM guidance is open until 29 May 2026. UK SMEs that use AI in any decision affecting people should treat this as a procurement and governance moment, not just a compliance update.
Here is what the ICO has actually committed to, what it expects from organisations using ADM today, and what to do over the next twelve weeks.
Why this is happening now
The trigger is the Data (Use and Access) Act 2025 (DUAA). The Act loosens the previous near-blanket UK GDPR ban on solely automated decisions with legal or similarly significant effects, and it requires the ICO to produce a statutory code of practice that sets the safeguards in detail. The Government is currently finalising the secondary legislation that will commission the code formally, and the ICO has signalled it intends to lead with recruitment because that is where complaints, evidence, and harm patterns are clearest.
The ICO has not been waiting for the legislation. Between March 2025 and January 2026 it engaged voluntarily with over 30 employers using ADM in hiring, and it has now published its findings as the Recruitment Rewired report. The draft guidance currently in consultation will inform the eventual code. If you wait for the code to land, you have already missed the chance to shape it and to align procurement before competitors do.
What "significant decisions" actually means
DUAA introduces the concept of significant decisions: automated decisions that have legal or similarly significant effects on individuals. Government guidance on the Act gives examples, most of which map directly onto common SME use cases:
- Automatically shortlisting or rejecting job applicants
- Approving or denying credit, insurance, or essential services
- Determining eligibility for welfare or benefits
- Dynamic pricing that materially affects what someone pays
- Performance scoring that affects pay, retention, or progression
If your AI is making one of those calls without meaningful human involvement, the new ADM regime applies. Meaningful human involvement is not a rubber stamp. Previous ICO guidance has been clear that a reviewer must be able to change the outcome and must engage with the actual evidence, not the system's recommendation alone.
What the ICO expects from employers using ADM
The draft guidance and the Recruitment Rewired findings put four obligations on the table for organisations using ADM in hiring. They generalise to any high-stakes decision context.
Proactive bias testing. The ICO expects regular testing for biased outputs across protected characteristics, with evidence of mitigation when bias is found. Monthly reviews are flagged as a reasonable cadence for active systems. This is not a one-off pre-deployment check.
Vendor-side bias evidence. When procuring ADM tools, you must ask the developer about their own bias testing methodology and results, and you must be able to evidence that question and answer. "We trusted the vendor" is not a defence.
Candidate transparency. People subject to an automated decision must be told it is automated, told how it works at an appropriate level, and given a clear route to challenge it and request a human review. The right to contest is a procedural obligation, not a customer-service nicety.
Accountability documentation. A named owner, a documented decision rationale, and an audit trail covering the model, the data, and the human review path. The ICO has been clear that fragmented ownership across HR, IT, and the vendor is the most common failure pattern.
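To make the accountability point concrete, a single register entry can hold the named owner, rationale, and audit-trail pointers in one structure. This is an illustrative sketch, not an ICO-mandated schema; every field name and the example tool ("ExampleATS") are hypothetical:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ADMRegisterEntry:
    """One entry in an internal ADM register.
    Field names are illustrative, not an ICO-mandated schema."""
    system_name: str                  # the tool or embedded feature
    vendor: str
    decision: str                     # what the system decides or recommends
    significant: bool                 # result of the DUAA significance test
    owner: str                        # single named accountable person
    rationale: str                    # documented reason for using ADM here
    human_reviewer_role: str          # who can change the outcome
    contest_route: str                # how individuals challenge the decision
    last_bias_test: Optional[date] = None
    vendor_bias_evidence: bool = False  # have we seen the vendor's testing?

entry = ADMRegisterEntry(
    system_name="ExampleATS shortlisting",  # hypothetical tool
    vendor="ExampleATS Ltd",
    decision="Ranks applicants and auto-rejects below a score threshold",
    significant=True,
    owner="Head of People",
    rationale="Volume screening for roles with over 200 applications",
    human_reviewer_role="Recruiting manager reviews all auto-rejects",
    contest_route="careers inbox; human re-review within 10 working days",
    last_bias_test=date(2026, 1, 15),
    vendor_bias_evidence=True,
)
```

A flat record like this avoids the fragmented-ownership failure pattern: one row, one owner, one place an auditor looks first.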
Six things UK SMEs should do in the next twelve weeks
The consultation closes 29 May 2026 and the code itself will follow the secondary legislation. The window to get ahead of it is now.
1. Inventory every automated decision in your business. Include third-party tools that screen, score, or rank people: applicant tracking systems with AI shortlisting, credit decisioning APIs, dynamic pricing engines, performance management dashboards. ADM embedded inside third-party products is the most commonly missed category.
2. Classify each decision as significant or not. Apply the DUAA test honestly. Anything affecting employment, money, services, or reputation is in scope. Borderline cases should be treated as in scope until you can prove otherwise.
3. Audit human review. For each significant decision, document who reviews, what they see, what they can change, and how often they actually overturn the model. If reviewers approve more than 95% of model outputs, the human involvement is probably not meaningful.
4. Run a bias check now. Even an informal review of recent decisions against protected characteristics is a step forward. Document the methodology and the result. If you do not have the data to run the check, that is itself a finding.
5. Update vendor contracts. Add bias testing evidence requirements, model change notifications, audit access, and explainability commitments. The ICO will expect to see these in your supply chain when the code lands.
6. Respond to the consultation. SMEs are under-represented in ICO consultations and the code will reflect the voices that show up. The consultation form is on the ICO site and closes 29 May 2026.
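Steps 3 and 4 above can be sketched as a single pass over a decision log. The script below assumes a hypothetical log format with `model_outcome`, `final_outcome`, and `group` fields; the 95% approval threshold is the warning sign described above, not a statutory test, and your own fields and thresholds will differ:

```python
# Sketch for steps 3 and 4: measure how often reviewers actually overturn
# the model, and compare selection rates across groups.
# Field names are hypothetical; adapt them to your own decision log.

def review_stats(decisions):
    """Return (overturn_rate, looks_meaningful) for a list of decision dicts."""
    total = len(decisions)
    overturned = sum(
        1 for d in decisions if d["final_outcome"] != d["model_outcome"]
    )
    overturn_rate = overturned / total if total else 0.0
    # If reviewers approve more than 95% of model outputs, treat the
    # human involvement as probably not meaningful (step 3's heuristic).
    looks_meaningful = overturn_rate >= 0.05
    return overturn_rate, looks_meaningful

def selection_rates(decisions, group_key="group", positive="shortlisted"):
    """Share of each group receiving the positive outcome (step 4's check)."""
    counts = {}
    for d in decisions:
        g = d.get(group_key, "unknown")
        n, pos = counts.get(g, (0, 0))
        counts[g] = (n + 1, pos + (d["final_outcome"] == positive))
    return {g: pos / n for g, (n, pos) in counts.items()}

# Toy log of four decisions for illustration only.
log = [
    {"model_outcome": "rejected", "final_outcome": "rejected", "group": "A"},
    {"model_outcome": "rejected", "final_outcome": "shortlisted", "group": "B"},
    {"model_outcome": "shortlisted", "final_outcome": "shortlisted", "group": "A"},
    {"model_outcome": "rejected", "final_outcome": "rejected", "group": "B"},
]
rate, meaningful = review_stats(log)
print(f"overturn rate: {rate:.0%}, meaningful review: {meaningful}")
print(selection_rates(log))
```

Large gaps between groups' selection rates are a finding to document and investigate, and a missing `group` field is itself a finding, exactly as step 4 notes.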
Where this fits with everything else
The ADM code is one strand in a thickening UK governance fabric. It sits alongside the UK's five AI regulatory principles, for which the ICO is the lead implementer where personal data is involved, and complements the management-system view of ISO 42001. If you are also caught by the EU AI Act, note that its high-risk category covers most recruitment and HR AI, so the documentation overlap is substantial. Build the inventory and bias testing once and they can satisfy several regimes.
The voluntary phase of UK AI regulation is closing. The ICO is moving from guidance to statutory code, and procurement teams in regulated sectors will start asking for ADM evidence as soon as the code is published. SMEs that get the inventory, bias testing, and contestability right now will not need to scramble later.
How we can help you prepare
If you would like a structured review of your ADM exposure, vendor evidence, and human review processes against the draft ICO guidance, our AI readiness assessment gives you a clear gap analysis and prioritised next steps. For deeper requirements work on AI in hiring or other decision systems, our AI business analysis service handles the documentation and acceptance criteria the ICO will expect to see. Book a free consultation to talk through your obligations.