In this article, we take a look at the draft adequacy decision for the UK which has been issued by the European Commission (Commission) and the changes which the Data (Use and Access) Act 2025 (DUA Act) introduces to the rules around automated decision-making, as well as the potential impact this may have on the UK’s adequacy status in the future.
UK Draft Adequacy Decision
On 22 July 2025, the Commission published a draft adequacy decision for the UK under the General Data Protection Regulation (Regulation (EU) 2016/679) (EU GDPR) and started the process to adopt the new adequacy decision to provide for the continued free flow of personal data between the European Economic Area (EEA) and the UK.
This is necessary because the EU’s original post-Brexit adequacy decision for the UK lasted only until 27 June 2025. The Commission adopted a technical extension for a limited period of six months to allow it to assess the DUA Act and the adequacy of the UK’s reformed data protection legal framework.
Data (Use and Access) Act 2025
The DUA Act received Royal Assent on 19 June 2025 and introduces a number of reforms to UK data protection law through amendments to the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA 2018) and the Privacy and Electronic Communications (EC Directive) Regulations 2003 (for more information on the key changes introduced by the DUA Act please see our previous article here). However, these amendments have ended up being more subtle than those envisaged when the previous government attempted to introduce data protection reform with the Data Protection and Digital Information Bill (DPDI). Whilst the DPDI would have introduced more radical changes to the UK data protection regime, this came with the clear risk that the Commission would not accept such a deviation from EU GDPR standards and could have cost the UK its adequacy decision, resulting in significant compliance challenges for businesses that rely on the UK adequacy decision for the free flow of personal data between the UK and the EEA.
The Commission’s draft decision indicates that the DUA Act has successfully managed to strike the delicate balance between introducing reform and maintaining adequacy with the EU. Prior to the decision, there were some concerns that the introduction of new rules on a series of issues (including automated decision-making) could jeopardise the UK’s adequacy status. However, the Commission’s assessment of the changes is that they continue to provide data protection safeguards that are essentially equivalent to those provided under the EU GDPR.
Before the draft adequacy decision can be adopted, it will go to the European Data Protection Board (EDPB) for its opinion; although the EDPB’s opinion is non-binding, it is highly influential. The draft decision must then obtain approval from a committee of representatives of the EU Member States, and the European Parliament also retains a right to scrutinise the draft decision. If adopted, the draft decision will apply for a period of six years, expiring on 27 December 2031, with the potential for a further four-year extension subject to periodic reviews of the adequacy of the level of protection ensured in the UK.
Automated decision-making (general)
One of the most notable changes introduced by the DUA Act is the relaxation of rules around automated decision-making. Under the existing regime, it is not possible to make a decision having legal or similarly significant effects if that decision is based solely on automated processing, including profiling. The only exceptions to this are where:
- the data subject has provided their explicit consent;
- the processing is necessary for the purposes of a contract between the data subject and the organisation; or
- the processing is permitted by UK law (subject to appropriate safeguards).
The DUA Act reforms the existing framework and broadens the scope for automated decision-making. The previous restrictions have been modified such that organisations will be able to rely on any of the relevant lawful bases for processing if they want to make significant automated decisions about individuals, as long as the data involved is not special category personal data and the organisation has the appropriate safeguards in place.
Automated decision-making (special category data)
The DUA Act provides that automated decisions cannot be based on special category data unless:
- the data subject has provided their explicit consent;
- such processing is necessary for the purposes of a contract; or
- the processing is authorised by law.
In addition, where an organisation wants to rely on the second or third exception above, the processing must also be necessary for reasons of substantial public interest and there must be suitable measures in place to safeguard the data subject’s rights, freedoms and legitimate interests.
Appropriate safeguards for automated decision-making based on personal data (not just special category personal data) include:
- providing the data subject with information about the decisions taken in relation to them;
- enabling the data subject to make representations about such decisions;
- enabling the data subject to obtain human intervention from the controller in relation to such decisions; and
- enabling the data subject to contest such decisions.
The DUA Act clarifies that a decision is based solely on automated processing if there is no meaningful human involvement in the taking of the decision, and that a decision is significant if it produces a legal or similarly significant effect for the individual. When deciding whether there is meaningful human involvement in a decision, an organisation must consider the extent to which the decision is based on profiling.
Additionally, the DUA Act grants the UK government authority to produce secondary legislation to expand on the concepts of meaningful human involvement and ‘similarly significant effects’ as well as to introduce new safeguards, impose requirements supplementing existing safeguards and define measures that do not satisfy the safeguards.
It is not yet confirmed when these changes will take effect; however, they are likely to form part of stage three of commencement of the DUA Act, which is expected to be approximately six months after 19 June 2025 (the date of Royal Assent).
Effect of the changes on automated decision-making and possible future direction
The simplification of the requirements around automated decision-making means such decision-making can now be carried out in wider circumstances such as for performance of a contract or legitimate interests (subject to the restrictions mentioned above). These changes are designed to provide more flexibility around the use of AI tools for organisations. This is in line with the general UK approach to AI regulation which seeks to adopt a principles-based approach to facilitate innovation.
The Commission’s assessment of the changes is that although they modify the framework for automated decision-making in the UK, such decision-making ultimately remains subject to key safeguards, including the right to obtain human intervention.
The Commission further noted that the updated rules on automated decision-making in the UK GDPR are unlikely to have any significant effect on personal data transferred from the EEA to the UK, as such personal data is likely to have been collected in the EEA and any decisions based on automated processing will most likely be taken by an EEA-based controller that has a direct relationship with the relevant data subject and will therefore be subject to the EU GDPR.
Whilst the new UK approach to automated decision-making did not result in a revocation or refusal to renew the UK adequacy decision from the Commission, it is notable that in its assessment of the changes the Commission focussed on the fact that the changes are unlikely to have any significant effect on personal data transferred from the EEA to the UK, rather than the nature of the changes themselves.
This could be an indication of a diverging approach to automated decision-making between the EU and UK, particularly in light of the first ruling on automated decision-making by the Court of Justice of the European Union (CJEU). That case concerned SCHUFA, a German credit reference agency, which carried out automated processing of data to generate probability values for the repayment of loans, and then passed on the scores to third parties (such as banks) who would rely on them to make decisions (such as whether or not to grant a loan). The CJEU held that this amounted to automated decision-making and found that it had a legal or similarly significant effect for the individual as the probability value played a determining role in whether credit was granted.
The CJEU referred the matter back to the German courts without opining on whether one of the exemptions to the prohibition on automated decision-making applied in this instance (explicit consent, necessary for performance of a contract with the data subject, or permitted under EU or national law). However, the CJEU’s analysis of what constitutes relevant automated decision-making under the EU GDPR is interesting. The credit reference agency producing the credit score did not make the ultimate decision about the individual. Rather, it was the third party (the bank) supplied with the credit score which made that decision; yet the CJEU found that the production of the credit score was enough to fall within the scope of Article 22 of the EU GDPR, i.e. that it was a decision based solely on automated processing which produces legal or similarly significant effects on the data subject.
In contrast, under the UK data protection framework post-DUA Act, if the data used by SCHUFA did not contain any special category personal data, there would theoretically be no need for SCHUFA to rely on the exemptions of consent, performance of a contract or permitted under relevant law. Instead, a similar UK-based credit reference agency that is only subject to the UK data protection regime could potentially rely on legitimate interest for such processing, provided that it has appropriate safeguards in place.
This illustrates a potential divergence in the respective approaches of the EU and the UK to automated decision-making. Whilst it does not seem to have negatively impacted the UK’s adequacy status for now, further divergence and developments may do so.
This is a crucial time in the data protection space as legislation seeks to maintain pace with technological developments. Businesses face numerous opportunities in this area but must also keep up with new and evolving legislative and regulatory requirements.
If you have any questions arising from this article or would like any assistance in ensuring that your data protection practices are up to date, please contact FSP’s commercial & technology team at [email protected]

