Any tech company that handles personal data must keep abreast of privacy law. In June 2025 the UK's Data (Use and Access) Act received Royal Assent, creating a statutory basis for Digital Verification Services (DVS). The Act also carries forward reforms first proposed in the Data Protection and Digital Information Bill, amending the UK GDPR to reduce friction for research and innovation. For founders in 2026, the message is clear: privacy rules are evolving, and they are converging with digital identity frameworks.
Digital verification services and trust marks
The Data (Use and Access) Act aims to ensure that people and businesses can access trusted digital identity services across the economy. Part 2 of the Act creates a legislative foundation for DVS, requiring the government to maintain a statutory register of digital verification service providers, consult on a trust framework and issue an official UK digital identity trust mark. Public authorities will be able to share information with registered providers to streamline verification. The goal is to make processes such as renting a flat, proving age or opening a bank account faster and more secure. For startups, integrating DVS can reduce onboarding time and fraud, but it introduces dependencies on government-approved providers and may restrict the ability to tailor identity checks.
There is also a timing challenge: although the Act has passed, most measures require a commencement order to come into force. Early adopters will need to track secondary legislation and be ready to adjust integration plans. When the regime is live, failure to use certified providers could raise questions about due diligence, especially in regulated sectors such as finance.
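As a sketch of what that due diligence might look like once the register is live, the snippet below gates onboarding checks on a provider's presence in a hypothetical register snapshot. The provider names, register contents and the `can_use_for_onboarding` policy are all illustrative assumptions, not real registered services or statutory requirements:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DvsProvider:
    name: str
    registered: bool    # appears on the statutory DVS register
    trust_marked: bool  # holds the official UK trust mark

# Illustrative register snapshot, e.g. refreshed periodically from the
# published list once the regime is in force.
REGISTER = {
    "ExampleVerify": DvsProvider("ExampleVerify", registered=True, trust_marked=True),
    "QuickID": DvsProvider("QuickID", registered=False, trust_marked=False),
}

def can_use_for_onboarding(provider_name: str, regulated_sector: bool) -> bool:
    """Example policy: in regulated sectors require both registration and
    the trust mark; elsewhere, registration alone may suffice."""
    provider = REGISTER.get(provider_name)
    if provider is None or not provider.registered:
        return False
    return provider.trust_marked if regulated_sector else True
```

Encoding the policy in one place makes it easy to tighten the rule (for example, requiring the trust mark everywhere) when secondary legislation clarifies expectations.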
The Data Protection and Digital Information Bill
The Data Protection and Digital Information Bill - sometimes dubbed 'UK GDPR 2.0' - fell when Parliament was dissolved ahead of the 2024 general election, but most of its reforms have been carried into the Data (Use and Access) Act. These changes seek to reduce compliance burdens by clarifying legitimate interests, easing research data reuse and simplifying cookie consent. They also introduce new concepts such as 'smart data' schemes to facilitate data portability across sectors. Critics worry that the reforms could lower standards; supporters argue they will make the UK more attractive for data-driven innovation. Either way, startups should prepare for changes to automated decision-making rules, the role of data protection officers and cross-border data transfer requirements. The government's decision to hold off a bespoke AI Act until mid-2026 means these reforms will interact with EU AI obligations and ISO/IEC 42001 guidance.
Balancing personalisation and privacy
Data-driven products thrive on personalisation, but collecting and processing personal data carries legal and reputational risk. Under the UK GDPR, every processing activity needs a lawful basis; under the reformed rules, some direct marketing may be able to rely on legitimate interests rather than consent. Transparency remains paramount: privacy notices should explain what data is collected, for what purposes and with whom it will be shared. Fintechs building identity-based services must be clear about how they use DVS, while AI startups must document training data provenance and model outputs. Users are also increasingly privacy-aware: 59% of Britons report concerns about dependence on AI and its implications.
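One practical way to keep notices honest is to generate them from the same records-of-processing data the team already maintains. The sketch below assumes a minimal, illustrative schema - `ProcessingActivity` and `notice_line` are invented names, not a statutory format:

```python
from dataclasses import dataclass, field

# The six UK GDPR Article 6 lawful bases.
LAWFUL_BASES = {"consent", "contract", "legal_obligation",
                "vital_interests", "public_task", "legitimate_interests"}

@dataclass
class ProcessingActivity:
    purpose: str
    data_categories: list
    lawful_basis: str
    recipients: list = field(default_factory=list)

    def __post_init__(self):
        # Reject records that don't name a recognised lawful basis.
        if self.lawful_basis not in LAWFUL_BASES:
            raise ValueError(f"unknown lawful basis: {self.lawful_basis}")

def notice_line(activity: ProcessingActivity) -> str:
    """Render one privacy-notice sentence from the record: what is
    collected, for what purpose, and with whom it is shared."""
    shared = ", ".join(activity.recipients) or "no third parties"
    return (f"We process {', '.join(activity.data_categories)} "
            f"for {activity.purpose} (basis: {activity.lawful_basis}); "
            f"shared with: {shared}.")
```

Keeping one source of truth means the notice cannot silently drift from what the product actually does.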
Cross-border considerations
For companies operating across the EU, compliance with both UK and EU rules is essential. The European Commission's adequacy decisions for the UK currently allow free data flows, but regulatory divergence could put them at risk. The EU's Digital Operational Resilience Act (DORA) and the upcoming Financial Data Access (FiDA) framework will introduce additional data-sharing obligations and resilience requirements for financial firms. Meanwhile, the UK is aligning with EU rules on payments and open finance, which means similar standards may apply in both jurisdictions. Startups should build privacy and data governance frameworks that can adapt to multiple regimes. Documenting data flows, conducting data protection impact assessments and adopting standards such as ISO/IEC 27001 and ISO/IEC 42001 can help demonstrate compliance.
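Documenting data flows can start very simply. The sketch below records each flow and flags destinations outside a deliberately simplified adequacy list, so flagged transfers can be reviewed for safeguards such as standard contractual clauses; the jurisdiction codes and the adequacy set are illustrative assumptions, not the real lists:

```python
# Simplified stand-in for the jurisdictions the UK treats as adequate;
# the real list is longer and changes over time.
ADEQUATE_FOR_UK = {"UK", "EU", "EEA"}

def flag_transfers(flows):
    """Return flows whose destination falls outside the adequacy list
    and therefore needs a transfer safeguard reviewing."""
    return [f for f in flows if f["to"] not in ADEQUATE_FOR_UK]

# Illustrative data-flow register entries.
flows = [
    {"name": "CRM sync",        "from": "UK", "to": "EU"},
    {"name": "Support tooling", "from": "UK", "to": "US"},
]
```

Even a flat register like this gives a data protection impact assessment something concrete to review, and it scales naturally into a fuller record of processing.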
Ethical considerations and user trust
Beyond legal requirements, there is an ethical dimension to data governance. Using digital identity services can exclude those without access to smartphones or stable internet, so founders should provide alternative onboarding options. Automated decision-making can entrench bias if models are not properly governed. Responsible AI principles require human oversight, transparency and fairness. Building privacy and ethics into product design from the start will not only reduce regulatory risk but also build brand trust in a marketplace where consumers are increasingly sceptical.
Closing thoughts
The UK's evolving data and digital identity laws herald both opportunity and complexity. Integrating certified digital verification services can streamline user journeys and reduce fraud, while reforms to data protection could ease innovation. However, these benefits come with new obligations to keep data secure, respect user rights and align with EU and international standards. For tech founders in 2026, treating privacy and data governance as core product features rather than afterthoughts is the surest way to build a sustainable business.