The UK government is preparing to encourage Apple and Google to introduce nudity-blocking systems at the operating-system level, preventing explicit images from appearing on smartphones unless users verify that they are adults. The move is part of a broader child-protection strategy, the Financial Times reports.

Ministers want nudity-detection algorithms built directly into iOS and Android so that devices would block the display, capture or sharing of images containing genitalia unless age verification has been completed. Officials said the UK is not introducing a legal requirement at this stage, but the approach could become mandatory if voluntary cooperation fails to meet government expectations.

How the proposed system would work

People familiar with the discussions told the Financial Times that, under the proposal, phones would prevent any nudity from appearing on screen by default. Adults would need to verify their age using methods such as biometric checks or official identification to disable the restriction. The Home Office has initially focused on mobile devices, though the model could later be expanded to laptops and desktop computers.

The report also states that child sex offenders could be required to keep such blockers permanently enabled. Officials point to existing moderation tools as proof of concept, noting that Microsoft already scans Microsoft Teams for inappropriate content and arguing that large technology firms can therefore deploy similar safeguards at scale.

Legal context and international parallels

The policy push follows the introduction of the UK’s Online Safety Act, which obliges pornographic websites and certain social media platforms to verify users’ ages. While the law targets services rather than devices, ministers acknowledge that age checks can be bypassed using tools such as VPNs, prompting interest in device-level controls as a complementary measure.

Apple and Google already provide optional parental-control features, but both companies have previously raised privacy and data-protection concerns when governments sought broader age-verification mandates. In the United States, they warned of privacy risks when Texas enacted app-store age-verification rules, which were subsequently challenged in court by industry groups.

Similar debates are unfolding elsewhere. Australia has moved to restrict social media access for users under 16, prompting legal challenges, while in the UK, Imgur blocked access for British users in September amid an investigation into its age-verification practices. Apple has also faced backlash in the past, notably over its 2021 plan to scan iPhone photos for child sexual abuse material (CSAM), which it abandoned following criticism over potential privacy violations.

For now, UK officials say they will encourage voluntary adoption rather than impose immediate legal obligations. However, the Financial Times reports that mandatory requirements for devices sold in the UK were considered and ruled out only “for now”, leaving open the possibility of stricter regulation if the government’s objectives are not achieved.
