
MARKET PULSE UK

Decoding Markets for Everyone


UK to require 48-hour takedown for intimate images

Tech platforms operating in the UK will be legally obliged to remove non‑consensual intimate images within 48 hours of being flagged, via an amendment to the Crime and Policing Bill announced on 19 February. Companies that fail to act face penalties of up to 10% of qualifying worldwide revenue and, in extremis, service blocking in the UK, according to the Department for Science, Innovation and Technology. (gov.uk)

Ministers are pitching a “report once, protected everywhere” model. The goal is for a single report to trigger removals across multiple platforms and to prevent future re‑uploads automatically. Ofcom is considering treating this content on a par with child sexual abuse and terrorism material, using digital markers to catch duplicates. Separately, the creation or sharing of such images will be designated a priority offence under the Online Safety Act, requiring proactive systems to curb their spread. (gov.uk)

For trust and safety leads, the 48‑hour clock turns policy intent into a fixed service level. Under the Online Safety Act, Ofcom can already compel risk assessments and impose penalties of up to 10% of qualifying worldwide revenue (or £18m, whichever is higher) and, in serious cases, disrupt or block access to services. Those levers will now sit behind a time‑bound duty to act on flagged content. (ofcom.org.uk)
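The penalty ceiling is the greater of two figures, which makes the exposure easy to compute. A minimal sketch (the helper name `max_penalty_gbp` is illustrative, not a statutory formula):

```python
def max_penalty_gbp(qualifying_worldwide_revenue_gbp: int) -> int:
    """Upper bound on an Online Safety Act fine: the greater of
    10% of qualifying worldwide revenue or a fixed £18m floor."""
    return int(max(qualifying_worldwide_revenue_gbp * 0.10, 18_000_000))

# A firm with £100m qualifying revenue: 10% is £10m, so the £18m floor applies.
print(max_penalty_gbp(100_000_000))     # → 18000000
# A firm with £10bn qualifying revenue: the ceiling is £1bn.
print(max_penalty_gbp(10_000_000_000))  # → 1000000000
```

The floor means smaller platforms cannot assume modest revenue caps their downside.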

The measure will be added to the Crime and Policing Bill currently in the House of Lords, with Report Stage due to begin on 25 February 2026. For product, policy and legal teams, that timetable means weeks rather than months to evidence readiness and rehearse end‑to‑end takedown flows. (bills.parliament.uk)

Rising case volumes explain the urgency. The Revenge Porn Helpline logged 22,275 reports in 2024, up 20.9% year on year, the highest on record since its launch in 2015. Separate Refuge data reported by The Guardian suggested only around 4% of police‑reported intimate image abuse cases led to a charge, underscoring why speed at platform level matters. (revengepornhelpline.org.uk)

Generative AI has intensified the problem. Following criticism from UK ministers over xAI’s Grok image tool, DSIT pressed Ofcom to use its full powers and reminded platforms that persistent non‑compliance can result in UK access being blocked. The episode showed how a single feature can create outsized legal and reputational exposure in days. (gov.uk)

Operationally, compliance means round‑the‑clock triage, explicit escalation playbooks, and auditable decision logs. Ofcom’s guidance emphasises documented risk assessments and proportionate safety measures; expect investigators to test queue times, reviewer coverage and model performance rather than accept broad assurances. Firms without 24/7 coverage will likely need to build or buy it quickly. (ofcom.org.uk)

The promised “one‑report” approach implies wider use of hash‑matching and provenance signals to stop re‑uploads. If Ofcom aligns non‑consensual intimate images with CSA/terror content, platforms will be expected to detect duplicates at scale while maintaining an appeals path to correct mistakes, a balance between speed and due process that will be closely watched. (gov.uk)
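Hash-matching of this kind can be illustrated with a toy blocklist. Production systems use perceptual hashes (such as PDQ or PhotoDNA) so that resized or re-encoded copies still match; plain SHA-256, used here only for simplicity, catches byte-identical re-uploads alone:

```python
import hashlib

class HashBlocklist:
    """Toy exact-match blocklist. Real duplicate detection relies on
    perceptual hashing, which tolerates cropping and re-encoding."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register(self, image_bytes: bytes) -> str:
        # Store only the digest, never the image itself.
        digest = hashlib.sha256(image_bytes).hexdigest()
        self._hashes.add(digest)
        return digest

    def is_blocked(self, image_bytes: bytes) -> bool:
        return hashlib.sha256(image_bytes).hexdigest() in self._hashes

blocklist = HashBlocklist()
blocklist.register(b"reported-image-bytes")
print(blocklist.is_blocked(b"reported-image-bytes"))  # → True
```

Storing digests rather than images is also what lets one report propagate across platforms without re-sharing the content itself, though any matching system still needs the appeals path noted above.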

The financial stakes are significant. A platform with £10bn in global revenue faces potential exposure of up to £1bn under the 10% ceiling, before reputational harm and possible access restrictions in the UK. Boards should treat trust and safety spend as a core risk control, not a discretionary line item. (ofcom.org.uk)

Smaller forums and adult sites are not out of scope. DSIT says it will publish guidance for internet providers on blocking access to rogue sites hosting this content, aimed at operators that fall outside current Online Safety Act reach. Payment and advertising partners are likely to tighten contractual warranties in response. (gov.uk)

What happens next is tightly sequenced. The Lords take the Bill through Report Stage from 25 February; DSIT will issue ISP‑blocking guidance; meanwhile Ofcom is assessing enforcement treatment and technology standards. Companies should map these milestones to internal go‑lives, vendor contracts and user communications now. (bills.parliament.uk)

For victims, the promise is less admin and faster relief: one report should lead to removals across services and automatic blocks on future uploads. Until the duty is live, continue reporting directly to platforms and seek specialist support such as the Revenge Porn Helpline, which already works with companies on image takedowns. (gov.uk)
