March 14, 2025
The Independent Pornography Review by Baroness Bertin - an assessment of the legislation, regulation and enforcement of pornography - has just been published. It is a sobering read, making a total of 32 wide-ranging recommendations. Many of the recommendations are ‘big picture’ or focussed on the pornography platforms themselves, but I am more interested in those proposing new legislation that may lead to offences being committed by end users.
It is a bizarre quirk of the law that content is available online that is banned offline (by way of the BBFC refusing to classify it, rather than the content itself being illegal). The review addresses this by introducing the concept of ‘legal but harmful’ content. We must be careful to respect free speech and not legislate purely because we find something distasteful, but there is a growing body of evidence that certain types of content have a pronounced effect on people’s attitudes, making a strong argument in support of making them illegal. Baroness Bertin proposes areas such as material likely to encourage an interest in sexually abusive activity, the infliction of pain or acts likely to cause serious physical harm, penetration by any object likely to cause physical harm, sexual threats and racist content. I would argue that some of these are already covered by existing extreme pornography legislation (Criminal Justice and Immigration Act 2008, Section 63(7)(b), “an act which results, or is likely to result, in serious injury to a person's anus, breasts or genitals”), and the scope of what ‘legal but harmful’ content should be banned would need to be very carefully considered and legislated. For example, how do you define exactly what constitutes a sexual threat so that it can be unambiguously assessed by a Court?
Another part of this recommendation suggests requiring (via a Safe Pornography Code of Practice rather than specific legislation) “key-word monitoring, for example deterrence messaging for problematic search terms including “girl”, “young”, “rape”, “drunk” etc.” These are all very prevalent themes in online pornography, meaning action is definitely needed if usage of these terms is shown to be harmful. Similar deterrence mechanisms (for more extreme terms) are already in place on major search engines, signposting the user to support services, so this could be an effective measure if appropriately applied. The word “teen” is recognised in the report as particularly problematic, and it is an issue that we commonly encounter in our work surrounding Child Abuse Imagery. The term is ambiguous as to whether it refers to a child or an adult over the age of 18, but it is widely used in the pornography industry to refer to young adults. The ambiguity over whether such a term may return unlawful content is not helpful, so action to reduce usage of the word “teen” in lawful contexts would be a big improvement.
In principle, we support this recommendation, but great care must be taken to ensure that a) the focus is on harmful, rather than merely distasteful, content, and b) any legislation is worded such that the content can be unambiguously assessed.
The act of choking can be incredibly dangerous, and its online prevalence is said to have established it as a sexual norm. Non-fatal strangulation (NFS) is already an offence introduced by the Domestic Abuse Act 2021 (albeit excluding consensual acts that do not cause or intend to cause serious harm). It is also suggested by some that NFS is already covered by Extreme Pornography legislation (Criminal Justice and Immigration Act 2008, Section 63(7)(a)), but whether the actions portray “an act which threatens a person's life” is a very ambiguous line. There is, of course, the counter-argument as to whether actions that do not result in harm between two consenting adults should be illegal, regardless of how distasteful it may be, and the need for this balance is recognised by Baroness Bertin throughout her report.
Given the potential harm caused by normalising NFS, we support this recommendation. Existing legislation is ambiguous as to whether NFS is illegal or not – new legislation to clarify this would be an improvement.
Similar to the previous recommendation, incest pornography is a very common theme online. Most often, it is staged (i.e. not actually incest) between consenting adults. However distasteful, we must assess this category of material objectively based upon the harm caused. Baroness Bertin recognises that the evidence of harm caused by such material is currently very limited, perhaps making a recommendation to criminalise it premature.
The lack of evidence surrounding the potential harm this theme actually causes means that we do not support this recommendation at the current time. A recommendation to study the harm in more detail may be more appropriate.
These offences have already been announced for introduction under the Crime and Policing Bill later this year, and it seems entirely appropriate that generating such an image without consent should be an offence. It is particularly relevant (as we discuss here) in relation to AI-generated “deepfakes”, which are quickly becoming established online.
We fully support this recommendation, as non-consensual intimate images are not appropriate in any form, and the rise of AI-generated images only makes action in this area more pressing.
Following on from the previous recommendation, these apps use AI to generate a nude photograph of the subject. Whilst the previous recommendation would ban the creation of such images without consent, permitting their use with consent would mean the apps remained available and could very easily be abused to create images without consent. A complete ban would therefore ensure the technology was not readily available, given that the potential for misuse likely outweighs any legitimate use.
We fully support this recommendation, as the scope for misuse of these tools far outweighs the benefits of any legitimate use.
This is the big one – any legislation is only as effective as its enforcement. Whilst Child Abuse Imagery has defined reporting channels worldwide, the same does not exist for extreme pornography (given the legislation only applies in the UK). It may be a little unfair to characterise enforcement of extreme pornography laws as ineffective (we regularly see charges under this legislation), but improvements in reporting mechanisms would make a big difference to how it is policed.
The report also references “difficulties meeting the evidential threshold for charging in reference to the possession of extreme (illegal) pornography” - the threshold for merely distasteful content becoming illegal should arguably be quite a high bar, and the legislation is carefully worded in terms of its scope. Difficulties in meeting an evidential threshold for charging, therefore, may simply reflect legislation that is appropriately weighted between harms and freedom of speech. The continuing growth of legislation in this area must focus on actual harm rather than whether we find something distasteful, to avoid accusations of becoming a ‘nanny-state’ – but equally we cannot be blind to the impact that such material can have on people. Legislation covering pornography of various types is, however, becoming increasingly fragmented, and a legislative review that brings all of these offences together could be of great benefit.
In principle we support this recommendation – a review into the complex web of legislation could make the law much simpler and easier to enforce, but as discussed above we must ensure this stays focussed on actual harms.
In summary, the report has made some strong recommendations that will go some way to addressing potential harms, although as we discuss above we must be careful to avoid the pitfalls that come with legislating in this area. The government has published a brief response to the report, recognising the challenges set out and promising a fuller review of the recommendations made. No doubt legislative changes in this area are coming, and it will be very interesting to see how (and if) the government decides to address the issues raised.