TRIGGER WARNING: This website contains references to sexual violence.
There are millions of images and videos of illegal sexual content on the open web.
This content is unregulated and easily accessed, and it is having a significant impact on our rangatahi and communities.
We are advocating for 3 immediate actions to address illegal sexual abuse content online.
-
IMPROVE THE FILTER - ACHIEVED!
In 2021/2022 we called on the Government to improve and update its Digital Child Exploitation Filtering System, which blocks child sex abuse material (CSAM) online, by adopting the IWF block list.
We are pleased to announce that the Minister of Internal Affairs agreed to this action in June 2024! Read here.
Background:
The Government has an existing filter for child sex abuse material, established in 2010 by the Department of Internal Affairs (DIA). The DCEFS uses a block list of known CSAM URLs, collated through investigations and INTERPOL. That block list contains just 400-700 URLs. The Internet Watch Foundation (IWF) provides a list, updated daily, of approximately 5,000 URLs. The DIA, along with Internet Service Providers, could use this list, but chooses not to.
The filter's technology is dated and resourcing at the DIA is limited. The result is that illegal sexual content online is increasing rapidly, with little to no protection in place for our children as they browse the web.
-
EXPAND THE BLOCK LIST
The DCEFS filter only focuses on blocking CSAM content.
There are currently no requirements on ISPs to filter illegal content beyond CSAM. Rape and bestiality content is readily available online for children to see. This is not ok.
We want additional block lists compiled to cover bestiality and rape content. Creating these lists would make NZ a pioneer in digital child protection. The technology is capable of filtering more genres, including those that normalise and eroticise sexual violence. We need to do more.
This is not general porn censorship; this campaign is focused on illegal sexual content which is considered objectionable under section 3 of the Films, Videos, and Publications Classification Act 1993.
PROGRESS UPDATE: Our petition was presented to Members of Parliament in April 2024. We are presenting our oral submission to the Select Committee in November 2024.
-
BE PREVENTATIVE
It’s no surprise that current and potential sex offenders seek illegal material like ‘child sex abuse material’ (CSAM) online. We believe that by interacting with this demographic through targeted online marketing tools, we can direct them toward help. Providing a pathway offering help for offenders could protect children/victims in the future — it’s simple and it makes sense.
Automated search result banners (information pop-ups) already exist for CSAM searches; however, they simply state what punitive action will be taken against those seeking the content and do NOT point them to services that can help them manage their behaviour. We want these banners expanded, and automated banners added for the other illegal search terms mentioned. We believe this is one simple way to redirect those looking for this content to services and resources where they can get help for this problem.
Search terms that could be targeted include “How do I rape someone?”, “I want to rape someone”, “kids sex”. There is a significant gap in this area. Redirection to get help is a simple sexual violence prevention strategy that protects those vulnerable to being harmed - prevention makes sense.
PROGRESS UPDATE: We are talking directly with Google, Microsoft and social media companies to make these changes. Will keep you posted.
For too long, extreme sexual content has been unregulated and easily accessible.
Child protection makes sense.
The digital landscape has evolved rapidly in the past 20 years with free access to websites, videos, images and more.
Some of this free online content is beneficial. And some of it is illegal and harmful.
As the internet has grown, a plethora of content has become available. Unfortunately, some sites exhibit illegal sexual behaviours, for example rape, abuse, and bestiality.
Many assume this illegal sexual content is being filtered out by the Government or Internet Service Providers, or that it would be difficult for children to find. Unfortunately, this is not the case.
From a simple online search we have found live links to the following:
“Rape sex porn” gives 14.6 million results. In November 2024, Bing showed 3.5 million results for “rape”.
“Bestiality porn” gives 143 million results (November 2024, up from 39 million in April 2023).
There has also been a 1058% increase in known sexual images of children aged 7-10 online since 2019.
Of the young people in NZ who will see explicit sexual content, 25% will see it at age 12 or younger. 73% of all young people exposed state they have seen content that makes them feel "uncomfortable”, and 71% stated they want more filters and regulation of online sexual content.
In fact, latest research shows that “more than half of teen respondents said they had seen violent and/or aggressive pornography, including media that depicts what appears to be rape, choking, or someone in pain”.
It is clear that this is affecting our most vulnerable community members. We propose treating child protection from online harm as seriously as if children were to see sexual harm on the street – it just makes sense. We know many parents care deeply about this issue. We hope to raise awareness about the gaps in New Zealand’s digital media regulation and call for change. If we don’t act now, another generation of our children and young people will be impacted by an online environment that normalises and eroticises sexual violence.
Protecting our kids online makes sense.
FAQs
-
This campaign isn’t about general porn censorship. We want ISPs to filter illegal sexual behaviour, like child sex abuse, rape and bestiality.
This behaviour is illegal in real life, and should not be distributed online for mass consumption. It has become normalised online, and is completely unregulated. The DIA has a filter for CSAM which ISPs can voluntarily use but don’t have to. There is no existing filter for the other genres.
-
Very common.
The internet provides free access to illegal and harmful content: a simple Google search for ‘rape porn’ gives 532 million results, ‘bestiality porn’ 40 million, and ‘slavery porn’ 50 million.
These numbers change daily as content is removed or added over time. This content is easy to access, with 72% of young people in NZ stating they had seen content that made them feel uncomfortable.
Our young people are increasingly learning about sex online, and it’s not healthy, or safe.
-
We recommend using effective filtering software for both your home Wi-Fi and individual devices.
The Good Source offers a family-focussed Wi-Fi service that filters unsafe content. Safe Surfer is a NZ-based filtering service that is regularly improving its technology, staying up to date with changes in sexual content, and offering caregivers reports on what young people may have been exposed to.
However, we believe the best tool you can give your kids is a strong internal filter. This starts with open communication: talking to them about sexual content so they are aware of what it is and how to deal with it if they do come across it. For evidence-based and professional help on this, go to www.thelightproject.co.nz
-
We understand concerns about Government control, and that the measures we have recommended can be perceived as a “slippery slope” into online censorship.
However, some measures aren’t about censorship; they are about protecting vulnerable people from harm. When content showcases illegal sexual behaviour, like rape and bestiality, as normalised sexual behaviour, this isn’t ok.
-
Quotes from the DIA regarding their existing protective measures:
“The Department of Internal Affairs (DIA) uses a Digital Child Exploitation Filtering System which has a very narrow purpose. It blocks access to known websites that contain child sexual abuse material”.
“It is one of the Department’s measured responses to community expectations that the government and internet service providers (ISPs) should do more to provide a safe internet environment. It is designed to assist in combating the trade in child sexual abuse material by making it more difficult for persons with a sexual interest in children to access that material. The filtering system complements the information, education and enforcement activity undertaken by the Digital Child Exploitation Team of the Department of Internal Affairs”.
“The Department is working in partnership with New Zealand ISPs and offering them a choice to protect their customers from accessing these illegal websites inadvertently or otherwise”.
“It is not a magic bullet that will prevent everyone from accessing any sites that might contain images of child sexual abuse”. Read more on the DIA website here
-
“Child sexual abuse material or CSAM is the permanent recording of the sexual abuse or exploitation of a person under the age of 18. This can include images, video or live streamed content.
Real children are abused and often their suffering is not shown in the content.
The term “child pornography/porn” is sometimes used to refer to CSAM. Netsafe and many other agencies use the term Child Sexual Abuse Material (CSAM) because it better reflects what this content represents and the seriousness with which this content should be considered.”
https://netsafe.org.nz/csam-law/
OUR MISSION - THE WHY?
Children and young people who see illegal and violent sexual content online can experience short-term trauma and long-term impacts; it can shape their expectations and beliefs about sex and relationships. This content, easily accessed online, normalises violence against women, children and animals. Given New Zealand’s current rates of sexual abuse against women and children, it is imperative we take a preventative approach.
We believe prevention should include creating a safer digital landscape for our children and young people.
WIDER CONTEXT
Digital media has evolved significantly in the past 20 years; this requires a new approach to regulation that sees the issues through the lens of our most vulnerable – our tamariki and rangatahi. The internet provides free access to illegal and harmful content: the NZ Classification Office has found that 46% of popular content New Zealanders engaged with had incestual themes, 35% showed non-consensual behaviour, and 69% of young people aged 14-17 exposed to porn had seen violent or aggressive content.
Currently, videos of rape and bestiality have no barriers to youth or child access. If any of us, as parents and caregivers, knew children could see this behaviour out on the street as they walked to school, we would do anything to ensure they were protected from it. We propose treating child protection from online harm as seriously as if they were to see sexual harm on the street – it makes sense.
In this unregulated internet landscape, whānau and caregivers would need to be educated on digital harm, be competent with technology, have strong English literacy skills and the personal motivation to prevent online harm by implementing filtering software onto household devices. This is not realistic for all households, leaving children protected in some homes and vulnerable in others; an inequitable outcome.
The Internet has allowed new avenues for support and change that didn’t exist historically, but has also provided some significant challenges for our communities.
With potential or current offenders seeking illegal content like child sex abuse material (CSAM), and current and future perpetrators potentially searching online for help with their offending, we can easily direct them to services and resources through automated banners. Currently, if someone searches for “I want to rape someone”, a range of services for survivors of sexual abuse comes up, but there are no prompts for the searcher to seek help. This is a significant gap. Redirection to help is a simple sexual violence prevention strategy that protects those vulnerable to being harmed - prevention makes sense.
We want to raise awareness about these gaps in our sexual violence prevention approach and digital media regulation in Aotearoa, New Zealand.
If we don’t act now, another generation of children and young people will be impacted by a media environment that normalises sexual violence. Taking action just makes sense.