#NOHARMWARE
LET'S BUILD MORE INCLUSIVE AND SAFER ONLINE SPACES IN AOTEAROA

ABOUT THE CAMPAIGN
The rise of the internet has opened up incredible possibilities, allowing people to connect instantly and globally. However, without proper regulations, we've witnessed the growth of digital platforms that can create harmful online environments. Issues like online bullying, racism, sexism, doxxing, death threats, revenge porn, live-streamed terrorism, deep fakes, and complex financial scams are all too familiar to us.
But it doesn't have to remain this way.
We urge the Government to implement basic regulations that ensure social media platforms and search engines are designed with user safety in mind. This can be achieved through new transparency and safety measures overseen by an independent media regulator.
The key is that this regulation is focused on tech companies because we need changes in how systems are designed. We need clear requirements for transparency and accountability to ensure that these platforms are safe, just as we expect for other products we use.
The time to act is now, especially as Minister Erica Stanford is considering potential actions in this area.
This isn’t the end, but it will be a powerful start: the beginning of transforming online platforms into a positive force in our lives, one that lives up to the potential we saw when these platforms first came into being.
Places where people can meaningfully connect, where we can share, where we can deepen the bonds between us, where we can care for others and be cared for, where we can flourish.

THE ACTION
So get amongst it. Share information with your friends and whānau, meet with your MP, talk to your teachers, students, parents, grandparents, workmates – this is an issue that everyone can get behind. Let’s be the people power movement that shows them we won’t be silent until our digital experiences are healthy, safe and inclusive.
What can you do:
- Meet with your local MP to discuss your views on online harm and why we need legislative change to protect people.
- Write to Minister Stanford about your concerns with our current online technology space, reinforcing why we need legislative change.
- Chat with those around you about the impacts of online harm and what can be done about it!
- Download our campaign images here to spread the word even further.
KEY MESSAGES
You are welcome to use any of this content when taking action. Our top messages include:
Online harm is a serious issue
Online harm affects both individuals and society. Without proper safeguards, digital platforms have created toxic environments that cause harm both online and offline.
Platform design is driving harm
Search engines and social media platforms are designed to promote content that drives engagement, regardless of its harmful effects. This allows harmful material to spread widely.
We need smart regulation
Aotearoa New Zealand has the opportunity to enhance digital spaces by implementing sensible regulation to address this fundamental design flaw.
Regulation must ensure accountability
This campaign focuses on tech company accountability. Law should create a thorough and consistent framework for online safety, including transparency and accountability, a duty of care, independent oversight, and penalties for non-compliance. The Government must uphold its obligations under Te Tiriti o Waitangi and work with Māori to develop regulation.
THE CALL
We want to see law that creates a thorough and consistent framework for ensuring online safety.
This framework should include requirements of transparency and accountability, a duty of care, independent oversight, and penalties for non-compliance.
Importantly the Government must uphold its obligations under Te Tiriti o Waitangi and work with Māori to develop regulation.
The intent is to increase accountability by tech companies for the harm occurring.
We think the main parts should include:
- Transparency: Tech companies should clearly show how their algorithms work, like what content they recommend, what they remove, and how complaints are handled.
- Duty of care: Tech companies must actively make sure their products and services are safe by design. This means having strong checks to find risks and ways to reduce them.
- Independent oversight: There should be outside monitoring with the power to penalise companies that don't follow the rules. Tech companies must also provide reports to show if they are following these rules.
In developing regulation, it is important to hear from people most harmed to ensure the resulting regulation and policy is fit for purpose.
The call does not include a ban on social media for people under 16 as the focus here is on holding tech companies to account.
We want rules that focus directly on tech companies in order to fix the root causes of online harm. And the time is now, as Minister Erica Stanford is currently considering what actions to take in this area.
In fact, other countries around the world have already taken similar government action, including Australia, the European Union and the United Kingdom.
HELPFUL RESOURCES
- Amnesty reveals alarming impact of online abuse against women
- Myanmar: Facebook’s systems promoted violence against Rohingya; Meta owes reparations
- Driven into the darkness
- Media and online content regulation
- Safer Online Services and Media Platforms
- Safer Online Services and Media Platforms Summary of Submission
- Youth Wellbeing Insights Report
Note: See this document for an explanation about what we mean when we say algorithm. Algorithms can be used by platforms to promote specific content to individual users or to identify and remove illegal content.
