[AI-generated image: aerial view of a structured New Zealand urban grid, representing data classification and governance frameworks]
COMPLIANCE

What NZ Marketers Need to Know About Data, AI, and the Law

Peter Mangin
Founder, AI Innovisory
10 min read


This article is practical guidance for NZ marketing professionals. It is not legal advice. Where significant risk or uncertainty exists, seek advice from a qualified NZ privacy lawyer or the Office of the Privacy Commissioner at privacy.org.nz.

The situation right now

Your team is almost certainly using AI. Prompts are being written. Customer feedback is being summarised. Campaign copy is being drafted. Data is moving from CRM exports and ad platforms into tools that were never designed with your customer's privacy in mind.

That is not unusual. But on 1 May 2026, the legal landscape changed.

A new rule, IPP3A, came into force under the Privacy Act 2020. It requires any organisation that collects personal information about someone from a source other than that person to notify them of that collection. This is not a future obligation. It applies now, to data you may already be using.

This article gives you a clear framework for understanding what data you are handling, what the law requires, and what you must do before putting any data into an AI tool. If you want to run the checks right now, use the NZ Data Classification Checklist, a free interactive tool covering exactly these obligations.

The three questions that govern everything

Before you put any data into an AI tool, ask these three questions:

1. Does this data identify, or could it identify, a living person? If yes, classify the data carefully before proceeding.

2. Did that person give it to us directly? If no or unsure, you have an IPP3A obligation to address before you proceed.

3. Does it involve Māori data, sensitive data, or confidential information? If yes, stop and seek appropriate review.

Everything else flows from these three questions.

What changed on 1 May 2026: IPP3A explained

What is IPP3A and when did it come into force?

IPP3A is Information Privacy Principle 3A, added to the NZ Privacy Act 2020 by the Privacy Amendment Act 2025. It requires organisations to notify individuals when their personal information is collected from a third party. It came into force on 1 May 2026.

In plain English, IPP3A works like this:

If you received personal information about someone from anywhere other than that person directly (a partner, a client, a list provider, an ad platform, a data enrichment service, a CRM import), you must take reasonable steps to notify them of the collection. You must tell them:

1. That their information has been collected
2. Why it was collected (specific purpose, not vague placeholders like "business purposes")
3. Who will receive it
4. Who collected it and holds it
5. Which law authorised the collection, if applicable
6. Their right to access and correct that information

Notification must happen as soon as reasonably practicable after collection. The Office of the Privacy Commissioner has confirmed that including this information in outbound marketing communications, at the point of first contact, is an acceptable approach, provided the delay is reasonable.

The IPP3A trigger question every marketer needs to use:

Did this personal information come directly from the person, or from somewhere else? If the answer is "somewhere else" or "not sure": stop and complete an IPP3A check before using that data in any AI tool.

Exceptions exist, but they are narrow

There are documented exceptions to IPP3A notification, including where the person was already notified by the source organisation, where the information is genuinely publicly available, or where notification is not reasonably practicable. Critically, marketing profiling and targeted advertising do not qualify for the "no prejudice to interests" exception. Apply exceptions carefully, document your reasoning, and obtain evidence, not just assurances from suppliers.

How to classify your data: a six-level framework

Before selecting any AI tool, classify the data you intend to use. The classification level determines which tools are appropriate and what preparation is required. Use the interactive Data Classification Checklist to work through each decision step by step.

| Level | Type | Examples | AI tool guidance |
|---|---|---|---|
| L1 | Public | Website copy, product descriptions, press releases, brand guidelines, public social media posts | Most tools, including free public AI. Review output before publishing. |
| L2 | Internal | Campaign calendars, internal planning notes, performance summaries, process documents | Approved business tools. Avoid free public tools unless policy permits. |
| L3 | Confidential | Launch plans, pricing strategy, media strategy, client briefs, unpublished concepts, board materials | Approved enterprise tools only. May require manager, legal, or client approval. |
| L4 | Personal information | Names, emails, phone numbers, CRM records, customer feedback, event attendee lists, survey responses | IPP3A check required. Minimise, redact, or anonymise. Approved tools with DPA only. |
| L5 | Sensitive / high-risk | Health, financial hardship, employment, ethnicity, children's data, behavioural profiling, Māori community data | Generally avoid. Clear lawful purpose, strong governance, and appropriate approval required. |
| L6 | Prohibited | Passwords, API keys, raw customer databases in public tools, HR records, board papers, legal advice | Do not enter into AI tools under any circumstances. |

Which tool for which data?

Not all AI tools are the same. Tool type matters as much as data classification.

| Tool type | Suitable for | Key consideration |
|---|---|---|
| Free public AI (ChatGPT free tier and similar) | Level 1: public only | May train on user inputs. Never use for personal, confidential, or Māori data. |
| Paid individual accounts | Levels 1–2 | Not appropriate for personal information or confidential data without additional controls. |
| Business and enterprise tools with DPA | Levels 1–3; Level 4 after IPP3A check | Data processing agreement required. Verify data residency and subprocessor arrangements. |
| Tenant-based tools (Microsoft 365 Copilot, Google Workspace AI) | Levels 1–4 with checks | Operates within your org's tenant. Verify licence terms, data residency, and subprocessors before use. |
| AI agents connected to CRM, inboxes, or ad platforms | Requires full governance review | Every data connection requires separate assessment. Creates indirect collection risk by default. Do not deploy without a governance review. |
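As an illustration, the two tables above can be collapsed into a simple lookup. The tier names and level ceilings below are this article's guidance restated in code, not anything defined by the Act, and the Level 4 ceiling for DPA-backed tools still presumes the IPP3A check has been done first.

```python
# Maximum classification level each tool tier can handle, per the tables
# above. Tier names are invented identifiers for this sketch.
MAX_LEVEL = {
    "free_public": 1,          # e.g. free ChatGPT tier: public data only
    "paid_individual": 2,      # internal data, nothing personal or confidential
    "enterprise_with_dpa": 4,  # Level 4 only AFTER an IPP3A check
    "tenant_based": 4,         # e.g. M365 Copilot, with residency checks
}

def tool_permitted(tool: str, data_level: int) -> bool:
    """True if this tool tier can handle data at this classification level.

    Levels 5-6 always fail: L5 needs case-by-case governance review and
    L6 is prohibited outright. Unknown tools get a ceiling of 0.
    """
    if data_level >= 5:
        return False
    return data_level <= MAX_LEVEL.get(tool, 0)

print(tool_permitted("free_public", 4))         # CRM data in free ChatGPT: False
print(tool_permitted("enterprise_with_dpa", 4)) # True, assuming IPP3A check done
```

Treating unrecognised tools as level 0 ("nothing permitted") matches the cautious default the article recommends for unassessed tools such as agents.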

AI agents deserve particular attention. When an agent connects to your CRM, inbox, or ad platform, it is accessing data that may include personal information obtained indirectly, triggering IPP3A obligations you have not yet assessed. See the guide to agentic AI governance for a full treatment of this risk.

Need a practical workshop on data classification and AI compliance for your marketing team?

Peter runs hands-on AI workshops specifically for NZ marketing teams, covering data classification, IPP3A obligations, Māori data sovereignty, and how to build responsible AI habits across your organisation.

Book a workshop for your team

The most common mistakes NZ marketers make

These are not hypothetical edge cases. They are patterns observed across NZ marketing teams adapting to AI under the revised Privacy Act.

1. Assuming "publicly available" means "free to use"

The IPP3A publicly available exception applies to public registers, newspapers, books, and public websites; it does not apply to all content found online. Copyright, platform terms, and privacy obligations apply independently.

2. Reusing customer data for any purpose

Personal information collected for one purpose cannot simply be applied to a different one. AI-assisted analysis or profiling may be a materially different use from the original collection context.

3. Uploading whole CRM exports when only a summary is needed

Data minimisation is a core Privacy Act obligation. If the AI task only requires aggregate patterns, upload category-level summaries, not a database of tens of thousands of names and email addresses.

4. Confusing pseudonymised with anonymised data

Replacing names with codes is pseudonymisation, not anonymisation. If re-identification is possible using other fields (email domain, job title, location, and purchase history in combination), the data is still personal information.

5. Ignoring indirect collection

The most common indirect collection scenarios in NZ marketing are purchased lists, enriched contact data, platform-exported audiences, client-supplied databases, and agency-to-agency data sharing. All of these trigger IPP3A from 1 May 2026.

6. Failing to check whether a tool trains on inputs

If an AI tool trains on your inputs, the personal or confidential data you include may influence future outputs to other users. Check current training settings; defaults may allow training unless you opt out.

7. Publishing AI outputs without human review

AI output can contain errors, privacy-violating inferences, bias, or brand-inconsistent content. Every AI-generated output that will be published, sent to customers, or used to inform decisions about people requires human sign-off first.

8. Treating Māori data as just another sensitivity label

Māori Data Sovereignty is a governance framework, not a data sensitivity category. Individual consent from a Māori person does not extinguish collective Māori rights in that data. Working with Māori data requires genuine engagement with Māori communities.
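Mistakes 3 and 4 above are easy to demonstrate concretely. The sketch below uses invented records and field names: it shows sending category counts instead of rows (minimisation), and why rows with names removed can still single a person out (pseudonymisation is not anonymisation).

```python
from collections import Counter

# Invented rows standing in for a CRM export; a real export should never
# reach a prompt in this form.
crm_rows = [
    {"name": "A. Smith", "job_title": "CMO",     "region": "Auckland", "plan": "Pro"},
    {"name": "B. Jones", "job_title": "Analyst", "region": "Auckland", "plan": "Basic"},
    {"name": "C. Lee",   "job_title": "Analyst", "region": "Auckland", "plan": "Pro"},
]

# Mistake 3 avoided: if the AI task needs aggregate patterns, send
# category-level counts, never the rows themselves.
summary = Counter((r["region"], r["plan"]) for r in crm_rows)
prompt_lines = [f"{region} / {plan}: {n} customers"
                for (region, plan), n in summary.items()]

# Mistake 4 checked: any row whose remaining field combination is unique
# in the set can still identify someone, so it is still personal information.
def uniquely_identifying(rows, fields):
    combos = Counter(tuple(r[f] for f in fields) for r in rows)
    return [r for r in rows if combos[tuple(r[f] for f in fields)] == 1]

risky = uniquely_identifying(crm_rows, ["job_title", "region"])
print(prompt_lines)
print([r["name"] for r in risky])  # the lone CMO remains re-identifiable
```

Real re-identification checks consider every quasi-identifier in combination (email domain, job title, location, purchase history), but even this toy version shows the principle: uniqueness, not the absence of a name, is what matters.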

Māori data and AI: a different kind of obligation

What is Māori data sovereignty?

Māori data sovereignty is the right of Māori to govern how data about Māori people, communities, and resources is collected, owned, and applied. It requires collective governance from iwi and hapū, not just individual consent, and is grounded in Treaty of Waitangi obligations. Te Mana Raraunga has formalised this as a distinct rights framework since 2015.

Māori data includes any data that is about, from, or concerns Māori people, Māori communities, or Māori interests. This includes data about te reo Māori, mātauranga Māori, tikanga, iwi, hapū, marae, and Māori lands and environments.

The Te Mana Raraunga framework sets out six principles (Rangatiratanga, Whakapapa, Whanaungatanga, Kotahitanga, Manaakitanga, and Kaitiakitanga) that govern how Māori data should be collected, held, and used. These principles assert Māori rights and interests as pre-existing and ongoing, not contingent on consent alone.

Before using any Māori data with AI, ask:

  • Could this data relate to Māori people, communities, iwi, hapū, tikanga, or te reo?
  • Have we sought Māori data governance advice?
  • Will this use benefit or harm Māori communities?
  • Is AI the appropriate tool for this task at all?

Do not use AI to generate te reo Māori content without expert human review. Do not generate campaign imagery that depicts Māori cultural items (tā moko, korowai, koru, or taonga) without express cultural authority. Do not profile or segment Māori communities for commercial targeting without collective consent and genuine benefit-sharing.

For a deeper treatment of these obligations and how they intersect with NZ AI strategy, read Two AI Conversations in New Zealand. They Have Barely Met. If in doubt on a specific situation, seek advice from Te Mana Raraunga before proceeding.

A practical workflow before any AI use

For low-risk public data, this takes less than a minute. For personal or sensitive data, take every step seriously. The Data Classification Checklist walks you through the most critical steps interactively.

1. Define the task. What are you trying to accomplish? Vague tasks lead to unsafe data choices.
2. Identify the data source. Where did this data come from? Who collected it, when, and for what purpose?
3. Check for personal information. Does any element identify, or could it identify, a living person? Check every field, including free-text, metadata, and IDs.
4. Direct or indirect collection? Did the person give this information to your organisation directly? If indirect or unsure, run an IPP3A check before proceeding.
5. Complete the IPP3A check. Has the person been notified of all six required matters? Does a documented exception apply? Do not proceed to AI use until this is resolved.
6. Check for Māori data. Could this data relate to Māori people, communities, or interests? If yes or unsure, seek Māori data governance advice.
7. Classify the data. Apply the six-level framework. The highest classification level present governs your tool choice and preparation requirements.
8. Check consent, purpose, and permitted use. Does AI-assisted processing fall within the original collection purpose?
9. Remove unnecessary data. Apply data minimisation. Upload only what the specific AI task requires.
10. Anonymise, aggregate, redact, or mask. Remove or obscure personal identifiers where possible. Check that combinations of remaining data cannot re-identify individuals.
11. Select an appropriate tool. Match the classification level to the tool type. Check data processing agreements, training terms, data residency, and retention policies.
12. Check the tool's settings. Confirm training opt-out where available. Verify data residency is acceptable. Confirm the tool acts as a processor on your behalf.
13. Write the prompt carefully. Treat the prompt itself as data. Do not include unnecessary personal information. Do not paste raw data extracts when a question would suffice.
14. Review the output. Before use, check for accuracy, privacy risk, bias, unfair inference, and brand appropriateness. Human sign-off is required.
15. Store, delete, or document. Apply your data retention policy to both inputs and outputs. Delete data you no longer need. Keep records for decision-relevant outputs.
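A small part of steps 10 and 13 can be automated. The sketch below masks obvious identifiers before text enters a prompt; the patterns are illustrative only, and regex alone is never anonymisation, because names and free-text references still need human review.

```python
import re

# Illustrative patterns only: a simple email matcher and a rough NZ-style
# phone matcher (leading +64 or 0, then at least seven more digits,
# allowing spaces or hyphens between groups).
EMAIL = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")
NZ_PHONE = re.compile(r"(?:\+64|0)[\s-]?\d[\d\s-]{5,}\d")

def redact(text: str) -> str:
    """Mask emails and phone-like numbers before text reaches a prompt."""
    text = EMAIL.sub("[EMAIL]", text)
    return NZ_PHONE.sub("[PHONE]", text)

masked = redact("Contact Jane on 021 555 1234 or jane@example.co.nz")
print(masked)  # note "Jane" survives: names need human or NER-assisted review
```

Treat this as a backstop, not a control: per step 14, a human still reviews what goes in and what comes out.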

Can I use ChatGPT or free AI tools with personal information?

No. Free public AI tools may train on user inputs and lack data processing agreements. Under the NZ Privacy Act 2020 and IPP3A, they are appropriate for Level 1 public data only. Personal information (Level 4) requires an approved enterprise tool with a formal DPA.

What good looks like in practice

Five common NZ marketing scenarios, and the right approach for each.

Using AI to generate ad copy from your website (Level 1)

Generally suitable for most tools. Review output for accuracy and brand alignment. Check copyright. Do not include unreleased pricing or strategy in the prompt.

Summarising customer reviews for insight (Level 2–3)

Remove names and usernames first. Upload anonymised text only. Check the review platform's permitted uses. Use an approved business tool, not a free public one.

Analysing campaign performance data from Meta or Google (Level 2)

If genuinely aggregated and no individual can be identified, likely Level 2. Confirm client permission for AI analysis of their campaign data. Use an approved business tool.

Using call transcripts to improve messaging (Level 5)

Seek legal and privacy advice before proceeding. Redact names, account numbers, and identifiers. Use only an approved secure enterprise tool. Check whether original call recording consent covers AI analysis.

Receiving a client's customer database for campaign analysis (Level 4–5)

You are acting as a data processor. You must have a DPA with the client. The client must have appropriate privacy notices covering your AI use. Confirm this before you do anything with the data.

The bottom line

The Privacy Act 2020 has always required organisations to treat personal information with care. IPP3A extends those requirements to data you received from third parties. Māori Data Sovereignty requires something more: genuine governance engagement, not a sensitivity flag.

The tools your team uses every day are processing data that belongs to real people. The question is whether that processing is safe, lawful, and defensible.

Four principles to carry forward:

1. Classify before you upload.
2. Check the source before you use.
3. Review before you publish.
4. Document where it matters.

Start with the workflow, not the technology.

Based on the Privacy Act 2020 (NZ), IPP3A (in force 1 May 2026), Te Mana Raraunga Māori Data Sovereignty Principles, and Digital.govt.nz responsible AI guidance. Not legal advice. Contact the Office of the Privacy Commissioner at privacy.org.nz for specific guidance on your situation.