Reference: IR-01-22-11257
19 May 2022
Harold
[FYI request #19184 email]
Tēnā koe Harold
Request for information
Thank you for your Official Information Act 1982 (OIA) request dated 20 April 2022 in which you asked for information regarding the New Technology Framework. You
requested the following:
“Earlier this month feedback from the Expert Panel on Emergent Technology was
published on the Police website, relating to an update to the Police emergent
technology policy and the development of the New Technology Framework:
https://www.police.govt.nz/about-us/publication/policy-trial-or-adoption-new-policing-technology
The updated policy “Trial or adoption of new policing technology” was published along
with this feedback, but the Framework wasn’t, and doesn’t seem to be available
anywhere else on the Police website.
I would therefore like to make a request for a copy of the New Technology Framework.”
Please find attached a copy of the New Technology Framework, as requested.
Ngā mihi
Carla Gilmore
Manager: Emergent Technology
New Zealand Police
Police National Headquarters
180 Molesworth Street. PO Box 3017, Wellington 6140, New Zealand.
Telephone: 04 474 9499. Fax: 04 498 7400. www.police.govt.nz
New Technology Framework
Contents
1. Overview
2. Trial or Adopt New Policing Technology Policy purpose and scope
3. Principles
4. Guidance for applying the Principles
5. Assessment process
6. New Technology Working Group – Purpose and Membership
7. Expert Panel on Emergent Technology
8. Technology Proposal, Policy Risk Assessment, and Algorithm Guidelines
9. Process following approval of technology
Document history
Version | Date | Description
1.0 | 31/03/2021 | Initial version
2.0 | 12/04/2021 | Internal review feedback incorporated
3.0 | 30/11/2021 | External review feedback incorporated from Expert Panel and external expert advisors

Document Approval: Approved by Organisational Capability Governance Group on 13/07/2021
1. Overview
The New Technology Framework has been developed to assist decision-making on new technology, and includes:
• Policy purpose and scope
• Principles against which a Technology Proposal should be assessed
• Guidance to assist in applying the Principles to a specific proposal
• A process to guide the structured assessment of a proposal through to formal decisions
• The requirements for documents to support well-informed decision-making, transparency, and accountability
• The process following approval of proposals
The Framework aligns with our SELF CHECK tool to help make the best decisions possible:
- Would it withstand Scrutiny? (Community, Police service, Media and Online)
- Is it in line with our Ethics? (Our Code, Our Values, High performing culture)
- Is the decision Lawful? (Laws, Regulation, Policies and guidelines)
- Is the decision Fair to all? (Community, Colleagues and whānau, People’s individual circumstances)
2. Trial or Adopt New Policing Technology Policy purpose and scope
Why do we have this policy?
Our system of policing by consent is based on trust and confidence in the way New Zealand
Police delivers its services, and the social license granted by the community to police in the way
that we do. Concerns about overly wide access to certain technologies, or a lack of clarity around
how certain policing decisions are made, can undermine public trust and confidence and Police’s
social license. Police also needs to ensure the use of new technology is lawful.
Being clearer about the basis on which New Zealand Police engages with new technologies can
help dispel any unfounded concerns, and reinforce Police’s commitment to carefully weigh
privacy, legislative, security and ethical considerations before making decisions about new
technology. The policy can also support an ongoing conversation about the role of technology-
enabled capabilities in policing, set in the particular context of Aotearoa/New Zealand.
Purpose and scope
Trial or Adopt New Policing Technology Policy purposes
The purposes of the policy on trial or adoption of new policing technologies are to:
• Ensure decisions to trial or adopt new and evolving policing technologies and technology-enabled capabilities are made ethically and proportionately with individual and community interests
• Ensure Police’s approach aligns with wider New Zealand Government ethical standards and expectations, including the Government Chief Data Steward’s and Privacy Commissioner’s Principles for the safe and effective use of data and analytics1, and Statistics New Zealand’s Algorithm Charter for Aotearoa New Zealand2
• Ensure decisions reflect Police’s obligations to Te Tiriti o Waitangi, including by seeking and taking account of a te ao Māori perspective
• Enhance public trust and confidence by ensuring decisions and the reasons for them are transparent, and decision-makers are accountable
• Enable Police to innovate safely, so that opportunities offered by technology to deliver safer communities and better policing outcomes for New Zealanders are not lost.

1 https://www.privacy.org.nz/assets/Uploads/Principles-for-the-safe-and-effective-use-of-data-and-analytics-guidance.pdf
2 https://data.govt.nz/use-data/data-ethics/government-algorithm-transparency-and-accountability/algorithm-charter
Scope
The policy applies to any proposed trial or adoption of new technology. It extends to situations where extra functionality is being added to, or turned on in, an existing technology.

The policy may apply to any type of technology. The scope includes novel technologies such as artificial intelligence (AI), drones, machine learning or algorithm-based software, and ‘new tech’ capabilities, such as use of chat bots or other digitally-enabled management tools, and 3D photogrammetry. It also includes more established technologies which allow for images to be captured (such as use of Closed Circuit Television Cameras [CCTV]) and/or matched (such as Automatic Number Plate Recognition [ANPR]).
The policy applies:
• where new or enhanced policing capability is proposed, whether or not the technology itself is new (‘new capability’); or
• where existing technology is proposed to be used for a new or evolved policing purpose (‘new use’); and
• it is proposed by Police either to trial or adopt the new capability or new use (whether or not a trial has previously been approved under this policy); or
• the new capability or new use has been, or will be, passively acquired by Police (for example, as a result of vendor-initiated product enhancement).
The policy does not apply where:
• existing technology (software or hardware) is subject to end-of-lifecycle replacement, iterative version upgrades, security patching or other minor enhancements (such as a new user interface), if the replacement or upgrade does not add significant new policing capability or enable its use for a new policing purpose; or
• the proposed new capability, or new use, would not enable a core policing function, because:
  o it will not affect Police interactions with the public in any way (either directly or indirectly); and
  o it will not gather new, additional, or improved data from or about members of the public, including offenders or victims.
Examples of core policing functions:
Examples of technology capabilities or uses which would be considered to affect Police interactions with the public, and are therefore in scope of the policy as core policing functions, would include technologies that:
• might influence or change public-facing deployment or response decisions
• help to detect offending
• assist in investigations
• generate leads or influence targeting or prioritising of investigations
• identify suspects or discover potential evidence
• use equipment, like Remotely Piloted Aircraft Systems (RPAS), to survey scenes and provide situational awareness
Examples that would most likely not be considered as enabling core policing functions, and are therefore not within scope, include technologies that:
• work only with Police’s own internal corporate organisational information (such as HR systems to support personnel)
• assist decision-making on resource allocation only at an internal-facing, non-operational level
• affect only internal, non-operational, and non-investigative workflows.
Further guidance on scope
An initial policy assessment decision tree diagram is contained in the New Technology
Framework to assist in determining whether or not the policy applies in a specific case.
Particular attention should be focussed on technologies that are significantly based on:
• artificial intelligence or machine learning
• algorithm-based risk assessment or decision support
• gathering or analysing data which relates to members of the public, including individual offenders
• biometrics: the fully or partially automated recognition of individuals based on biological or behavioural characteristics3
• the possibility of public place or online surveillance, perceived or otherwise (irrespective of whether the provisions of the Search and Surveillance Act are considered to apply).
These technologies are likely to be inherently higher-risk and so application of the policy to
them should be considered the default position.
The lawfulness of a proposed new capability or new use is not a factor which determines
whether or not this policy applies.
As transparency and accountability are key objectives of this policy, where there is any room
for doubt, the policy should be assumed to apply.
3 “There are many types of biometrics using different human characteristics, including a person’s face, fingerprints, voice, eyes (iris or retina), signature, hand geometry, gait, keystroke pattern, or odour.” Office of the Privacy Commissioner (OPC), Position on the Regulation of Biometrics, October 2021, p. 2. https://www.privacy.org.nz/assets/DOCUMENTS/2021-10-07-OPC-position-on-biometrics.pdf

Scope of approvals
Any governance approvals gained under this policy are limited by the relevant governance group’s mandate: that is, to assess whether a technology proposal is justified and compatible with privacy, security, legal, and ethical principles. Such approval does not replace or remove the need for a business owner to comply with any other applicable policies, including obtaining appropriate financial authorisations.
Initial Policy Assessment Decision Tree Diagram
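The decision tree itself is presented as a diagram in the Framework and is not reproduced in this text. As a rough, non-authoritative illustration only, the scope rules described above could be expressed as a simple decision procedure along the following lines; the function and parameter names are assumptions for illustration, not the fields of the official Initial Assessment Form.

```python
# Hypothetical sketch only: the real decision tree and Initial Assessment Form are
# Police documents, and the parameter names below are illustrative assumptions.

def policy_applies(
    new_or_enhanced_capability: bool,   # a new or enhanced policing capability is proposed
    new_or_evolved_use: bool,           # existing technology proposed for a new or evolved policing purpose
    minor_upgrade_only: bool,           # end-of-life replacement, version upgrade, patching, minor enhancement
    affects_public_interactions: bool,  # would affect Police interactions with the public, directly or indirectly
    gathers_public_data: bool,          # would gather new, additional or improved data about members of the public
) -> bool:
    """Rough encoding of the scope rules in section 2 of the Framework."""
    # Out of scope: minor upgrades or replacements that add no significant new
    # policing capability and enable no new policing purpose.
    if minor_upgrade_only:
        return False
    # Out of scope: would not enable a core policing function (no effect on public
    # interactions and no new data gathered from or about members of the public).
    if not affects_public_interactions and not gathers_public_data:
        return False
    # In scope: a new capability or new use proposed for trial or adoption, or
    # passively acquired. Where there is any room for doubt, the policy should be
    # assumed to apply.
    return new_or_enhanced_capability or new_or_evolved_use


# Example: a vendor-pushed enhancement that adds automated image matching.
# policy_applies(True, False, minor_upgrade_only=False,
#                affects_public_interactions=True, gathers_public_data=True) -> True
```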
3. Principles
Introduction to the Principles
Ten Principles underpin the decision-making framework. These are intended to be technology
agnostic and not linked to any specific technology. That is, they should be able to be usefully
applied (as relevant) both to novel or advanced technologies such as algorithms and AI, and also
simpler technologies such as CCTV or body-worn cameras.
Similarly, they should be applicable to new technology capabilities or use-cases, whether they
utilise new or existing technology platforms; and to technology proposals either to trial or adopt
the technology.
The Principles are illustrated by a series of statements, which are intended to aid understanding of what the Principles mean and help to structure assessment of whether a proposal aligns with the Principles or not. All technology proposals should align with each of the statements, to the extent they apply in the particular use-case.

Alignment should be demonstrated before a technology proposal is approved via the two-step governance approval process for trial or adoption. Alignment with one or more principles may be provisionally assessed at the point of a proposal to trial if it is intended that the trial and evaluation will further investigate the issue.
Focus on use of the technology
Principles-based assessment requires a central focus on the proposed use-case – not just on the
technology itself. That is because a technology itself is seldom likely to be either inherently
harmful or beneficial: what matters most, in terms of both ethical decision making and public
acceptability, is how it is used, the impacts, and whether the benefits of that particular use
outweigh or are proportionate to the harms.
Principle 1: Necessity
• There is a demonstrable, legitimate need for Police to acquire the capability the technology is intended to deliver, and a clear problem statement
• Use of the technology supports and achieves the strategic direction of Police, including Our Business and the Executive Strategic Performance Template (SPT)
• The technology will deliver an identifiable public or policing benefit that would not otherwise accrue, assist Police to meet relevant legislation, or lessen a significant risk that would otherwise be present. The benefits of these technologies may be quantitative in nature, such as increased crime prevention and efficiency of police operations, or qualitative, such as an increase in public trust and an improved perception of Police
Principle 2: Effectiveness
• The technology design is well understood and explainable to users and impacted people
• The policing objectives of the technology’s proposed use are well defined and explainable
• There is an evidence base or other good reason to believe the technology will be effective in delivering these objectives
• Output of the proposed technology will be fit-for-purpose and of the quality necessary to support the intended use (such as intelligence, investigative, evidential, or forensic use)
• The proposed technology has been used or tested in a New Zealand context, to ensure compatibility with New Zealand Police outcomes
Principle 3: Lawfulness
• The technology’s proposed use is reasonable and lawful
• The technology is applied only within the agreed scope of the proposal
Principle 4: Partnership
• Te Tiriti o Waitangi has been considered in the design and proposed use of the technology
• A te ao Māori perspective has been considered in assessing the possible or perceived impacts of the technology or its use on Māori
• If the technology includes any form of data collection and use, relevant mechanisms are in place to ensure data is treated as taonga and Māori sovereignty is maintained
• Māori, Pacific and/or other communities have been involved in co-design or consulted, and any concerns responded to
• Business owners have liaised with MPES to evaluate the potential effects and risk mitigation strategies for Māori, Pacific and/or other communities
Principle 5: Fairness
• Any possible biases or perceived unfairness arising from the technology design, or its proposed use, are clearly identified and able to be mitigated4
• The proposed use of the technology is appropriate

4 If the technology is at an early (early procurement, pre-trial) stage, it may not always be possible to identify all possible biases. If a previously unidentified bias arises, further consultation with the Emergent Technology Working Group may be required.
Principle 6: Privacy
• The technology incorporates privacy by design in data sourcing, use, retention and storage
• Privacy impacts in an operational environment have been assessed, and identified privacy risks will be mitigated
Principle 7: Security
• Appropriate data governance will ensure data is handled and stored securely, and data quality and integrity are assured
• Information security and operational security risks have been assessed, including data intrusion risks, and identified risks will be mitigated
Principle 8: Proportionality
• Impacts on the human rights and interests of individuals, particular groups or communities, and the collective public interest of the community as a whole have been considered
• A te ao Māori perspective has been considered
• The following impacts have been considered:
  o privacy, safety, security and other impacts on affected individuals (e.g. suspects, offenders, victims, staff, members of the public)
  o the collective human rights and interests of particular groups or communities affected (e.g. communities of ethnicity, age, gender or diversity, or particular geographical communities)
  o human rights and the collective public interest of the community as a whole
• Any negative impacts are proportionate to the necessity and benefits of the proposal
• No other identified alternative solution or strategy, that is viable having regard to cost and other feasibility considerations, would meet the need and deliver the benefits with less negative impact
Principle 9: Oversight and accountability
• The proposed technology has been assessed and/or peer reviewed for technical adequacy
• Identified risks, including privacy, security, te ao Māori, human rights and ethical risks, have been mitigated, and residual risk is within acceptable margins
• Policy, process, audit and reporting controls have been developed to assure that the technology is used only as intended
• Review processes have been developed for trial evaluation and/or to monitor operational performance, to measure whether the technology is delivering the intended outcomes and benefits
• The system in which the technology is deployed, and any substantive decisions made by or based on the technology’s output (such as resource deployment, identification of possible suspects, or enforcement decisions), are subject to active human oversight
Principle 10: Transparency
• The technology, the way it is to be used, and the rationale for any decisions made by the technology itself, or by people on the basis of its output, are understood by those assessing and operating it, and are clearly explainable to others
• Mechanisms (whether general or specific to the technology use) exist for individuals or groups adversely affected by the technology to challenge or seek review of decisions
• Information about the technology, its proposed/authorised use(s), justification for these uses, and oversight and accountability mechanisms will be published or otherwise made freely available to the public, to the greatest extent possible (having regard to operational security, commercial and other considerations)
• Assessments, evaluations, reviews, audits and other reports will be published or otherwise made freely available to the public, to the greatest extent possible (having regard to operational security, commercial and other considerations)
4. Guidance for applying the Principles
The Principles should be considered throughout the development, assessment, and approval of
a technology proposal. Alignment with the Principles will need to be demonstrated prior to
approval being given to trial or adopt a technology. Broad guidance on how the principles should
be considered at each stage of technology proposal development, approval and endorsement is
described below.
Guidance for applying the Principles
a. Pre-proposal
Technology design/selection
The need for alignment with the Principles should be incorporated into early design decisions
or product selection criteria. In much the same way as ‘privacy by design’ should apply,
designing to align with the Principles could be considered ‘ethical by design’.
Use-case decisions
The Principles should be considered in developing the proposed use-case (for example, to
help decide from the outset which uses are likely to be proportionate and which are not).
Trial and evaluation design
The Principles should be considered when determining the parameters of the trial proposal,
and its proposed evaluation. For example, where alignment with certain Principles may be
unclear, these may be issues that need to be explicitly investigated through the trial and
evaluation.
b. Technology Proposal
It is useful for the Technology Proposal developed at process Step 2 (described in the next
section) to include information that is relevant to assessing alignment with the Principles.
c. Policy Risk Assessment (PRA)
The advice provided to the Governance Groups to inform their decisions (process Step 4 described in the next section) should include a structured assessment of the proposal against each of the statements contained in the Principles. This is the point at which alignment with the Principles is comprehensively assessed.

The Policy Risk Assessment is the arms-length advice of the Emergent Technology Working Group. It represents the analysis and judgment of the Group, and will usually be based on the information presented in the Technology Proposal, at the New Technology Working Group, in the Privacy Impact Assessment (PIA) / Information Security Risk Assessment (ISRA) (if commissioned), and in any other relevant information.
d. Security Privacy Reference Group (SPRG) approval
The SPRG decision as to whether or not to approve a Technology Proposal is informed by formal advice and recommendations that consider the security and privacy impacts. A structured assessment of alignment with the Principles, via the Policy Risk Assessment, is central to the advice provided to it.
e. Organisational Capability Governance Group (OCGG) endorsement
OCGG’s decision as to whether or not to endorse an approval is informed by formal advice
and recommendations. OCGG will have access to the advice which informed the SPRG,
including the Policy Risk Assessment, and may overrule the SPRG judgment at its discretion.
f. Outcome of test or trial
Following a test or trial of new technology, the Manager: Emergent Technology must be advised about the outcome (see Process following approval of technology).
5. Assessment process
A five-step assessment process to reach a principled decision on a new technology use proposal
is described in the shaded box below.
The process presupposes that the business group has already done desktop research/evaluation
to develop a proposal that is sufficiently advanced to allow for meaningful assessment against
the Principles, and assessment of privacy, security, legal and ethical implications.
The process mandates three documents to be created in every case that falls within scope of the policy:
• A Technology Proposal produced by the business group owner
• A Policy Risk Assessment produced by the Emergent Technology Working Group
• Governance cover papers, produced by the Emergent Technology Working Group, which will also include specific advice on te ao Māori and algorithm-related considerations (where relevant)

Further documents may also be required on a case-by-case basis, including:
• A Privacy Impact Assessment
• An Information Security Risk Assessment
• Information on algorithms to demonstrate compliance with the Guidelines for algorithm life-cycle management
• Other supporting documents or expert reports as required
Technology Proposal assessment: process overview

1. Does this policy apply?
Who: Business owner (proposer), in consultation with the Emergent Technology Group
What: Consider whether the policy applies. Consult the initial assessment decision tree diagram and complete the Initial Assessment Form to assist in making this judgment. If the policy does not apply, advise the Emergent Technology Group in writing of the existence of the technology proposed and the business owner’s determination that the policy does not apply. This is required to ensure complete records are maintained of the organisation’s use of technology tools. No further steps are required.

2. Develop Proposal
Who: Business owner (proposer)
What: Complete the Technology Proposal document. The document template contains guidance to assist in completing this step. The Proposal document is sent to the Emergent Technology Group.

3. Contact Emergent Technology Group
Who: Business owner
What: Provide the Technology Proposal to the Emergent Technology Group. The Emergent Technology Group will review the proposal and confirm whether it is within scope of the policy, whether it is sufficiently well-developed to advance, whether the proposed technology relies substantively on an algorithm (and advise whether the Guidelines for algorithm development and life-cycle management are required to be adhered to), and/or whether other expert input (such as a te ao Māori perspective) is required.

4. Consider the proposal and develop a Policy Risk Assessment (and Privacy Impact Assessment, Information Security Risk Assessment if required)
Who: Emergent Technology Group and New Technology Working Group
What: Consider the proposal at a New Technology Working Group meeting, with input from the Business Owner, and complete the required documentation. The Emergent Technology Group produces a Policy Risk Assessment (PRA) conducted against the Principles. If it is necessary to commission any further specialist advice to support the PRA (for example, to consider a te ao Māori perspective), this may be done at this stage.

The Governance Group paper makes a clear recommendation on the proposal and may also recommend that the proposal be referred to the Expert Panel on Emergent Technology for independent advice, either to inform further consideration by the Security Privacy Reference Group (with approval deferred) or as supplementary advice to inform the Organisational Capability Governance Group’s consideration of endorsement.

The paper should provide specific advice on the two special considerations (a te ao Māori perspective; and whether algorithm guideline adherence should be mandated), and may recommend conditions be attached to governance approvals as appropriate.

The papers submitted to the Governance Groups should include as attachments:
• The Policy Risk Assessment
• Privacy Impact Assessment / Information Security Risk Assessment (if conducted)
• Any specialist or Expert Panel advice received on the proposal, or other relevant supporting information as required
• The Algorithm Questionnaire (as required)

4a. Contact other experts as required
Who: Chief Privacy Officer (CPO) / Chief Information Security Officer (CISO) / other subject-matter expert (as appropriate)
What: Produce a Privacy Impact Assessment / Information Security Risk Assessment / other expert assessment (as appropriate) in consultation with the business owner. Adjustments to the proposal may be made to address issues, for example by refining the use-case or introducing new controls to the proposal.

5. Two-step governance approval, step 1: approval decision by the Security Privacy Reference Group (SPRG)
Who: SPRG
What: Receive advice and recommendations from the Emergent Technology Group and decide whether or not to approve the proposal. This decision will be based on the proposal meeting security and privacy requirements. SPRG may refer the proposal to the Expert Panel on Emergent Technology or any other key stakeholders for independent advice.

NB: Should a request be required in an emergency situation, it can be raised directly to the delegated Executive Lead for the Organisational Capability Governance Group via the Manager: Emergent Technology.

5a. Two-step governance approval, step 2: endorsement decision by the Organisational Capability Governance Group (OCGG)
Who: Delegated Executive Lead for OCGG
What: Review and decide whether or not to endorse the SPRG decision. The Executive Lead should be informed by the same material presented to the SPRG and any further relevant material produced since (for example, a description of new controls or proposal revisions made in response to SPRG comment or approval conditions).

The Executive Lead may also refer the proposal to the Expert Panel on Emergent Technology, if it has not already been referred by the New Technology Working Group, Emergent Technology Group, or SPRG.

The Executive Lead makes the decision on whether proposals are referred to the whole OCGG for consideration of endorsement. The Executive Lead (or OCGG) will decide whether NZP will proceed with trialling or adopting the technology.

If endorsed by the Executive Lead (or OCGG), the proposal may proceed within the approved parameters, subject to any other necessary approvals having been gained (such as financial authorisation) under any other applicable policies.

Note: It is anticipated that, in most cases, there will be a four-week period between the receipt of the proposal and an approval decision.
6. New Technology Working Group – Purpose and Membership
The New Technology Working Group is a semi-formal group, convened by the Manager: Emergent Technology, to support new technology assessment and governance approvals processes. Its advice will be provided to business owners on a consensus/shared accountability basis. Membership may vary but should include representation from:
• Chief Privacy Officer
• Chief Information Security Officer
• Māori, Pacific and Ethnic Services
• Legal
• ICTSC
• Evidence-Based Policing
• Risk and Assurance Group
• Other policing expertise relevant to particular proposals but arms-length from business owners, such as Response & Operations, Prevention, High-tech Crime Group, or District representatives (as appropriate)
The New Technology Working Group’s main purpose is to give initial consideration to a Technology Proposal and provide semi-formal feedback to the business owner. The New Technology Working Group is engaged early in the process so that it can provide internal Police expert perspectives and advise the business owner on:
• Whether, in their consensus view, the proposal falls within scope of the policy or not. This provides a second opportunity to triage very low risk proposals out of the process
• Whether the proposal is sufficiently well-developed to proceed, or whether the business owner should flesh out details in particular areas
• Advice to inform the policy risk assessment from a security, privacy, legal and ethical perspective, and against the guiding principles
• Whether supporting documents such as a Privacy Impact Assessment, Information Security Risk Assessment, te ao Māori or other expert assessment should be produced
  o The relevant expertise will be present in the Working Group and the necessary work can therefore be initiated immediately
• Whether or not the technology appears substantively to involve the use of an algorithm, and whether the guidelines for algorithm development and life-cycle management for algorithm developers should also be followed
• Any other relevant advice – for example, whether a similar proposal has recently been considered and the outcome of that consideration

Advice of the Group should be formally recorded for purposes which could include policy evaluation, research, audit, and accountability. These records may be required to be produced at a later time.
7. Expert Panel on Emergent Technology
The Panel is an advisory body convened to give independent advice on proposals referred to it
by Police, in the form of recommendations and guidance for the consideration of the
Commissioner of Police. The Emergent Technology Group, SPRG and OCGG can refer proposals to the Panel at any point in the proposal process, where the Panel’s advice will be helpful to inform decision-making.
The Panel provides expert scrutiny, review, and advice on new technology, which is a key part of
providing assurance within Police, and reassurance to the wider public, that privacy, ethical, and
human rights implications have been taken into account before decisions are made to trial or
adopt new technology capabilities.
The Panel is also responsible for advising Police on whether proposed algorithms are in line with the Algorithm Charter for Aotearoa New Zealand5, to ensure privacy, human rights and ethics interests are appropriately safeguarded and any unintended consequences are identified.
The Panel’s review work and advice to the Commissioner of Police is expected to consider
consistency with Te Tiriti, proactive partnerships with Māori, and implications for Māori, Pacific and
Ethnic communities.
The Panel comprises an independent Chair and up to five other independent members. Panel members will collectively have expertise in privacy, ethics and human rights matters; data and technology; te ao Māori and an understanding of Māori data sovereignty issues; and public policy.

5 www.police.govt.nz/about-us/programmes-and-initiatives/police-use-emergent-technologies/algorithm-charter
8. Technology Proposal, Policy Risk Assessment, and Algorithm Guidelines
Technology Proposal
The Technology Proposal is a summary of the proposed new technology use. A specification for
this document is contained in the shaded box below.
As described in the process overview, the Technology Proposal will be used as a basis for assessing whether a Privacy Impact Assessment and/or Information Security Risk Assessment is required to inform the decision-making process. It is also the basis on which a Policy Risk Assessment will be conducted.
Technology Proposal Document
A Technology Proposal should include at least the following headings and information.

Technology description
• What is the technology and how does it work? This should include an overview of the technical functionality, including a description of data sources where relevant. If the technology mainly relies on an algorithm to analyse data (e.g. to assess risk, make decisions, or produce recommendations for staff action) this should be specifically noted.
Necessity
• What do Police need to be able to do (or do significantly better), that they can’t do now? Outline how the proposal supports and achieves the strategic direction of the organisation. Make specific linkage to Our Business and the Executive SPT. What existing policing capability gap is the technology intended to bridge? This should be a brief statement that describes an existing policing challenge or shortfall: for example, in meeting a public interest in, or expectation of, service delivery or harm prevention in a specific area.
Use case
• What is the technology proposed to be used for? This should be a description of the specific purpose, or kinds of situations in which the technology is intended to be used (such as types of crime being investigated, or operational situations where the technology would be employed). This should include an outline of the proposed ‘end state’ deployment of the technology, as envisaged if a trial is successful.
Engagement
• How have Te Tiriti o Waitangi and a te ao Māori perspective been considered in the design and proposed use of the technology? How have Pacific and other communities been considered?
Controls
• How is it proposed to ensure that the technology is not used beyond its intended use case? This could include, for example, policy guidance, legislative or regulatory guidance, approvals processes, reporting and audits.
Proportionality
• Consider the policing requirement versus the implications for an individual or community. Can the proposed solution be justified against the impact on people’s privacy or other rights/expectations of fairness (e.g. use of their data, surveillance of lawful activity, perceived ‘targeting’); and in terms of the likely initial and ongoing financial/resourcing costs to Police (to the extent the approximate scale of such costs may be known)? Briefly describe any such impacts and costs and how they are justified, having regard to the above (necessity, use case, controls). Reference to the Principles may help identify possible impacts.
Trial and evaluation proposal
• What are the parameters for the proposed trial (for example, how many users/devices, in what locations, and for how long), and how is the trial proposed to be evaluated? This should include a description of how the trial will be determined to be a success.
How will the technology be funded?
• Will this be funded within the Business Owner’s allocated funding, or will this require an Investment Proposal through to Business Case? Advise options for the testing/trialling stage and, should the technology be implemented, consider ongoing costs to maintain the technology. The implications of any proposal on investments, finances, staffing, training, procurement and IT should be transparent. Give as much relevant information as possible.
Policy Risk Assessment
A Policy Risk Assessment (PRA) is the structured assessment of the Technology Proposal against the Principles. The PRA should state whether (and, if so, how) the proposal aligns with each of the statements contained in the Principles, including whether the proposal is proportionate and ethical. This analysis will be informed by the Technology Proposal and any further relevant information, including any Privacy Impact Assessment / Information Security Risk Assessment produced, and any supplementary specialist advice received (for example, advice from a te ao Māori perspective).

The PRA will form a key part of the advice to the Governance Group and should therefore be presented in full to support any recommendations made.
The PRA template requires an assessment against each and every statement in the Principles
(even if the assessment is simply “Not applicable”). It is important to demonstrate for the record
that the full spectrum of possible issues has been actively considered.
The PRA template contains guidance in the righthand column. This is not intended to be
prescriptive, and the assessor should apply their own judgement as to what is relevant or not in
order to demonstrate how compliance is achieved, and under what conditions (if any).
In time, as PRA assessors, consumers of the advice, and the wider organisation become more
adept at applying the framework, lower risk or less significant proposals might be satisfactorily
assessed against a briefer PRA which allows for aggregated commentary against each of the 10
Principles, to highlight only the most salient issues bearing on decision-making.
General guidance for completing a PRA
• Each cell in the righthand column of the completed PRA should describe, in a few sentences (2-5), how the proposal complies with the corresponding statement. In some cases a slightly longer commentary may be necessary. If significantly more lengthy explanation is warranted, consider appendices.
• Commentary should include any conditions/qualifications to the assessed compliance, including any further work that is recommended to align with the principle.
• Commentary should reference source documents (such as the Technology Proposal or Privacy Impact Assessment) wherever possible, to demonstrate the basis on which compliance has been assessed.
• The PRA is an accountability document and is a key foundation for governance decision-making. As such, if the evidence does not clearly support an assessment of alignment with a given principle, invite the proposer to produce more evidence; and if doubt remains, ensure that doubt is clearly recorded in the PRA.
Colour coding
Shade each box to provide a rapid visual aid for checking compliance with the principles:
• Green means full compliance.
• Orange indicates conditional or qualified compliance, or compliance subject to completion of further work.
• Red indicates that, at the time of assessment, the proposal could not demonstrate compliance with the relevant principle.
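As an illustrative sketch only (the actual PRA template is an internal Police document, and the field and class names below are assumptions), a single statement-level row of a PRA, together with its colour rating, could be modelled along these lines:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Rating(Enum):
    """Colour coding described above."""
    GREEN = "full compliance"
    ORANGE = "conditional or qualified compliance, or subject to further work"
    RED = "compliance not demonstrated at the time of assessment"


@dataclass
class PrincipleStatementAssessment:
    """One righthand-column cell of a hypothetical PRA."""
    principle: str                  # e.g. "Principle 6: Privacy"
    statement: str                  # the statement being assessed (or "Not applicable")
    commentary: str                 # 2-5 sentences on how the proposal complies
    source_documents: list = field(default_factory=list)  # e.g. Technology Proposal, PIA
    conditions: Optional[str] = None  # conditions/qualifications or recommended further work
    rating: Rating = Rating.RED       # default until the evidence supports a higher rating


# Example (hypothetical) row:
example = PrincipleStatementAssessment(
    principle="Principle 6: Privacy",
    statement="The technology incorporates privacy by design in data sourcing, use, retention and storage",
    commentary="Privacy by design is addressed in the Technology Proposal; the PIA recommends "
               "additional retention controls before adoption.",
    source_documents=["Technology Proposal", "Privacy Impact Assessment"],
    conditions="Retention controls to be implemented before adoption",
    rating=Rating.ORANGE,
)
```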
Adherence with Guidelines for algorithm development and life-cycle management
While, technically, algorithms are utilised to greater or lesser degree in every computerised
technology, they are usually incidental to the policing capability delivered by the technology.
Some technologies, however, rely substantively on algorithms that may even have been programmed specifically to deliver or enable a policing function or capability. For example:
• Algorithms which analyse data to generate risk assessments, prioritise interventions, make decisions, or recommend actions
• Algorithms which search data to identify possible evidence or persons of interest, or scrape/extract/aggregate data of potential interest
In cases where an algorithm is central to the capability of a technology, appropriate design, use
and performance of the algorithm are key to assessing the proportionality of the policing
capability it enables. As well as assurances within the design and performance of an algorithm, it
is important to maintain human oversight to mitigate any risk of inaccuracy within algorithm-
informed decision-making.
The business owner is required to make an initial judgment of whether a technology is algorithm-dependent as part of the Technology Proposal, and as part of adherence to the Guidelines for algorithm development and life-cycle management and compliance with the Algorithm Charter for Aotearoa New Zealand. The Guidelines cover the design, use, and management of the algorithmic technology, such as how data is gathered and tested, how inaccuracies and bias are addressed, how the data will be used throughout the technology’s lifecycle, and how it will be monitored. The Business Owner must complete NZP’s Algorithm Questionnaire for algorithm lifecycle management to demonstrate compliance with these guidelines and best practice. There are two checklists available:
• A questionnaire for use and approvals of internally developed algorithms, to be completed by the proposer/business owner
• A questionnaire for use and approvals of third party algorithms, to be completed by the proposer/business owner and the supplier.

This questionnaire will be further considered by the Emergent Technology Group, with advice provided to the Governance Groups accordingly. Based on that advice, governance approvals for technologies that depend on algorithms will be conditional on adherence to the Guidelines. Both the safety of the algorithm and alignment with the Principles must be considered.
9. Process following approval of technology
Proposals to deploy technology after successful trial
Trial and evaluation of a technology proposal are likely to provide additional evidence about
possible privacy and other risks, and the effectiveness of control and mitigation strategies. Trial
experience might also result in refinements to proposed limitations on the uses of a particular
technology.
Therefore, a proposal to adopt for operational use or more widely deploy a technology, following a successful trial, should also be subject to the process described by this policy. The process of seeking further approval will require updating of materials that were produced to support the trial proposal: in particular, to ensure that decisions on final adoption are cognisant of evaluated trial outcomes and any adjusted use parameters or controls. In most cases, the mandated documentation will have been created to support the earlier trial proposal. This documentation will simply need to be updated to reflect any changed parameters and evidence gathered through trial and evaluation.
If evaluation has been positive, progression through the new technology governance approvals
process a second time should not in most cases delay progress in parallel through other
processes associated with full-scale adoption (such as development of a business case and/or
investment proposal).
Post-approval monitoring and oversight
Securing the SPRG approval and OCGG endorsement via the five-step process completes the
principled decision-making process established by the Framework. However, maintaining public
trust and confidence requires that Police is able to continue to demonstrate trustworthiness in the
use of technologies once approved for trial or adoption. The approvals process reflects this
through its focus on ensuring appropriate trial evaluation and use controls form part of the
Technology Proposal.
At an implementation level, ongoing assurance of good stewardship of policing technologies (including assuring compliance with the approved parameters of trial or use) requires maintenance of centralised records. Police has previously committed to establishing and maintaining a centralised ‘stocktake’, and the advice from Taylor Fry makes similar recommendations.
The centralised record should capture all proposals (including those that are assessed as not requiring governance approvals); record the governance decisions on them (whether approval was granted or not); and also serve as a platform to enable monitoring of trial progress and evaluation, and support lifecycle management of technologies (including algorithms) once adopted and deployed. Monitoring and lifecycle management are likely to include assuring that trial or use takes place within approved parameters and subject to any conditions imposed, and scheduling of regular reviews of a technology’s performance against its intended purpose. It may also include scheduling of any associated reporting or other assurance loops that formed part of the approved proposal.
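A minimal sketch, assuming hypothetical field names (the Framework does not specify the structure of the stocktake), of the kind of information such a centralised record could hold for each proposal:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class StocktakeEntry:
    """One entry in a hypothetical centralised technology stocktake."""
    proposal_name: str
    business_owner: str
    in_scope_of_policy: bool                # out-of-scope proposals are still recorded
    governance_decision: Optional[str]      # e.g. "approved", "declined", or None if pending
    approval_conditions: list = field(default_factory=list)   # conditions attached to approval
    approved_parameters: str = ""           # approved trial or use parameters
    trial_status: Optional[str] = None      # e.g. "in progress", "concluded", "evaluated"
    scheduled_reviews: list = field(default_factory=list)      # dates of performance reviews
    assurance_reporting: list = field(default_factory=list)    # reporting/assurance loops from the proposal
```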
In the case of proposals to trial a new technology, the Manager: Emergent Technology should
therefore be kept informed of trial progress, conclusion of the trial, and evaluated outcomes.
If a proposal to adopt (or operationally deploy) a technology is approved under this policy, the Manager: Emergent Technology should be kept informed of any changes in use, withdrawal of the technology, or other developments.

Any proposal to alter the way in which technology is used, after it has been adopted, is likely to have a material impact on the Policy Risk Assessment (including, for example, assessment of Proportionality) and will require further governance approvals based on updated documentation.