IAPP Panel Discussion - AI – designing for privacy and trust
Remarks by President Michael Cope to the panel discussion held by the International Association of Privacy Professionals in Brisbane on 29 March 2023
When it comes to information privacy, the dystopian nightmare is not that of George Orwell but of Kafka. In The Trial, the problem is the protagonist's powerlessness and vulnerability, created by the state's use of data and its denial to him of any knowledge of what information it holds about him, or of any capacity to participate in the process by which that information is used.
It is this nightmare we have come to discuss today, in the form of the use of 'AI' to make decisions that could affect a person's access to benefits or services, or could be used to judge whether or not they are law-abiding.
In what follows I draw heavily on the report of the House of Lords Justice and Home Affairs Committee entitled "Technology rules? The advent of new technologies in the justice system", published on 30 March 2022.
The Committee found, and I quote: "We uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with."
The evidence to the committee was that public bodies often do not know much about the systems they are buying and do not have the expertise to evaluate them.
Is there any reason for thinking that things are different here? I don’t think so.
Like the Committee, QCCL is concerned that the use of AI may adversely affect an individual's right to a fair trial: the accused may not be aware that the technology is being used, and even when they are aware, most accused with limited resources will not be able to fully understand the technology, let alone challenge its use.
Another use of AI in the criminal justice system is the use of predictive tools in sentencing. In 2016 the Wisconsin Supreme Court put limits on the use of such tools, holding that a predictive assessment could be considered in sentencing but could not be used (1) to determine whether an offender is incarcerated; (2) to determine the severity of the sentence; or (3) as the determinative factor in deciding whether an offender can be supervised safely in the community.
I turn now to discuss the regulation of this technology generally.
The House of Lords committee recommended the establishment of a body to supervise the use of these technologies, recommending that all such technologies should be subject to comprehensive impact assessments each time the tool is to be used for a new purpose.
The body should keep a public register of algorithms used in relation to the application of the law. Users should be able to find information about the technology being deployed, who is deploying it where, on what occasions, and for what purposes.
It is, in our view, fundamental that human decision-makers should retain overall control of these processes.
The UK General Data Protection Regulation provides that a person "shall have the right not to be subject to a decision based solely on automated processing, including 'profiling', which produces legal effects concerning him or her or similarly significantly affects him or her."
We need a similar law. But the humans involved must have 'meaningful' influence on the decision, including the 'authority and competence' to go against the recommendation of the tool.
I accept that humans themselves are biased and flawed, which is why we need the other safeguards I have mentioned.
It is our view that some technology can never be safely used; in this category we include live facial recognition in places to which the public has access, whether privately or publicly owned. In our view the use of live facial recognition technology in public places should be banned. As Silkie Carlo of the UK's Big Brother Watch has said, "Allowing cameras that can basically operate as anything between identity checkpoints and police line-ups in real time does not fit with any sensible conception of privacy."
In the final episode of his brilliant TV series Civilisation, Kenneth Clark, speaking of our increasing reliance on machines, said: "They have ceased to be tools and have begun to give us directions. And unfortunately, machines, from the Maxim gun to the computer, are for the most part means by which a minority can keep free men in subjection."
During the last 50 years some of us thought that perhaps this prognostication didn't apply to contemporary technology. The internet, in particular, was seen as an anarchic place where no one could rule. But it is now clear that, notwithstanding the enormous improvements these technologies have brought and will bring to our lives, they have the potential, if not properly controlled, to bring an end to some of our most basic civil liberties.