Councils Again Ignore Privacy
It has been revealed that the Toowoomba City Council is trialling behavioural recognition technology connected to CCTV cameras.
The technology is intended to analyse the behaviour of people in front of the camera. People the machine deems “abnormal” are then liable to be harassed by authorities.
“How would you feel if you were innocently minding your own business in the mall and were approached by the police in front of everyone because a machine thinks you look odd?”
Mr Cope, President of the QCCL, said today: “This sort of technology is straight out of 1984. We call upon the Council to say exactly what steps it is taking to protect privacy and whether it has consulted with the Privacy Commissioner.”
When it comes to CCTV, Queensland councils have consistently behaved as if the Information Privacy Act does not apply to them.
In the United States, behavioural recognition technology has been demonstrated to be racially and ethnically biased, with a tendency to over-select racial minorities.[1]
“The algorithms that underlie this technology have at their heart arbitrary concepts of what is normal. People are selected for attention by authorities on the basis of their supposedly abnormal behaviour or appearance,” Mr Cope said.
Mr Cope said, “We would like to know what data from Council records is being fed into this software and what data is being fed back into the databases. We object to the creation of vast databases.”
Behavioural recognition is already in use at American airports, where human beings make the assessment. This has resulted in many innocent people being harassed by security staff.
The problem is that when a machine makes the assessment, the operator is under pressure to act on every alert, lest they be accused of negligence. The result is that the number of innocent people accosted by the authorities will go up.
Mr Cope said, “We call upon the Councils involved to undertake full consultation with the community before introducing this very concerning technology. It should be the subject of a full privacy impact assessment.”
[1] Clare Garvie and Jonathan Frankle, “Facial-Recognition Software Might Have a Racial Bias Problem”, The Atlantic, 7 April 2016.