When reading the terms and conditions isn’t enough

Written by

Privacy Foundation NZ

Published on

Commentary and Articles

Everyone knows it’s important to read the small print before signing up to something – but how many of us actually do it? Ts and Cs aren’t always that user-friendly, particularly if you’re trying to read them on the small screen of your phone. Even if we can read them, understanding what they’re on about isn’t always straightforward. And if we’re in a hurry, we’ll almost invariably click the “I Agree” button without actually looking at the document carefully (or at all).

As more and more decisions about us are made by computers rather than by humans, it becomes even less obvious how those decisions are made and whether they’re right. Reading the terms and conditions won’t necessarily help that much either. That’s why the Privacy Commissioner recently commented on Stuff that the new Privacy Act could include a provision requiring agencies to properly explain the basis for automated decisions.

We also think a new provision along these lines is needed. It’s another example of where our current Privacy Act is falling behind the privacy legislation of our major trading partners (particularly Europe). Algorithms can create or enhance biases. They may lead to wrong results, with significant effects on people’s lives (for instance, see the problems with the Centrelink debt recovery programme in Australia, where human checks had been taken out of most of the process). Unless the basis for an automated decision has to be explained – such as spelling out what information was used in making that decision – people may be powerless to get mistakes fixed before it’s too late.

Of course, the exact shape of the legal provision will need some further thought. Disclosing a formula in itself doesn’t tell most of us anything, and there can be issues with commercial confidentiality too. We also need to make sure we’re not going to unjustifiably restrict innovation, for instance in the artificial intelligence field. But the new European provision is a good starting point for the debate about what we should have in our new legislation.

In the Stuff article, the Commissioner also commented that businesses still need to have a valid purpose for collecting the information in the first place. This is completely right, and it’s something that’s often overlooked in the dash for data. Having terms and conditions that say people consent to what you want to do doesn’t get a business off the hook for many of the privacy rules. People can’t ‘consent’ to a breach of most of an agency’s privacy obligations. These include the obligation to collect only the information the business genuinely needs to do the job, and the requirement to take reasonable steps to check that information is right before it gets used.

So it will be important to make very sure your algorithm is up to scratch.

Katrine Evans