Our responsibilities towards data privacy, with Ben Byford, on Data Protection Day

To celebrate Data Protection Day and raise awareness of data privacy, we invited Ben Byford, a member of the Cambridge Spark faculty, to share his professional thoughts on data protection, privacy and AI ethics.

You wake up in the morning. You check your phone, then ask Alexa to play the radio. Maybe you go for a run, track your weekly record, maybe even listen to a podcast (The Machine Ethics Podcast perhaps?). Welcome to the 21st century, where we exhaust data. It is hoovered up and processed so that hundreds of companies can make predictions about us.

Constantly.

As Carissa Véliz puts it at the beginning of her book 'Privacy Is Power', “can we be sure that your data will not be used against you?” What does that data say about you? Who actually owns your data? And indeed, what assumptions are people making about us all from our data?

The idea of privacy is now synonymous with this type of personal data: the data we co-create and share with companies. Our data can predict how likely we are to vote, and for whom, or be taken to confirm a disposition to commit future crimes (Minority Report writ large). In this new world, privacy is the concern of everyone.

Before personal computers and smartphones, personal privacy already featured in the Universal Declaration of Human Rights, created after World War II. For example, it states that no one should “be subjected to arbitrary interference … nor to attacks upon his [sic] honour and reputation.” This inclusion is important because a lack, or unequal distribution, of privacy tends to erode other human rights, like the “right to freedom of thought, conscience and religion”, “the right to freedom of peaceful assembly and association”, and “the right to freedom of opinion and expression”. It is much easier to suppress people when they are telling you what they think, what they like, and what they are doing. It’s also much easier to sell directly to them, or to nudge them into thinking something you want them to think. It can become a slippery slope.

Having enough privacy facilitates the conditions for democracy, self-actualisation, surprise and delight, as well as forgetting and, indeed, growing as an individual. I find this particularly important in an era where data, or 'facts', about a person can be collected and enshrined digitally for an aeon. What were we doing on those forums 20 years ago that might be brought to bear when we apply for our next jobs? What judgements does that data impose on the different people we are now?

As workers in companies, we must be worthy of trust and hold ourselves to the highest scrutiny in protecting data and using it ethically. As data scientists and designers of services, we routinely collect and use data, often personal data. Data has huge potential, helping us discover new phenomena and cure and protect people. But it can also be manipulated, misused, and hacked. Though the GDPR has gone a long way towards providing a framework for personal data use and security, it is still important to strive for privacy and responsible data usage, especially in fast-moving sectors and technologies. We must not let the little trust we have left run out.

As Mark Coeckelbergh neatly puts it in his book 'AI Ethics': “An ethical use of AI requires that data are collected, processed, and shared in a way that respects the privacy of individuals and their right to know what happens to their data, to access their data, to object to the collection or processing of their data, and to know that their data are being collected and processed and… that they are subject to a decision made by an AI.”

As a data scientist, I use data every day, coming up with hypotheses and predictions, and may even lean on methodologies like Privacy by Design or Value Sensitive Design in my workflows, all in order to create predictions, insights and new services. We mustn’t lose sight of why privacy and data protection are important, trading them away purely to drive profits or overcome social pressures. The digital information revolution isn’t going away, so we must try to shape our future into one that upholds our rights, the rights we wish to live by.
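To make Privacy by Design a little more concrete, here is a minimal Python sketch (not from the article itself) of one such habit: minimising and pseudonymising personal data before it ever enters an analysis workflow. The field names, sample records and salt are hypothetical, chosen purely for illustration.

```python
import hashlib
import hmac

# Hypothetical secret salt: in practice, load it from a secrets manager
# rather than hard-coding it next to the data.
SALT = b"replace-with-a-secret-salt"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed
    hash, so records can still be linked without revealing who they
    belong to."""
    return hmac.new(SALT, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical raw records. Data minimisation: only the fields the
# analysis actually needs are collected in the first place.
raw_records = [
    {"email": "alice@example.com", "weekly_runs": 3},
    {"email": "bob@example.com", "weekly_runs": 1},
]

# What the analysis workflow receives: pseudonymous IDs, no emails.
analysis_ready = [
    {"user_id": pseudonymise(record["email"]), "weekly_runs": record["weekly_runs"]}
    for record in raw_records
]

print(analysis_ready)
```

The design choice worth noticing is that privacy is applied at the boundary, before analysis begins, rather than bolted on afterwards; that is the essence of Privacy by Design.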

About Ben Byford:

Ben Byford runs the Machine Ethics Podcast, and teaches, consults, writes and talks on AI and ethics. Ben lives in Bristol and can be contacted at hello@ethicalby.design


Notes to editor: 

For more information, please contact Josie Jakub, Head of Marketing, by email at josie@cambridgespark.com or visit our website.

About Cambridge Spark:

At Cambridge Spark, each of our courses teaches the best practices of dealing with data, including data protection and the ethics of artificial intelligence. We work with specialist teachers and run a talk series with external leaders that focuses on data best practices.

Cambridge Spark is an education technology company that enables organisations to achieve their business goals by educating their workforce in Data Science and Artificial Intelligence. Cambridge Spark is the only specialist data science apprenticeship provider in the UK, offering a full-stack skills solution for data science at every level of an organisation, all powered by a cutting-edge modular curriculum, expert teaching and engaging learning technology, EDUKATE.AI®.

Enquire now

Fill out the following form and we’ll contact you within one business day to discuss and answer any questions you have about the programme. We look forward to speaking with you.


Talk to us about our Data & AI programmes