A Penny For Your Bytes: Next Gen Rights Defenders

  • A flawed understanding of data rights leaves consumers to fend for themselves
  • In a world where software is king, developers hold a great deal of responsibility, and yet…

I swear to fulfill, to the best of my ability and judgment, this covenant:

[...]

I will respect the privacy of my patients, for their problems are not disclosed to me that the world may know. Most especially must I tread with care in matters of life and death. If it is given me to save a life, all thanks. But it may also be within my power to take a life; this awesome responsibility must be faced with great humbleness and awareness of my own frailty. Above all, I must not play at God.

I will remember that I do not treat a fever chart, a cancerous growth, but a sick human being, whose illness may affect the person's family and economic stability. My responsibility includes these related problems, if I am to care adequately for the sick.

I will prevent disease whenever I can, for prevention is preferable to cure.

I will remember that I remain a member of society, with special obligations to all my fellow human beings, those sound of mind and body as well as the infirm.

If I do not violate this oath, may I enjoy life and art, respected while I live and remembered with affection thereafter. May I always act so as to preserve the finest traditions of my calling and may I long experience the joy of healing those who seek my help.

Modern Hippocratic Oath - Louis Lasagna

The current technology paradigm is faulty at its core. As we discussed in the previous episode, a flawed understanding of the nature of data has led us down a path where we are not proactively protecting citizens and their digital twins, the models created from all the data we frantically and insatiably collect about them.

Let’s picture for a moment two doctors meeting to discuss an oncology treatment. We are talking taxonomies here: technical medical language describing the illness and its characteristics, the treatment drugs, the targeted systems, side effects, reactions, the devices involved in delivery and monitoring, and so on.

Do we see patients engaging in those conversations? No, we don’t. We leave the treatment design to the experts and, at most, we make the conscious decision to go down this or that path once we have been duly informed. The technical details? None of our business. Doctors take an oath to proactively protect their patients’ lives to the best of their abilities, and they can do so because they have the language to describe the illness, how to eliminate it and everything in between.

Now imagine a similar scenario with architects, car engineers or any other profession dealing with complex systems. It’s not hard to see that we understood long ago that we need to leave the complexities to the experts and concentrate on being responsible citizens in our use of the technology they produce for us.

What about digital security? As citizens and technology consumers we are forced to be the ones making all sorts of technical decisions to “protect ourselves”.

A recent Google Media Release points out that “Almost 3 in 5 people have experienced a personal data breach or know someone who has, yet 93% persist with poor password practices”.

First of all, the framing is rich: it’s your fault for creating weak passwords, and it’s even more your fault that you double down on those poor passwords by repeating them across platforms, or that you don’t use multi-factor authentication (aka 2FA/MFA). It’s also your fault if your car manufacturer built a poorly designed key system and you didn’t take the initiative to upgrade it yourself. Or if you entered a building and died in a fire because you didn’t check for yourself that the fire sprinklers were defective. Of course it is.

What’s inherently wrong with this narrative is that the burden of vigilance is constantly thrown onto citizens’ shoulders. This is eminently not a sustainable approach, and it is the core reason why people can’t be bothered with how their data is extracted and manipulated: it’s too much, and it requires a degree of technical proficiency that only a trained expert possesses.

The obvious question to ask here is: who builds all this stuff we are so concerned about? To no one’s surprise, the answer is technologists; in particular, software developers hold a great deal of responsibility, as any tech device is always interacted with through some sort of software interface.

Unfortunately, programmers do not currently have the necessary language to proactively protect citizens. There is, for instance, no agreed-upon taxonomy of the digital harms that data (in the form of our digital twins) can be subject to. How can they design systems that protect citizens if they can’t define those potential problems so as to avoid them in the first place? How can we inspire them to adhere to a technological Hippocratic Oath if we can’t even define what it is they need to protect? And yet we desperately need their active participation to ensure technology protects us at all times.

It is no surprise that at The IO Foundation we regard programmers as the NextGen Rights Defenders. They hold the keys to better and safer technology that will ensure both our Human and Digital Rights, and we need to work on updating their educational pipeline with the proper technical language (taxonomies, among other things) for them to embrace this role.

So what’s next?

In the next episode we’ll dive into TIOF’s DCDR Principles and how developers can start changing their daily paradigm and embrace their role as NextGen Rights Defenders to build technology that is protective by design.


Jean F. Quéralt founded The IO Foundation in 2018 as an organisation dedicated to promoting, protecting and providing solutions for Digital Rights.

 
