Human Rights in the Digital Age: What We Learned at RightsCon 2019


I was in Tunis last week for my first RightsCon, the global summit on human rights in the digital age. As I wandered amongst the crowd I had a Maori proverb stuck on repeat in my head: ‘What is the most important thing in the world? It is people, it is people, it is people’.

Our CEO Gawain and I have never been in such a friendly and culturally rich crowd. Nearly 3,000 participants from all over the world were busily flowing between the 450+ sessions that RightsCon put on over three days. The community is so accessible that it’s easy to start conversations with strangers, switching between casual chat and intense discourse. That, combined with the passion every participant shared for doing good, gave me the same feeling I’ve had on the sunbaked fields of music festivals: it’s called love.

Me ambling amongst the RightsCon crowd in Tunis

I have attempted to distil an overwhelming wealth of conversation and thought into a handful of broad insights. If you want to dig into any of them in more detail, get in touch.

The Values Already Exist

We already have the Universal Declaration of Human Rights. We already have commonly agreed ethics. Why should these not apply equally to the digital shadows of our physical selves? When considering how to act in technological environments, such as online media and computer interaction, start with our society’s existing rules for good practice and apply them on the premise that every data point, button press and notification is an interaction with a real human, who shares the same fundamental rights as you and I.

This concept is summed up well by a slogan I saw on a tote bag hanging from the shoulder of a RightsCon delegate from the Digital Frontier Fund: ‘digital rights are human rights’. That’s how we feel, and the sentiment is reflected in our story from earlier this year: Digital Humans Need Digital Ethics.

Indeed, a similar message was published by UN Special Rapporteurs on human rights while attending the conference, stating, ‘Digital space is not neutral space. [...] Human rights law provides standards which governments, companies and others engaged in the design and management of the infrastructure of our digital spaces should respect in their work’.

Regulation is Coming

Out of the Wild West-style, unfettered innovation of the internet era have emerged some very powerful tech companies that have often behaved as if they don’t recognise the link between digital entities, such as data, and the real humans connected to them.

Regulation is always slow to develop, lagging behind rapid changes in the world such as explosions of innovation, and that gap is unlikely to shrink in the coming years. It is therefore down to the pioneers to take ownership of the consequences of their choices and actions – they’re on their own. Groundbreaking innovation is one thing; continuing to do something unethical while you wait for someone in authority to come round and tell you off is a dick move.

Regulation has now started to arrive, with the likes of GDPR, and more is coming. Companies that have been abusing people in the digital realm will be punished for violating basic rights. That should be a strong incentive to put things right now, but it shouldn’t have to be the reason in the first place – doing the right thing is just the right thing to do.

Go for Gold Standards

Laws, norms and standards vary across many dimensions, such as by region or by circumstance. As the creator of a new product or service you can choose how your organisation will respect those different policies. For instance, you might vary your terms of service to suit the individual laws of each country in which you operate. However you do it, recognise that no law is perfect.

Organisations have the power and the opportunity to start from the best possible position when defining their values and terms of use. They can develop their own policies that are narrower or stronger than the existing laws or norms in each dimension of application. A few leaders (Microsoft was mentioned repeatedly in this respect) are setting a high bar by picking gold-standard principles from around the world and combining them into singular, global policy sets, applicable to all users in all uses. What’s more, being global and unified by default can bring the added advantage of making your tools and policies easier to deploy, update and maintain.

Tech has a Reputation Problem

The reckless exploitation of people and their data – by Big Tech, governments and others – has disillusioned a lot of people, eroding trust to a point of entrenched cynicism. Surely this is damaging to the rapid, explosive and disruptive innovation that tech pioneers pursue. So anyone working in this space must take some responsibility to win back that trust.

Similarly, leadership has traditionally been a desirable goal, driving young people to take their turns in new leadership roles and build the structures of the future. Tech leaders in particular have been heralded as modern heroes in recent years, but they are increasingly becoming villains in the popular narrative. A common perspective among young people now is that leadership is an ugly state to be avoided.

How will the next generation fare without anyone willing to take the wheel?

Responsibility is Multi-layered

There was a lot of discussion around who should be held responsible for any negative consequences of a technology. When something new is created, the initial designers should consider its potential effects, as should those who go on to build it and those who eventually put it out into the world or maintain it beyond that. But how should we apportion responsibility across that spectrum? The answer seems to be layered.

I repeatedly heard concern about the idea of holding developers and designers responsible, at the design phase, for moral judgement about the eventual effects realised from their creations. That isn’t fair. Certainly, the companies themselves – that is, their executive bodies – must take a large share of the responsibility, and assign qualified people and protocols to their activities to ensure best practice.

Wherever we place liability for the outcomes of our technologies, everyone in the chain – including designers, developers and all those who support them in their organisations – can and should partake in the moral discourse. Every one of them should recognise that they bear some responsibility. More than that, though, they have the power to act for good. The risk of taking a stand can be high, such as losing your job, but as a member of an organisation you are essentially a citizen of that community and must act for what you believe to be right, holding your employers to account.

Heading into the 'Beehive' at RightsCon in Tunis

...So What Now?

Ultimately, anyone acting in the digital realm has a duty to human rights, and to ethics more broadly, which they must fulfil on behalf of the humans in the loop. There are many ways in which this can materialise, but one thing that was frequently highlighted was the use of impact assessments, both in the design phase and at sensible intervals after release into the world.

Creators, such as companies, have a responsibility to embed rights and ethics into their processes; authorities, such as governments and industry bodies, have a responsibility to ensure that the creators do that; and everyone else, including the end user, has a responsibility to be considerate in their use of the new technologies they encounter.

If we accept that the digital world is tied to the physical world, then actors who are abusing digital rights are already committing human rights violations. They shouldn’t have to wait to be told that this is unacceptable. Instead, organisations can adopt a values-first approach to every interaction they encounter. I know, socially and ecologically responsible behaviour should be obvious, default, sacred. But even when it’s not, maybe, just maybe, it is becoming trendy anyway. And that might be a good thing for our little planet.


Ben Bland

Chief Operations Officer