
Transparency

Technology must be transparent in order to be knowable. Technology must be knowable in order for us to consent to it in good faith. Good-faith informed consent is necessary to preserving our (digital) autonomy.

Let’s now look at this in reverse, considering first why informed consent is necessary to our digital autonomy.

Let’s take the concept of our digital autonomy as being one of the highest goods. It is necessary to preserve and respect the value of each individual, and of the collectives we choose to form. It is a right to which we are entitled by our very nature, and a prerequisite for building the lives we want, lives that fulfill us. This is something we have generally agreed on as important or even sacred. Our autonomy, in whatever form it takes, in whatever part of our life it governs, is necessary and must be protected.

One of the things we must do in order to accomplish this is to build a practice and culture of consent. Giving consent — saying yes — is not enough. This consent must come from a place of understanding of that to which one is consenting. “Informed consent is consenting to the unknowable.”(1)

Looking at sexual consent as a parallel: even when we have a partner who discloses their sexual history and activities, we cannot know whether they are being truthful and complete. Even if they are, and we can trust that they are, there is a limit to how much even they know about their own body, health, and experience. They might not know the extent of their other partners’ experience. They might be carrying HPV without symptoms; we rarely test for herpes.

Arguably, we have more potential to definitively know what is occurring when it comes to technological consent. Technology can be broken apart. We can share and examine code, schematics, and design documentation. Certainly, lots of information is being hidden from us — a lot of code is proprietary, technical documentation is unavailable, and the skills to process these things are treated as special, arcane, and even magical. Tracing the resource pipelines for the minerals and metals essential to building circuit boards is not possible for the average person. Knowing the labor practices of each step of this process, and understanding what those imply for individuals, societies, and the environments they exist in, seems improbable at best.

Even though true informed consent might not be possible, it is an ideal towards which we must strive. We must work with what we have, and we must be provided with as much information as possible.

A periodic conversation that arises in the consideration of technology rights is whether companies should build backdoors into technology for the purpose of government exploitation. A backdoor is a hidden vulnerability in a piece of technology that, when used, would afford someone else access to your device or work or cloud storage or whatever. As long as the source code that powers computing technology is proprietary and opaque, we cannot truly know whether backdoors exist and how secure we are in our digital spaces and even our own computers, phones, and other mobile devices.

We must commit wholly to transparency and openness in order to create the possibility of as-informed-as-possible consent and, in turn, to protect our digital autonomy. We cannot exist in a vacuum, and practical autonomy relies on networks of truth to provide the opportunity for the ideal of informed consent. These networks of truth are created through the open availability and sharing of information about how and why technology works the way it does.

(1) Heintzman, Kit. 2020.

Digital Self

When we talk about the digital self, we are talking about the self as it exists within digital spaces. This holds differently for different people, as some of us prefer to live within a pseudonymous or anonymous identity online, divested from our physical selves, while others consider the digital a more holistic identity that extends from the physical.

Your digital self is gestalt, in that it exists across whatever mediums, web sites, and services you use. These bits are pieced together to form a whole picture of what it means to be you, or some aspect of you. This may be carefully curated, or it may be an emergent property of who you are.

Just as your physical self has rights, so too does your digital self. Or, perhaps, it would be more accurate to say that your rights extend to your digital self. I do not personally consider there to be a separation between these selves when it comes to rights, as both are aspects of you and you have rights. I am explicitly not going to list what these rights are, because I have my own ideas about them and yours may differ. Instead, I will briefly talk about consent.

I think it is essential that we genuinely consent to how others interact with us in order to maintain the sanctity of our selves. Consent is necessary to the protection and expression of our rights, as it ensures we are able to rely on our rights and creates a space where we are able to express them in comfort and safety. We may classically think of consent as it relates to sex: only we have the right to determine what happens to our bodies; no one else has the right to that determination. We are able to give sexual consent, and we are able to revoke it. Sexual consent, in order to be in good faith, must be requested and given from a place of openness and transparency. For this, we discuss with our partners the things about ourselves that may impact their decision to consent: we are sober; we are not ill; we are using (or not) protection as we agree is appropriate; we are making this decision because it is a thing we desire, rather than a thing we feel we ought to do or are being forced to do; as well as other topics.

All of these things also hold true for technology and the digital spaces in which we reside. Our digital autonomy is not the only thing at stake when we look at digital consent. The ways we interact in digital spaces impact our whole selves, and exploitation of our consent likewise impacts our whole selves. Private information appearing online can have material consequences — it can directly lead to safety issues, like stalking or threats, and it can lead to a loss of psychic safety and have a chilling effect. These are in addition to the threats posed to digital safety and well-being. Consent must be actively sought, what one is consenting to must be transparent, and the potential consequences must be known and understood.

In order to protect and empower the digital self, to treat everyone justly and with respect, we must hold the digital self to be as sacrosanct as other aspects of the self and treat it accordingly.

Consent

I was walking down the platform at the train station when I caught eyes with a police officer. Instinctively, I smiled and he smiled back. When I got closer, he said “Excuse me, do you mind if I swipe down your bag?” He gestured to a machine he was holding. “Just a random check.”

The slight tension I’d felt since I first saw him grabbed hold of my spine, shoulders, and jaw. I stood up a little straighter and clenched my teeth down.

“Sure, I guess,” I said uncertainly.

He could hear something in my voice, or read something in my change of posture. “You have to consent in order for me to be allowed to do it.”

Consent. I’d just been writing about consent that morning, before going to catch the train down to New York for Thanksgiving. It set me on edge and made more real what was happening: someone wanted to move into my personal space. There was now a legal interaction happening. “I don’t want to be difficult, but I’d rather you didn’t if you don’t have to.”

“It’s just a random check,” he said. “You don’t have to consent.”

“What happens if I say no?”

“You can’t get on the train.” He gestured to the track with his machine.

“So, my options are to let you search my bag or not go see my family for Thanksgiving?”

“You could take a bus,” he offered.

I thought about how I wanted to say this. Words are powerful and important.

“I consent to this in as much as I must without having any other reasonable option presented to me.”

He looked unconvinced, but swiped down my bag anyway, declared it safe, and sent me off.

Did I really have the right to withhold consent in this situation? Technically, yes. I could have told him no, but I had no other reasonable option.

At the heart of user freedom is the idea that we must be able to consent to the technology we’re directly and indirectly using. It is also important to note that we should not suffer unduly for denying consent.

If I don’t want to interact with a facial recognition system at an airport, I should be able to say no, but I should not be required to give up my seat, risk missing my flight, or spend exceptional effort as a consequence of refusing to consent. Consenting to something that you don’t want to do should not be incentivized, especially by making you take on extra risk or make extra sacrifices.

In many situations, especially with technology, we are presented with the option to opt out, but that doesn’t just mean opting out of playing a particular game: it can mean choosing whether or not to get a life-saving medical implant; not filing your taxes because of government-mandated tax software; or being unable to count yourself in a census.

When the choice is “agree or suffer the consequences,” we do not have an option to actually consent.