Technology, consent and privacy

Tonight my fellow faculty member at Academy of the Impossible, Ramona Pringle, hosts an awesome-sounding event (which I sadly cannot attend) called The Connection Paradox: Creating a Social Blueprint. The event’s purpose is to flesh out an idea of how we want to live with technology and with each other. Since I can’t be there, she asked me to send her a few thoughts on the issue, and given the things I often blog about, my mind automatically drifted to gender, privacy and consent as they relate to technology. Here are some of my jottings on the subject.

The dominant fear-mongering rhetoric around privacy and technology has given people an excuse to violate others’ digital privacy in ways they likely know (at least in the back of their minds) are immoral. Because the prevailing wisdom is “nothing is private in digital space”, our culture’s collective response to things like the sharing of intimate images is “well, she shouldn’t have sent him the photo in the first place”. This collective response flies in the face of the distinction between one-to-one and one-to-many communication. If a person chooses to send something via a one-to-one channel (or one-to-a-select-few, such as a group chat with two others), the tacit message is that the communication is for this person (or these people) alone. If a communicator wants a thought or image to be widely shared via one-to-many, they will share it themselves. I believe that when people distribute intimate images that were texted or emailed to them alone, they know in their hearts that they are hurting and violating the other person, but our collective wisdom justifies their decision to share. That collective wisdom is what needs to change.

I feel this is a manifestation of some people’s belief that technology is an “objective” entity that functions and self-moderates automatically as its own animal, independently of human beliefs, values and behaviour. It strikes me that this is not unlike how people tend to think about the “free market”. I so disagree with this characterization. We are technology – we are the ones who make it, who use it, who decide how it can and should be used (though of course, we aren’t able to predict and control that use absolutely). We can set new and different standards of behaviour and use. IMHO, one of those “new” standards should be what is really a pretty basic/ancient moral code: if someone tells you something privately (especially if aspects of it are sensitive and/or could be harmful to anyone, whether they are involved or not), then unless the communicator asks you to share or it’s in the public interest to share, keep it to yourself.

How to implement this standard? I dunno (sorry). It’s a complex cultural issue. In the case of revenge porn I believe the problem is also shrouded in misogynistic ideology that privileges public access to bodies (especially women’s bodies). I do think it would help to start teaching kids about consent as an important subset of how we educate them about privacy in a technology context. This teaching doesn’t have to apply solely to digital violations of a sexual nature (like revenge porn). For example, consider a situation in which a teenager confides to a friend about their crush via one-to-one chat, and the friend posts a Facebook status about it.

Consent should be an integral part of how we educate about privacy, but I think many parents and educators (not to mention the media) would be hesitant to teach it. Why? Because it might in some ways qualify or mitigate (and perhaps in some folks’ eyes, undermine) the dominant, hand-wringy messages about BEING CAREFUL WHAT YOU POST because NOTHING IS PRIVATE ANYMORE. But I think it’s necessary. Educating about privacy shouldn’t just be about protecting our own privacy, but also about not violating the privacy of others. And this learning should start early.

3 thoughts on “Technology, consent and privacy”

  1. Is this justice for Rehtaeh? – Steph Guthrie

    • Tough question, and would probably be a “case-by-case basis” kind of thing. But I’d say if harm will be done to others by the info NOT being shared, that’s one thing that would clinch it for me (e.g. information about abuse, information about pollution of drinking water, that kind of thing).
