The title of my chapter is “(mis)gendering”. When the editors chose this and offered it to me, I’m not sure what they expected. But I chose to interpret it literally – as a reference to the instability of gender, and to the difficulty of drawing a clear line between gendering and misgendering; as a reference to Lauren Berlant’s idea that “recognition is the misrecognition you can bear”.
So, that’s what the chapter is about: what misrecognition you can bear, and how what you can bear changes from situation to situation. More computationally oriented people, and the systems they produce, want a single answer to how to recognise someone – in this case, how to gender them. Contradiction and context are things to be corrected. But for a lot of us, contradiction and context can be lifesaving. Gendering someone to their family or doctor as you would to their friends can make trouble, not recognition. It can get people hurt. And in this widespread datalogical drive towards what John Law calls a “one-world world”, technologists tend to forget that there are many worlds, and that collapsing the gaps between them, algorithmically or otherwise, can harm.
I wish I had some deeply insightful commentary on or reflection about this piece – but, truth be told, it feels like it was written by a very different person, and for a very different person. I am glad to have been that person, if it was a necessary prerequisite to who I am now – and I am glad to have written this chapter, and grateful to have been invited in the first place, too. When Nanna first offered me the chapter, she also offered me community, and conversation, and, well, recognition. As contextual and contingent as all those concepts are, I will forever be grateful for them, whether the book sells a million copies, or ten.
Maybe that, then, is the reflection, and the lesson. That this book – the collaborative, relational process through which it has been written – is a prefigurative act, and a model for what making sense of, adapting to, and resisting the datalogical turn might look like.