I’ve been invited to open the Seattle University Department of Mathematics colloquium series, and am going along to talk about my work on trans-exclusionary AI! Things kick off at 3:45pm on 4 October in Bannon 401. Full blurb:
Machine learning has been heralded as the solution to a range of problems, from cancer screening to information security. As it has become more prevalent, researchers have identified severe biases in how algorithms are designed, along the axes of race, gender and disability, with tremendous consequences for those caught up in models’ limitations. The conventional solution is “just add data”: amend the model to include a wider range of people to ‘learn’ from. But what do you do when the problem isn’t the wrong data, but the wrong question? In this talk, Os Keyes will explore Automated Gender Recognition, a technology that purports to recognise a person’s gender through photographic analysis, and the model of gender it embeds. Using this as a case study in the application of machine learning to social contexts, they will discuss the consequences this technology has for transgender people and the limitations inherent in taking a quantitative view of the world. Content warning: this presentation will involve discussion of gendered and racialised violence.
You can sign up on Facebook or just, well, show up :)