Ensuring Equitable Outcomes from Automated Hiring Processes: An Update

By Antranig Basman, Raising the Floor - International

Several months ago, I wrote a piece, "Ensuring Equitable Outcomes from Automated Hiring Processes," questioning what role the corporations deploying automated systems as part of the infrastructure of society would come to play. I highlighted that this process follows a familiar trajectory: corporations acting against the public interest are eventually made subject to oversight and regulation, after which it turns out to be perfectly possible to operate their systems at a profit, despite the corporations' insistence that such oversight is incompatible with the demands of capitalism. In that posting, I appealed to a familiar example from the history of technology: the steamboats of the Hudson and Mississippi rivers, whose captains insisted that they could not be made safe at a price the public could afford.

I am disappointed to report that our largest partners on the We Count Optimizing Diversity with Disability (ODD) project have proved every bit as obstructive as the steamboat captains of the 1820s. Despite repeated requests over many months, they have refused to make their systems available for inspection so that communities can assess them for discriminatory algorithmic bias against persons with disabilities and other minorities.

To date, their assistance has been confined to routine answers to queries, supplying information that is already in the public domain, together with grants of free cloud computing resources. The latter might be useful if the project were in the business of developing its own algorithmic hiring platforms, but that would be an inefficient use of our limited resources. This phase of the ODD project takes the form of a "planning grant," under which we are drawing up the parameters of a more substantial project: one that will develop methods to assess the hiring platforms already established in the industry for systematic algorithmic bias, and suggest alternative data collection and algorithmic processes that might remedy that bias. Without assistance from our largest partners at this stage, our work becomes impossible: we are now eight months into a twelve-month project and have received no offers of access to, or technical details of, these hiring systems whatsoever. If this obstruction continues, no algorithmic audit will be possible in a follow-on project, a result that, one imagines, our major partners are quietly hoping for, despite their verbal commitment to promoting inclusiveness and fairness in the hiring process.
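To make concrete what even the first step of such an audit requires, consider the simplest measurement in this space: comparing selection rates across demographic groups and applying the EEOC's "four-fifths rule," under which a group selected at less than 80% of the rate of the most-selected group shows prima facie adverse impact. The sketch below (in Python, with an entirely hypothetical data layout, since no partner has shared any) illustrates the point that matters here: none of this can be computed without outcome data from the deployed systems.

```python
# A minimal sketch of a disparate-impact check, the kind of measurement an
# algorithmic audit might begin with. The data layout is hypothetical: each
# record holds a demographic group label and whether the automated system
# advanced the candidate.
from collections import defaultdict

def selection_rates(outcomes):
    """Compute per-group selection rates from (group, advanced) pairs."""
    advanced = defaultdict(int)
    total = defaultdict(int)
    for group, was_advanced in outcomes:
        total[group] += 1
        advanced[group] += int(was_advanced)
    return {g: advanced[g] / total[g] for g in total}

def disparate_impact_ratios(rates):
    """Ratio of each group's selection rate to the highest rate.
    The EEOC's "four-fifths rule" treats a ratio below 0.8 as
    prima facie evidence of adverse impact."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Illustrative data only: candidates screened by a hypothetical system.
outcomes = [
    ("disclosed_disability", True), ("disclosed_disability", False),
    ("disclosed_disability", False), ("disclosed_disability", False),
    ("no_disclosed_disability", True), ("no_disclosed_disability", True),
    ("no_disclosed_disability", False), ("no_disclosed_disability", True),
]

rates = selection_rates(outcomes)
for group, ratio in disparate_impact_ratios(rates).items():
    flag = "ADVERSE IMPACT?" if ratio < 0.8 else "ok"
    print(f"{group}: rate={rates[group]:.2f}, ratio={ratio:.2f} [{flag}]")
```

Even this toy check depends on access to anonymized outcome records; a real audit of the kind our follow-on project envisages would need far richer data and system detail, which is precisely what has been withheld.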

Instead, our partners hide behind such platitudes as "AI for good," a framing which, as researchers Catherine D'Ignazio and Lauren Klein note in chapter 5 of Data Feminism, lacks precision and in practice ensures continued dominance by already empowered groups. The authors recommend instead that groups engage in co-liberation: knowledge transfer from privileged groups to those at the margins, together with assistance in building the social infrastructure needed to interpret that knowledge. Without infrastructure that supports marginalized groups in continuously critiquing the systems that exclude them, inclusion is impossible.

Working with people with disabilities in our community, we have repeatedly heard a clear and resonant theme: they feel they have been excluded from hiring processes as the result of automated systems. They have told us that visibility and insight into the nature of these algorithms and their potential impacts are essential, and that they want to be part of the process of deciding how to improve them. Our research, and even that of some of these corporate providers of hiring tools, has suggested that this sort of visibility and participation is indeed viable. Our partners' refusal to share insight and visibility into their systems, together with their own lack of effort to address bias and exclusion, is actively creating a problem of trust.

As David Weinberger has remarked, "transparency is the new objectivity," and we call upon our partners to open up their systems to the transparency demanded by their significant role in shaping society.
