One of my fairly regular subthemes here is the increasing power of algorithms over our daily lives and what Ted Striphas has called “the black box of algorithmic culture”. So I am naturally interested in this interview with Cynthia Dwork on algorithms and bias — more specifically, on the widespread, erroneous, and quite poisonous notion that if decisions are being made by algorithms they can’t be biased. (See also theses 54 through 56 here.)

I found this exchange especially interesting:

Q: Whose responsibility is it to ensure that algorithms or software are not discriminatory?

A: This is better answered by an ethicist. I’m interested in how theoretical computer science and other disciplines can contribute to an understanding of what might be viable options. The goal of my work is to put fairness on a firm mathematical foundation, but even I have just begun to scratch the surface. This entails finding a mathematically rigorous definition of fairness and developing computational methods — algorithms — that guarantee fairness.

Good for Dwork that she’s concerned about these things, but note her rock-solid foundational assumption that fairness is something that can be “guaranteed” by the right algorithms. And yet when asked a question about right behavior that’s clearly not susceptible to an algorithmic answer — Who is responsible here? — Dwork simply punts: “This is better answered by an ethicist.”

One of Cornel West’s early books is called The American Evasion of Philosophy, and — if I may riff on his title more than on the particulars of his argument — this is a classic example of that phenomenon in all of its aspects. First, there is the belief that we don’t need to think philosophically because we can solve our problems by technology; second, when technology as such fails, there is the recourse to expertise, in this case in the form of an “ethicist.” And then, finally, in the paper Dwork co-authored on fairness that prompted this interview, we find the argument that the parameters of fairness “would be externally imposed, for example, by a regulatory body, or externally proposed, by a civil rights organization,” accompanied by a citation of John Rawls.

In the Evasion of Philosophy sweepstakes, that’s pretty much the trifecta: moral reflection and discernment by ordinary people replaced by technological expertise, academic expertise, and political expertise — the model of expertise being technical through and through. ’Cause that’s just how we roll.

Comments

  1. It is indeed puzzling that otherwise intelligent people — especially those caught in the thrall of technology — can so consistently advocate for replacing human agency with blind processes, specifically those we engineer to relieve ourselves of responsibility for decision making. Beyond offloading liability and the human labor of deciding, designing sophisticated decision trees (algorithms) with sufficient input, expertise, and nuance is believed to offer a form of (near?) perfectibility surpassing anything humans can muster. In short, it’s the elimination of human error by removing humans from the arena of activity (nobody does it better 'cause nobody's doing it anymore). Drones programmed to “decide” when to let fly with their armaments and self-driving cars are familiar contemporary examples.

    It may be worthwhile to remind the credulous that human cognition is foundationally approximate and imperfect. Our inability to perceive or remember accurately, compared to our tools and devices anyway, is misguidedly regarded as a problem to be rectified — or, in the misapplied jargon of the computing era, a bug, not a feature. This mismatch, however, is an unbridgeable gulf in attempts to model strong AI on human cognition. The one rests on layers of high-fidelity exactitude (toggles, essentially: on/off, 0/1); the other on failing continuously until something that works well enough is found or developed. That’s how we learn to walk and speak: a longish training phase in early childhood, continuing throughout life with a high degree of imperfection and faux pas — literally, missteps.
