Wednesday, January 18, 2017

Mathematical values

As with so many things, writing about Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil feels like a different, less hopeful project since the November election.

O'Neil is a mathematician and former financial markets quant who took her unease to Occupy Wall Street. In this book she explicates what she calls "the dark side of Big Data." The "WMDs" of her title are the algorithms that have so much impact, whether we know it or not, on how we live. Separate chapters delve into how US News generated simple and rather stupid scores that higher education institutions game for prestige and cash, how courts use unscientific predictions of possible recidivism rates to decide criminal sentences, how companies sort job applicants mathematically and then monitor the work they do once they are hired, how credit and insurance decisions are governed by algorithmic ratings, and how politicians use data to influence voters. Over the last fifteen years, the sophistication of all these systems has increased so much that human judgment is almost excluded from their day-to-day operations. Your life and mine are hedged in by mathematical models from which there is little, if any, appeal.

And yet, this is not a Luddite book, a tract denouncing the systems that give us so much we want and perhaps need, even as they dehumanize and control us. She knows that going backward is impossible. Given the choice between Facebook and no internet, streaming entertainment and network television, Amazon's universe of consumer choices and the department store at the mall, we know what human beings will choose. She insists that, with WMDs,

... the heart of the problem is almost always the objective. Change that objective from leeching off people to helping them, and a WMD is disarmed -- and can even become a force for good.

O'Neil is terribly clear about why, for structural reasons as well as because of bad intentions, it is hard to embed humane objectives within algorithms.

... human beings learn and adapt, we change, and so do our processes. Automated systems, by contrast, stay stuck in time until engineers dive in to change them. If a Big Data college application model had established itself in the early 1960s, we still wouldn't have many women going to college. ...

Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that's something only humans can provide. We have to explicitly embed better values into our algorithms, creating Big Data models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit. ...

What does she think can be done? She harks back to the state and federal regulation that, from the early 20th century forward, partially guaranteed the health and safety of goods and services, despite costing corporations some of their bottom line. (We know the corporations often responded by offshoring their worst goods and practices, but that's the next phase.) She calls on data scientists to develop their own ethical code for their creations. And, ultimately, she looks to law to inject values into applied mathematics -- and insists this could happen. Algorithms should be subject to human auditing of their ethical implications. She reports hopeful initiatives.

Movements toward auditing algorithms are already afoot. At Princeton, for example, researchers have launched the Web Transparency and Accountability Project. ... Academic support for these initiatives is crucial. ... If you consider mathematical models as the engines of the digital economy -- and in many ways they are -- these auditors are opening the hood and showing us how they work.

... Finally, models that have a significant impact on our lives ... should be open and available to the public. Ideally we could navigate them at the level of an app on our phones. ... The technology already exists. It's only the will we're lacking.
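To make "opening the hood" a little more concrete: here is a toy sketch, my own illustration rather than anything from the book, of one check an auditor might run -- whether a model's approval rates differ sharply between two groups of people, a rough "disparate impact" test. The groups, decisions, and numbers below are made up.

```python
# A toy illustration (mine, not O'Neil's): one simple check an algorithmic
# auditor might run -- comparing a model's approval rates across two groups,
# in the spirit of the "80% rule" disparate-impact test.

def approval_rate(decisions):
    """Fraction of applicants the model approved (decisions are True/False)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a_decisions, group_b_decisions):
    """Ratio of the lower approval rate to the higher; below ~0.8 is a red flag."""
    rate_a = approval_rate(group_a_decisions)
    rate_b = approval_rate(group_b_decisions)
    low, high = min(rate_a, rate_b), max(rate_a, rate_b)
    return low / high if high > 0 else 1.0

if __name__ == "__main__":
    # Hypothetical model decisions for two groups of loan applicants.
    group_a = [True, True, True, False, True, True, False, True]    # 75% approved
    group_b = [True, False, False, False, True, False, False, True]  # 37.5% approved
    ratio = disparate_impact_ratio(group_a, group_b)
    print(f"Disparate-impact ratio: {ratio:.2f}")  # prints 0.50, well under 0.8
    if ratio < 0.8:
        print("Red flag: the model's outcomes differ sharply between groups.")
```

A real audit would of course be far more involved, but even this much only works if the auditor can see the model's decisions in the first place -- which is exactly the transparency O'Neil is arguing for.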

Unhappily, that will seems likely to stay lacking under the GOPer/Trump regime. These folks are more likely to love them some data that enables them to identify, track and hurt those they consider their enemies or just beneath their concern.

But I'm still with O'Neill on the basic thrust of this book. We don't win more justice by going backwards. We have to figure out how to control our tools as we go forward.

O'Neil blogs at Mathbabe, and even a mathematical illiterate like me can almost take it in.
