Originally published in The San Francisco Chronicle (the cover article of Sunday’s “Insight” section)

What if the data tells you to be racist? Without the right precautions, machine learning — the technology that drives risk assessment in law enforcement, as well as hiring and loan decisions — explicitly penalizes underprivileged groups. Left to its own devices, the algorithm will count a black defendant’s race as a strike against them. Yet some data scientists are calling to turn off the safeguards and unleash computerized prejudice, signaling an emerging threat that goes beyond the well-known concerns about inadvertent machine bias.

Imagine sitting