
What Does a Fair Algorithm Actually Look Like?

Machine learning tools are designed to detect patterns, and they often reflect back the same biases we already know exist in our culture. Algorithms can be sexist, racist, and perpetuate other structural inequalities found in society. But unlike humans, algorithms aren't under any obligation to explain themselves. In fact, even the people who build them aren't always capable of describing how they work. That means people are sometimes left unable to grasp why they lost their health care benefits, were denied a loan, rejected for a job, or refused bail--all decisions increasingly made in part by automated systems.