As more cities embrace big data, algorithms play a growing role in the day-to-day activities of civic life, determining things like which school a child will attend or whether a person is likely to be released from jail on bail. But all data comes from human sources, and humans are, of course, biased. A new bill passed Monday by the New York City Council aims to correct some of those innate biases and make sure the city's computerized systems are, in fact, fair.
The bill would establish a task force to monitor the algorithms used by municipal agencies, TechCrunch reports. It currently awaits the mayor's signature and is supported by the New York Civil Liberties Union.
"Flawed code can … further entrench systemic inequalities," the organization wrote in a recent release. "The algorithms used in facial recognition technology, for example, have been shown to be less accurate on black people, women and juveniles, putting innocent people at risk of being labeled crime suspects."