In Allegheny County, algorithms help make important decisions, such as when to conduct risk assessments for child welfare services and where to deploy police officers.
Such algorithms are designed to make swift, neutral decisions that are more informed and consistent than human judgments, and they are expected to become more prevalent in decision making in the years ahead.
The problem is that algorithms are not neutral. If an algorithm’s code reflects the biases of its human creators, or if it was built using data drawn from biased practices, it can exacerbate the very problems it was designed to solve.
The Pittsburgh Task Force on Public Algorithms, convened by the University of Pittsburgh Institute for Cyber Law, Policy, and Security to prevent bias in automated decision-making systems, will host its first public meeting on March 10, 5:30-7:30 p.m., at the Homewood-Brushton Branch YMCA.
The task force is an independent coalition of researchers, advocates, and public and private sector stakeholders. It seeks to establish best practices and practical guidance for municipalities working to ensure algorithmic accountability and equity for all residents.
Its work is assisted by an advisory panel featuring representatives from Allegheny County and the City of Pittsburgh and supported by Common Cause Consultants and Weiss Burkhardt Kramer, LLC.
Register for the community meetings online.