New York Creates Task Force to Examine Automated Decision Making
InfoQ, July 31, 2018
Background
In December 2017, the New York City Council passed the country's first bill to demand accountability in how algorithms are used in city government. That bill mandated that a task force study how city agencies use algorithms. This task force will report on how to make these algorithms understandable to the public.
The original proposal by Council Member James Vacca mandated that the source code for the algorithms be made public. Some policy experts warned that such openness might create security risks or give people a way to game the public benefits system. Technology companies argued that they could be required to disclose proprietary information. The disclosure requirement was dropped in favor of the task force.
What the Law States
An automated decision system is defined as "computerized implementations of algorithms, including those derived from machine learning or other data processing or artificial intelligence techniques, which are used to make or assist in making decisions."
The law requires the task force to address at least six goals in its final report. It must identify which city agencies should be subject to review. It must recommend procedures so that people affected by an algorithmic decision can request an explanation of the basis for that decision, and so that adverse impacts can be addressed. It must also explain how the city could develop and implement a procedure for determining whether an automated decision system used by a city agency "disproportionately impacts persons based upon age, race, creed, color, religion, national origin, gender, disability, marital status, partnership status, caregiver status, sexual orientation, alienage or citizenship status". Finally, it must recommend processes for making information about automated decision systems available so the public can meaningfully assess how they work and how the city uses them, and assess the feasibility of archiving automated decision systems together with the data they use.
The members of this task force are not limited to experts in algorithmic design and implementation; they can also include people who understand the impact of algorithms on society. Participation in meetings can be limited if it "would violate local, state or federal law, interfere with a law enforcement investigation or operations, compromise public health or safety, or result in the disclosure of proprietary information."
While the final report should be publicly available, no recommendation is required if it "would violate local, state, or federal law, interfere with a law enforcement investigation or operations, compromise public health or safety, or would result in the disclosure of proprietary information."
The task force has no legal authority to compel city agencies to follow its recommendations, or to penalize those that do not.
Background to the Controversy
The investigation of bias and infringement on rights in algorithmic decision making is only beginning.
Predictive policing programs in Chicago and New Orleans are being scrutinized for violations of due process and privacy. The public is often unaware of the use of these tools.
Even the creators of algorithms often cannot fully explain how the software came to the conclusion that was reached.
Several city agencies are already using decision systems. The Fire Department uses the Risk-Based Inspection System (RBIS) to predict where fires might start. Part of the RBIS is the FireCast tool, which uses data from five city agencies to analyze 60 risk factors and predict which buildings are most vulnerable to fire. Those buildings are then prioritized for inspection, with the data made available to all of the city's 49 fire companies.
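The actual FireCast model is not public; as a purely illustrative sketch, a risk-based inspection queue of this kind can be thought of as scoring each building on weighted risk factors and inspecting the highest scorers first. The factor names and weights below are hypothetical, not the FDNY's:

```python
from dataclasses import dataclass, field

@dataclass
class Building:
    building_id: str
    # Hypothetical normalized risk factors (0..1); the real system
    # draws on roughly 60 factors from five city agencies.
    factors: dict = field(default_factory=dict)

# Illustrative weights -- not the actual FDNY model.
WEIGHTS = {"building_age": 0.4, "past_violations": 0.35, "vacancy": 0.25}

def risk_score(b: Building) -> float:
    """Weighted sum of a building's risk factors."""
    return sum(WEIGHTS[k] * b.factors.get(k, 0.0) for k in WEIGHTS)

def inspection_priority(buildings):
    """Rank buildings most-vulnerable-first for inspection."""
    return sorted(buildings, key=risk_score, reverse=True)

buildings = [
    Building("A", {"building_age": 0.9, "past_violations": 0.2, "vacancy": 0.1}),
    Building("B", {"building_age": 0.3, "past_violations": 0.8, "vacancy": 0.7}),
]
queue = inspection_priority(buildings)
```

Note that even in this toy version, the ranking depends entirely on weights that are invisible to the buildings' occupants, which is precisely the kind of opacity the task force is meant to examine.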
The Police Department uses algorithms for the data obtained from body cameras and facial recognition.
Algorithms are also used by the Department of Transportation, the Mayor's Office of Criminal Justice, the Department of Education, and the Department of Social Services. Students are matched with schools. Teacher performance is assessed. Medicare fraud is investigated.
Problems with the Current Legislation
Julia Powles, a research fellow at NYU’s Information Law Institute and at Cornell Tech, described two problems with the task force's mission that resulted from the compromise between the original legislation and what passed.
First, without cooperation from the agencies and contractors, good recommendations cannot be made. There is no easily accessible information on how much the City of New York spends on algorithmic services, or how much of the data used is shared with outside contractors. The Mayor's office rejected any mandated-reporting requirement in the legislation, arguing that it would reveal proprietary information. If claims of corporate secrecy are given too much leeway, there will be no algorithmic transparency.
The other problem with the current law is that it is unclear how the city can change the behavior of companies that create automated decision-making systems. Frank Pasquale, a law professor at the University of Maryland, argues that the city has more leverage than the vendors.
Members of the Task Force
The task force will be composed of individuals from city agencies, academia, law, industry, and nonprofits and think tanks. Representatives are expected to be chosen from the Department of Social Services, the Police Department, the Department of Transportation, the Mayor’s Office of Criminal Justice, the Administration for Children’s Services, and the Department of Education.
The task force is co-chaired by Emily W. Newman, Acting Director of the Mayor’s Office of Operations, and Brittny Saunders, Deputy Commissioner for Strategic Initiatives at the Commission on Human Rights.
Impact
New York City could influence algorithms much as California has influenced auto emission standards. As one of the largest cities in the world, it may use algorithms widely enough that vendors find it simpler to meet whatever standards it sets across all jurisdictions. Adapting algorithms to different locations, however, may be easier with software than with mechanical devices; software already handles the differing sales tax rules of states, cities, towns, and counties throughout the United States. On the other hand, New York is one of the most valuable sources of demographic data in the world, and restricting its use here might encourage other jurisdictions to do the same.
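The sales tax comparison can be made concrete: jurisdiction-specific behavior in software is typically just a configuration lookup, not a redesign. The rates below are hypothetical placeholders, used only to show the pattern:

```python
# Illustrative only: per-jurisdiction sales tax rates (hypothetical values).
# Swapping jurisdictions is a data change, not a hardware or code change.
TAX_RATES = {
    ("NY", "New York City"): 0.08875,
    ("NJ", "Newark"): 0.06625,
    ("CA", "Los Angeles"): 0.0950,
}

def sales_tax(amount: float, state: str, locality: str) -> float:
    """Return the tax owed on a purchase in the given jurisdiction."""
    rate = TAX_RATES.get((state, locality))
    if rate is None:
        raise KeyError(f"no rate configured for {state}/{locality}")
    return round(amount * rate, 2)

print(sales_tax(200.0, "NY", "New York City"))  # 17.75
```

By the same logic, a vendor could in principle ship differently tuned decision systems per city, which cuts against the hope that New York's standards would automatically propagate everywhere.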
In any case, the argument over the fairness of algorithmic decisions, and the need to use them, is not going away.