Sept. 15, 2021 - Law Center Assistant Professor Andrew C. Michaels was one of four University of Houston scholars who recently secured a $749,857 grant from the National Science Foundation for a Designing Accountable Software Systems funding opportunity. The grant will support a project that aims to create an accountability benchmark and software scoring toolkit for public policy algorithms.
Named Community Responsive Algorithms for Social Accountability, or CRASA, the project will receive the grant over three years and seeks to establish a model for accountability that can be applied across a comprehensive range of public policy algorithms. It will be conducted through a community-based participatory research program focusing on Harris County, Texas, and will incorporate input from stakeholders in local government, the legal community, and industry. Funding begins on Oct. 1.
In addition to Michaels, CRASA is being developed by principal investigator Ryan Kennedy, a UH political science professor who specializes in computational social science and democracy, and co-principal investigators Ioannis A. Kakadiaris, a UH computer science professor whose expertise is in biometrics and pattern recognition, and Lydia Tiede, a UH political science professor whose focus is judicial politics and legal reform.
At the Law Center, Michaels specializes in intellectual property and statutory regulation. He has written two articles related to artificial intelligence and law: “Abstract Innovation, Virtual Ideas, and Artificial Legal Thought,” published in the Journal of Business and Technology Law in 2018, and “Artificial Intelligence, Legal Change, and Separation of Powers,” published in the University of Cincinnati Law Review in 2020.
“The work is important because the use of algorithms in law and public policy has expanded dramatically in recent decades,” said Michaels, who became interested in the topic partly through his experience working as a software engineer for a medical records company before law school.
“I became involved because I felt that some other legal scholars writing in the area were not being sufficiently skeptical of the use of artificial intelligence to replace human legal decision-making, and were overlooking some of the potential risks involved. This project will focus on accountability, as standards of accountability for algorithms reflecting current legal obligations and societal concerns have lagged far behind their extensive use and influence.”
The project comes at a pivotal time, as the use of algorithms in public policy only continues to expand. Algorithms play a key role in informing policymakers on criminal justice decisions, public resource allocation, public education decisions and, in some instances, national defense strategy. Despite this growing role, accountability has not kept pace. The CRASA project will seek to fill that void by leveraging the team’s multidisciplinary expertise to establish a model for accountability that can be applied across the full range of algorithms currently being used in public policy.
The project’s research strategy has five objectives: collecting interviews with stakeholders and establishing a community advisory board; conducting a comprehensive review of legal precedents and proposals for algorithm regulation; designing an algorithm accountability benchmark; conducting behavioral experiments; and developing a software scoring toolkit. The aim is for the toolkit to be used by governments, advocacy groups and corporations in design and evaluation.
The National Science Foundation, or NSF, is an independent federal agency created by Congress in 1950 “to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense.”