FOX31 Denver

AI child welfare tool used in Colorado may flag parents with disabilities

Lauren Hackney feeds her 1-year-old daughter chicken and macaroni during a supervised visit at their apartment in Oakdale, Pa., on Thursday, Nov. 17, 2022. Lauren and her husband, Andrew, wonder if their daughter’s own disability may have been misunderstood in the child welfare system. The girl was recently diagnosed with a disorder that can make it challenging for her to process her sense of taste, which they now believe likely contributed to her eating issues all along. (AP Photo/Jessie Wardarski)

(AP) — An artificial intelligence tool used by child welfare agencies, including in Colorado, may flag parents with disabilities, an Associated Press investigation found.

The child welfare tool is meant to predict which children may be at risk of harm and promises to lighten the workload for caseworkers. It has been used in at least Larimer and Douglas counties, with a new project underway in Arapahoe County.

By tracking such tools across the country, however, the AP found that they can set families up for separation by rating their risk based on personal characteristics they cannot change or control, such as race or disability, rather than solely on their actions as parents.

Now, the U.S. Justice Department is investigating at least one use of the tool in Pennsylvania to determine whether the algorithm discriminates against people with disabilities or other protected groups.

Larimer County child welfare unsure of AI’s variables

The algorithm uses a number of factors to measure risk, including race, poverty rates, disability status and family size, according to the AP, which obtained the algorithm's underlying data points.

In Larimer County, one official acknowledged that she did not know what variables were used to assess local families.

“The variables and weights used by the Larimer Decision Aide Tool are part of the code developed by Auckland and thus we do not have this level of detail,” Jill Maasch, a Larimer County Human Services spokeswoman, said in an email to the AP, referring to the algorithm’s developers.

Colorado is among several states, including Pennsylvania and California, where counties opened their data to the academic developers who build the algorithms.

Rhema Vaithianathan, a professor of health economics at New Zealand’s Auckland University of Technology, and Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill’s School of Social Work, said in an email that their work is transparent and that they make their computer models public.