The Markup: The NYC Algorithm Deciding Which Families Are Under Watch for Child Abuse

Reporting from The Markup reveals that New York City’s Administration for Children’s Services (ACS) has been using an algorithm to flag families for heightened scrutiny without informing parents, their attorneys, or even caseworkers. The tool, which weighs factors such as neighborhood, mental health history, and number of children, raises serious concerns about racial and socioeconomic bias. Nila Natarajan, Associate Director of Policy & Family Defense at Brooklyn Defender Services, confirmed that the system has never been disclosed in discovery: “We get discovery in individual investigations and prosecutions when we represent parents, and nothing in the discovery we get suggests or implies, and certainly doesn’t lay out, use of any of these tools.”

--

"Some lawmakers and advocates have said the disclosures agencies have to make for AI tools don’t go far enough, and are pushing for more transparency and oversight. Councilmember Jennifer Gutiérrez has proposed a bill that would create external teams for overseeing the implementation of automated tools.

“These tools aren’t neutral—they can reinforce systemic biases and fundamentally affect people’s lives for decades,” Gutiérrez said in an emailed statement. “The stakes are just too high to not be prioritizing oversight.”

The NYCLU supports state-level legislation called the Digital Fairness Act, which would create both new data privacy and civil rights protections. Under the act, government agencies in the state would be required to undergo a civil rights audit before deploying automated decision-making tools. People who had an automated tool used on them would be informed and allowed to contest the decision.

Read the full piece at The Markup.

Latest News