Signs of Racial Bias in State’s Algorithm Set Off Alarms on Child Welfare Probes
We have all watched and applauded the tremendous growth of AI in our society. But sadly, it turns out AI isn’t the answer to every problem. A case in point is a story from npr.org.
In Oregon, child welfare officials will stop using an algorithm to help decide which families are investigated by social workers, opting instead for a new process that officials say will make better, more racially equitable decisions.
The move comes weeks after an Associated Press review of a separate algorithmic tool in Pennsylvania that had initially inspired Oregon officials to develop their own model. The Pennsylvania tool flagged a “disproportionate” number of Black children for “mandatory” neglect investigations when it was first implemented.
Sadly, this is a story of unintended consequences: an algorithm showed real signs of racial disparities – on top of the original child welfare problems that needed looking into in the first place.
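For readers wondering what “disproportionate” means in practice, here is a minimal sketch in Python of the kind of rate comparison an audit might use. All group names and numbers below are invented for illustration; none of this reflects the actual Oregon or Pennsylvania systems or their data.

```python
# Hypothetical illustration: quantifying "disproportionate" flagging
# by a screening tool. Every name and number here is made up.

flagged = {"group_a": 120, "group_b": 45}    # families flagged for investigation
referred = {"group_a": 400, "group_b": 300}  # total families referred, by group

# Flag rate per group: what share of referred families got flagged.
rates = {g: flagged[g] / referred[g] for g in flagged}

# Compare each group against the least-flagged group.
baseline = min(rates.values())

for group, rate in sorted(rates.items()):
    print(f"{group}: flagged {rate:.1%} of referrals "
          f"({rate / baseline:.1f}x the least-flagged group)")
```

With these made-up numbers, group_a is flagged at 30% of referrals versus 15% for group_b – twice the rate – which is the sort of gap that would prompt exactly the questions Oregon officials faced.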
U.S. Sen. Ron Wyden, an Oregon Democrat, said he had long been concerned about the algorithms used by his state’s child welfare system and reached out to the department again following the AP story to ask questions about racial bias.
“Making decisions about what should happen to children and families is far too important a task to give to untested algorithms,” Wyden said in a statement. “I’m glad the Oregon Department of Human Services is taking the concerns I raised about racial bias seriously and is pausing the use of its screening tool.”
In recent years, while under scrutiny by a crisis oversight board ordered by the governor, the state agency – currently preparing to hire its eighth new child welfare director in six years – considered three additional algorithms, including predictive models that sought to assess a child’s risk of death and severe injury, whether children should be placed in foster care, and if so, where. Those tools were never implemented.
Wyden is the chief sponsor of a bill that seeks to establish transparency and national oversight of software, algorithms, and other automated systems.
“With the livelihoods and safety of children and families at stake, the technology used by the state must be equitable — and I will continue to watchdog,” Wyden said.
Using an algorithm to make the final judgment on matters so delicate, and with so much impact on a family, shows us that AI is perhaps not the best tool for this kind of decision. But it is a good idea to look into your own area’s use of AI that may be having a direct impact on your family.
read more at npr.org