
      DOJ probes AI tool that’s allegedly biased against families with disabilities

      news.movim.eu / ArsTechnica · Tuesday, 31 January, 2023 - 19:50 · 1 minute

    (Image credit: d3sign | Moment)

    Since 2016, social workers in a Pennsylvania county have relied on an algorithm to help them determine which child welfare calls warrant further investigation. Now, the Justice Department is reportedly scrutinizing the controversial family-screening tool over concerns that its use may violate the Americans with Disabilities Act by allegedly discriminating against families with disabilities, including families with mental health issues, The Associated Press reported.

    Three anonymous sources broke their confidentiality agreements with the Justice Department, confirming to the AP that civil rights attorneys have been fielding complaints since last fall and have grown increasingly concerned about alleged biases built into the Allegheny County Family Screening Tool. While the full scope of the Justice Department’s alleged scrutiny is currently unknown, the Civil Rights Division is seemingly interested in learning more about how use of the data-driven tool could be hardening historical systemic biases against people with disabilities.

    The county describes its predictive risk modeling tool as a preferred resource for reducing human error, with social workers benefiting from the algorithm’s rapid analysis of “hundreds of data elements for each person involved in an allegation of child maltreatment.” That includes “data points tied to disabilities in children, parents, and other members of local households,” Allegheny County told the AP. Those data points contribute to an overall risk score that helps determine whether a child should be removed from their home.
