The use of AI is increasing across a range of services that encounter clients/customers who are experiencing or perpetrating domestic, family and/or sexual violence.  Such services may use these technologies to provide cost-effective, round-the-clock responses to individuals and families who cannot attend in-person support, and who may be living with violence.  Across the digital social services sector more broadly, there have been many examples of unintended negative impacts, causing disproportionate harm to service users through secondary system abuse (such as Robodebt in the social security sector).

Precisely what systems are being used, in which services, and how they are being used is not broadly known.  This is particularly problematic given the harm and trauma already experienced by domestic violence survivors.  As a first step towards developing guidelines and strategies for best practice, this project aims to map which AI systems are currently being used across these services, and how.

Research team: Lyndal Sleep, Heather Lovatt, Paul Henman (UQ) and Amy-Louise Byrne.