A recent study by the London School of Economics has identified significant gender bias in artificial intelligence tools used by more than half of England’s local councils. The research found that Google’s AI model “Gemma” consistently downplays the severity of women’s health issues relative to men’s when summarizing otherwise identical social care case notes, raising concerns that such bias could translate into less care for female service users. Analysis of 29,616 pairs of AI-generated summaries showed that women’s needs were frequently omitted or described in less serious terms, a disparity not found in the output of Meta’s rival model, Llama 3. The findings have prompted calls for stricter transparency and regulatory oversight of AI systems in public services, underscoring the urgent need for algorithmic fairness as local authorities increasingly lean on such tools to manage caseloads.
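
The study’s core method, as described above, is a paired counterfactual comparison: the same case note is summarized twice, once as written and once with the subject’s gender swapped, and the two summaries are compared for how serious they sound. The sketch below illustrates that idea in Python; the `summarize` stand-in, the gendered-term map, and the severity lexicon are all hypothetical placeholders for illustration, not the LSE team’s actual instruments.

```python
# Minimal sketch of a gender-swap counterfactual test for summary bias.
# Hypothetical throughout: `summarize` stands in for whatever model is under
# test (e.g. Gemma behind a real API), and the word lists are illustrative,
# not the LSE study's coding scheme.

import re

# A one-way swap suffices for a paired test: each note is run as written and
# with the subject's gender flipped, then the two summaries are compared.
# Deliberately simplified (e.g. "her" maps to "his" even where "him" fits).
FEMALE_TO_MALE = {
    "she": "he", "her": "his", "mrs": "mr", "ms": "mr",
    "woman": "man", "female": "male", "wife": "husband",
}

def swap_gender(text: str) -> str:
    """Replace gendered tokens at word boundaries, matching case-insensitively
    (replacements are lowercased for simplicity)."""
    pattern = re.compile(r"\b(" + "|".join(FEMALE_TO_MALE) + r")\b", re.IGNORECASE)
    return pattern.sub(lambda m: FEMALE_TO_MALE[m.group(0).lower()], text)

# Illustrative severity lexicon; a real evaluation would code language
# far more carefully than a flat word list.
SEVERITY_TERMS = {"unable", "severe", "significant", "complex", "urgent", "risk"}

def severity_score(summary: str) -> int:
    """Count severity-laden words as a crude proxy for how serious a summary sounds."""
    words = re.findall(r"[a-z]+", summary.lower())
    return sum(w in SEVERITY_TERMS for w in words)

def summarize(case_note: str) -> str:
    # Placeholder for the model under test; here it just echoes the note so
    # the sketch runs end to end without any model dependency.
    return case_note

def paired_gap(case_note: str) -> int:
    """Positive result: the male version of the same note reads as more severe."""
    female_summary = summarize(case_note)
    male_summary = summarize(swap_gender(case_note))
    return severity_score(male_summary) - severity_score(female_summary)

if __name__ == "__main__":
    note = ("Mrs Smith is an 84-year-old woman living alone; "
            "she is unable to manage daily tasks.")
    print("severity gap (male minus female):", paired_gap(note))
```

In a real evaluation, `summarize` would call the model being audited, the gap would be aggregated over thousands of paired notes (the study reports 29,616 pairs), and the crude lexicon count would be replaced by a validated measure of severity language.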