If you're a leader or senior manager in a fostering agency, you know the reality on the ground. A significant portion of a social worker's week is spent on documentation - writing case notes, compiling reports, and ensuring every detail is logged accurately. Consistent, high-quality documentation isn't just a 'nice to have'. Without it, agencies risk failing statutory inspections such as those carried out by Ofsted.

However necessary the paperwork is, this admin overload stops social workers from doing what they enjoy most: supporting foster carers, spending time with children and engaging in reflective practice. We've heard countless stories of how this is impacting morale and contributing to burnout. If nothing changes, it will deepen workforce shortages and undermine the sector's ability to support vulnerable children.

The good news is that AI is well placed to tackle the admin overload in fostering and social care. AI can't do a social worker's job, but it can be a powerful assistant, taking on some of the most time-consuming tasks - for instance, transcribing meetings between social workers and foster carers and then producing a summary of the conversation. Freed from the distraction of note-taking, professionals can focus fully on the conversation taking place in the room, making it a more engaging experience for everyone.

While AI note-taking tools can benefit everyone, they can be particularly helpful for social workers who face additional challenges at work, such as visual impairments, dyslexia or other neurodivergence, or for those whose first language isn't English. In this sense, AI can actively promote accessibility and foster a more inclusive workplace culture.

The use of AI in fostering and social care does come with ethical considerations. Most importantly, AI should never replace professional judgement: the technology is a tool to support, not to decide. That's why AI-generated case notes shouldn't be entered directly into a case management system. A social worker must always be in the loop, reviewing, editing, and ultimately signing off on all documentation.

Consent is also incredibly important, especially when working with vulnerable people. Everyone whose personal data will be processed by AI needs to know about it and agree to it. In the case of AI note-takers, consent should be baked into the experience by prompting the social worker to ask for it before recording can start. Our experience is that most people are happy to be recorded, and there's value for them in having an accurate and neutral account of the meeting.

Another major consideration is data protection and security. For example, what safeguards are in place, where is the data stored, and is the AI trained on vulnerable people's data? If so, is that data anonymised? Safety should be a core design principle of any AI tool in the social care space, so it's important to ask the right questions about privacy, security, responsibility and accuracy.

One example of an AI tool already being used extensively in social care and fostering is Magic Notes, which has been designed specifically for frontline workers. Magic Notes transcribes meetings between frontline workers and service users, and provides a first draft of a case note, assessment, letter or report that the professional can then finalise.

At Dale Care in Durham, case managers trialled Magic Notes across 208 frontline and internal meetings over a two-month pilot. The evaluation found that practitioners were saving 10 hours per week on admin, while the time taken to write up an assessment fell by 39%. Overall, these savings gave each member of staff the equivalent of 57 working days back per year. Practitioners also reported improvements in both the quality and accuracy of documented conversations, largely because they had more time for reflection and relied less on memory and manual notes, so nothing was forgotten.

AI is not a "silver bullet" that will solve all the challenges facing the sector. However, it is an essential part of a wider conversation about how we support the fostering and social care workforce and improve the quality of care. For agencies to fully explore these opportunities, they need a safe space to ask difficult questions, share their experiences, and learn from one another.


Beam is a social enterprise bringing AI to welfare services. Our first tool, Magic Notes, is saving social workers thousands of hours a week on admin, freeing them from their desks and giving them more face-to-face time with the people they're supporting. It's the only tool built by and for social workers, with data protection and responsible AI at its core. Our team brings together experts in welfare services and experts in tech, and everything we do is in service of our mission: to give everyone access to human-centred welfare services.
