Artificial intelligence (AI) seems to have exploded onto the scene almost overnight, turning concepts that once belonged to science fiction into reality. The buzz around AI is well-founded, particularly because of the speed with which it is inserting itself into our daily lives. The truth, however, is that AI is not a new tool. Whilst it might seem like a recent breakthrough, it has actually existed for decades; we have only recently produced enough computing power to make it do what we want. As a result, AI is being rolled out everywhere, automating repetitive tasks and seamlessly slotting into our households through everyday tools like Alexa, Siri or ChatGPT.
This is revolutionising the way we live and work, touching every human on the planet and every sector, including social care. In this article, we will look at what AI is and where we can find it, explore how it could be used in the foster care sector, consider the benefits it offers and address the challenges which need to be considered.
What is AI?
What sets humans apart as a species is our cognitive abilities – to think, solve problems and make decisions. Artificial intelligence is a machine’s ability to mimic human thinking and decision-making. It lets machines process huge amounts of data, recognise patterns and predict outcomes in a fraction of the time that it would take us. What’s really clever is that the more we use AI, the more data it collects, which allows it to fine-tune these predictions and become much better at making them. This is called machine learning and, as a result, AI is continuously improving its problem-solving capabilities and decision-making processes.
Where can we find AI?
AI itself isn’t something we take from a shelf – it has been integrated into apps, websites and everyday household items we might use. These can be found in all sorts of places, like:
Satnav: uses AI to provide real-time traffic updates, suggest and update the best routes, and estimate arrival times.
Smart home devices: smart meters, thermostats and doorbells use AI to enhance energy efficiency and security, and connect to mobile devices so we can control them remotely.
Social media: AI algorithms decide what content we see more of, figuring out our habits and preferences based on what we spend more time looking at and interacting with.
Online shopping: recommendations on what we might want to buy are shown to us based on what we have purchased before and what we have been searching for online.
Text-based applications (also known as natural language processing models): like ChatGPT, which generates emails, essays and code in human-like language.
We already rely on AI in many areas of our lives and within the organisations we work for, often without realising it. New apps are being developed and published every day, bringing new ideas and different functionalities to life.
What do these new advances in AI mean for the foster care sector?
Every sector is weighing up the impact of AI, and fostering is no different. There are many potential applications, and some of the most technologically advanced organisations in the world are only just starting to realise – and harness – its full potential. The UK Government has taken a “pro-innovation” approach to AI, which means that it is not planning any new legislation or regulation around its use.
It is important to take a balanced view and, as with any emerging technology, there are pros and cons to consider. Please note that whilst all the functionalities discussed are available, this does not mean that they have been, or are planned to be, implemented by fostering agencies. Staying informed about how AI may benefit you in your role, and what to watch out for, enables you to adapt to changes more easily and protect children and young people in care.
Budgets within the care sector, as we know, are strained, and there are opportunities for AI to streamline operations. It also allows us to provide a more effective service and a better support network over large geographical areas. This does not mean replacing humans with machines: by taking on time-consuming, repetitive tasks, AI can introduce efficiency and accessibility into a system clogged with lengthy processes. Real-world, practical examples where AI could be used include:
Recruitment: AI can be used to reach potential foster carers more easily. By understanding internet search behaviours, targeted advertising can reach ideal candidates and give prospective carers the information they need when they need it. It can’t fact-check information in CVs (yet!), but it can accelerate the screening process by assessing applicants’ competencies, attributes, behaviours and experiences to judge their suitability. The best candidates can then be put forward for consideration, rather than each application being reviewed manually.
Data analysis: AI gives us deeper insights into large amounts of data, which enables solutions, interventions and improvements to be put in place more quickly. Patterns it could identify include:
- Where people are more likely to drop out during recruitment and assessment
- Common signs that lead up to placement breakdown
- Potential burnout in foster carers and other staff
This analysis can help with recruitment and retention, and provide alerts to fostering agencies before challenges escalate. With an established data set, reports can be pulled quickly and easily, and strategies implemented to address these pain points.
Matching foster carers with young people: with the right algorithm, data-driven recommendations can pair foster carers with children to try to reduce placement breakdowns and ensure the best possible matches for both parties.
Supporting foster carers 24/7: chatbots, for example, enable front-line staff to access resources faster, providing real-time support and information, which can be especially important in critical situations. AI-driven virtual assistants are available 24/7, ensuring foster carers and social workers have assistance whenever they need it. Linking this to an agency handbook, online advice and other agreed sources will mean responses can be tailored and developed.
Tailored training: machine learning can adapt to the way foster carers and staff learn, enabling them to access lessons and materials more effectively. This personalised approach makes training more effective and engaging, and areas learners struggle with can be addressed more quickly. It also allows training programmes to be delivered across wide geographies in a cost-effective, time-efficient manner. Virtual reality is also becoming popular, allowing us to connect more deeply with a topic through experiential learning.
Administrative burdens: AI can reduce time-consuming tasks by automating simple processes. Data input, such as placement requests, can be made easier through dedicated apps.
Identifying children at risk: tools can be designed to identify and monitor children and young people who are considered at risk before they come into the care system, reducing the harm that they experience and supporting decision-making about whether to remove them from their home. Such initiatives may be driven nationally by local authorities rather than by fostering agencies, but should be on our radar as a possible future use.
Should we be worried about AI?
AI undeniably possesses remarkable capabilities. It can process information faster than a human can and excels at specific tasks with astonishing precision. However, there are also many things it cannot do. Most notably, it is not human! We have created artificial intelligence, not artificial empathy, so it cannot replace the human approach which must sit at the centre of the children’s care sector. Making decisions on logic alone when vulnerable children and young people are involved would not guarantee the best outcomes. We must not lose the human insight and understanding at the heart of this work.
These AI applications will need a great deal of consideration before implementation in the care sector, particularly around mitigating these challenges:
Bias: a major criticism of AI is that it has biases. It was created and trained by humans and, as a result, retains the social prejudices it has been exposed to. For example, statistics show that black children are disproportionately represented in the care system compared to other ethnicities. AI systems may interpret ethnicity as a risk factor and make decisions based on these statistics, but we know this does not accurately represent each individual child’s situation. When we apply this logic to recruitment, it is not a stretch to see that certain characteristics could be unfairly screened out, meaning we lose brilliant potential candidates based on incorrect keywords in applications or biased judgement.
It is worth noting that because AI operates on logic, it can remove many human biases too. In fact, it may make better decisions than humans because we each come programmed with our own set of experiences, biases and emotions. Whilst bias is something to be wary of, those developing AI are fully aware of this limitation and are working to address it. As machine learning continues, the industry is hopeful that these biases will be reduced further, but anybody using AI decision-making tools would be wise to audit the results to confirm accuracy.
Data privacy: ensuring that the child or young person’s personal data is securely handled at all times presents a challenge. AI is invasive: for it to work it needs vast amounts of data, and we would not want organisations to use this without informed consent. The collection and use of data is not always transparent, and implementation within the sector must be underpinned by strong data security protocols, ensuring data is only used for the intended purposes and stored securely. Foster carers, in particular, need to be aware of the everyday items in their household that rely on AI to work, and ensure robust practices are in place so that it can access as few personal details as possible.
The personal touch: when members of staff feel valued, it improves their physical and mental health, engagement and satisfaction levels, and ultimately their retention in the sector. However, running some processes through AI could lose some of the personal touch that exists today. Fostering is a people sector and will inevitably rely on people, not AI. Sometimes foster carers just need somebody to talk to, to bounce ideas around and gain reassurance. If a chatbot is not able to provide this, it can add to the frustration of an already frustrated individual and may feel demoralising. Equally, as our experiences during the pandemic showed us, the benefits of tailored remote training will need to be weighed up against the learning gained in a face-to-face group setting. It is essential we do not lose the opportunity to interact with peers.
Ethical concerns: as we consider implementations or trial phases, it is sensible to ask whether some of the decisions AI makes will cross ethical boundaries. Will it focus purely on managing costs and operational efficiency, forgetting about the vulnerable humans impacted by it all?
What does AI mean for children and young people in care?
An understandable cause for concern for social workers, foster carers and fostering agencies is the danger AI poses to children and young people, and it is essential that they learn how to use it safely. Young people are digital natives, and it can feel like adults are being left behind. Gaining an understanding of the different tools available, how AI works, and what to watch out for is essential in supporting children and young people’s learning and development whilst keeping them safe. It is important to recognise that AI also brings opportunities: exciting new career paths and increased accessibility to services that were once unavailable, like immediate mental health support.
A report published by UNICEF and the World Economic Forum cites risks ranging from privacy and online safety to psychological and behavioural effects. There is much misinformation online, and an over-reliance on technology can impair critical thinking and adaptability.
We also have to consider not just children and young people’s use of AI, but the use of AI in informing decisions made about their lives. When they are given a voice, particularly during periods of upheaval, this can positively impact their sense of wellbeing, resilience and adaptability. A criticism voiced by many care leavers is that decisions were made for them with little to no communication. We do not want AI, which removes human elements, to take this away further, and every consideration must be made to ensure this remains a person-led process supported by technology, not the other way around.
AI is here to stay, and will continue to augment our lives in ways we cannot yet understand. Digital literacy is essential for everyone working in the fostering sector, so they understand both the benefits of AI and the dangers should it not be used correctly. Foster carers and staff will find internal guidelines helpful, with advice on which apps and tools are safer to use than others, how to keep personal details safe online and how to keep children and young people safe.
Whilst AI can streamline many processes, there is a lot it cannot do. A hybrid approach that removes inefficiencies without losing the human element seems best, and any implementation should be rigorously monitored for biases in decision-making.
Independent fostering agencies (IFAs) should include AI in their digital transformation strategies, ensuring robust policies are in place to mitigate the challenges, particularly around storing and processing data securely. We can look to industries like healthcare, which places a high onus on patient confidentiality and is undergoing an incredibly fast-paced transformation with AI and other technologies.