
23 Sep 2024

GUIDANCE ON USE OF AI IN CARE SECTOR

On the 17th of May 2024, frontline care workers from England, Wales and Scotland met at Reuben College, University of Oxford, for a roundtable discussion on the ‘responsible use of (generative) Artificial Intelligence (AI) in adult social care’. This month, the expectations those care workers set out were shared with the care sector.

 

Those in attendance believe that AI can help address some of the issues they face and support their work. But frontline care workers have set out clear expectations for employers, regulators and policy makers to ensure that they can continue to provide the highest quality care and that responsibility for the use of AI is shared.

 

Below are the expectations set out in the words of the care workers:

Expectations towards employers: “We are calling on our employers to put in place AI and technology policies and procedures that set out clearly if, when and how AI can or even should be used by their staff. We are expecting our employers to take on full responsibility for any harm that may be caused by using AI if procedures and policies were followed correctly. We are expecting to receive proper training and access to continued learning so that we understand the AI technology we are expected to use, the risks of using it within the remits of our work, and the proper procedures to mitigate and respond to those risks. There should be different levels of AI training, and contact persons in the company who can support people with lower levels of training. But every member of staff should have a basic awareness of AI if it is being used in the company.

 

“AI awareness should form part of the Care Certificate. If we are required or expected to use AI and other technology in our work, we want work devices rather than having to use our own mobile phones or computers. We also want to be compensated for any data or software costs that may be incurred when using AI services.”

 

Expectations towards developers: “We are calling on all developers of care-specific AI products to ensure that these products are aligned properly with the values of care and that care workers are involved meaningfully in the development of these products. We should be compensated fairly for such involvement. It is important that you are transparent about the product and its limitations. We don’t want a ‘sales pitch’ but an ethical, safe and effective product. We want information about big issues like data privacy, whether and how data will be stored and how it will be used, and we want to know how to use it safely.”

 

Expectations towards local authorities and policy makers: “We are expecting local authorities to make responsible decisions around the way that AI products are commissioned and used for the provision of adult social care. AI should not be seen as a response to all the problems the sector is facing. For example, care workers cannot be replaced by AI, but AI can support them. More money must be allocated to better train and pay care workers, and this should be balanced with the investment into AI products and development. Policy makers should do their part in developing an understanding of the responsible use of generative and other types of AI in adult social care.”

 

Expectations towards the regulators of the nations: “We are expecting the regulators of care services to have AI training and policies in place within their own organisations and to clearly communicate good practice around the use of AI in care services, specific to the regulations of the UK nations. We are also expecting service inspectors to be trained on the responsible use of AI and to be supportive of us on our journey towards integrating AI into our work. Now more than ever do we need supportive regulators.”

Some guidance for our peers before you start using generative AI at your workplace: “We want to encourage our peers in these times of new AI products, like ChatGPT and similar products. AI can possibly do a lot of good for us and help us in our work, like reducing the administrative workload, improving the quality of care we provide, supporting our clients and boosting thinking.”

 

Dr Jane Townson OBE, Chief Executive of the Homecare Association, said, “The Homecare Association welcomes this thoughtful statement from care workers on the responsible use of AI in adult social care. We commend the care workers for taking initiative to engage with this important issue that will increasingly impact the sector.

“Their call for clear employer policies, proper training, and shared responsibility around AI use aligns with our views. We agree that AI should support and enhance care delivery, not replace the essential human element that care workers provide.

“Care workers' expectations for developers, policymakers and regulators are sensible. We echo their call for meaningful involvement of care workers in AI product development and for balanced investment between technology and the care workforce.

“As the statement highlights, care workers face significant challenges around workload, pay and working conditions. While AI may help address some issues, sustained investment in the care workforce remains critical.

“We look forward to further dialogue on this topic and stand ready to work with care workers, employers, tech companies and policymakers to ensure AI is implemented responsibly in ways that benefit care recipients and workers alike."

The full statement and media release can be found here.
