At Brook, we’re excited about the transformative potential of AI in the delivery of clinical services, as well as the provision of information and advice. In this blog, Laura Hamzic, our Director of Digital and Comms, outlines our ambitious plans to trial AI in our work, while being cognisant of the risks and complexities.
For me, the word of the year for 2023 has to be AI.
Discussions of artificial intelligence are everywhere we look – from the music industry and gaming, to banking, manufacturing and healthcare. In fact, the interruption to some of our favourite Netflix shows in 2023 was down to a strike by screenwriters in protest at the use of AI. The charity sector has been no different, with the 2023 Charity Digital Skills Report noting ‘an explosion of interest’ in AI.
I think that one of the best things we can do when faced with something new and unknown is share and learn from one another as we go. This sentiment was behind our recent online event Safeguarding in the age of AI, where we brought together sector experts and digital leaders to discuss and explore both the potential and the risks.
Artificial Intelligence is a broad term for technologies that enable machines to simulate human intelligence or perform tasks that would typically require human intervention. In the context of Brook, AI isn’t just another tech buzzword: we see it as a key enabler for our mission.
Our goal is to use AI in two ways: to improve access to contraception through our clinical services, and to improve knowledge and tackle misinformation about sexual health.
We are currently observing a number of worrying trends which point to a reduced use of contraception. First, abortion rates are higher than ever: in some parts of the country, the first two months of 2023 saw a 134% increase on the same period in 2022. Second, the under-18 conception rate is rising for the first time in 14 years. And third, some sexually transmitted infections are at the highest rates since records began.
We can never conclusively attribute these trends to any single cause, but we know that spending cuts have made contraception harder to access: there has been a 42% real-terms reduction in contraceptive spending since 2015/16.
Brook has what we call a ‘Digital Front Door’ to our clinical services. This allows users to order home STI test kits and self-managed treatment for STIs, and to request contraception. A request for contraception is, in essence, a thorough questionnaire covering the essential questions we need to ask to create a client record and assess whether it is safe for someone to receive support remotely.
Working with AI company EBO, our goal is to evolve this approach from a form into a simulated conversation. The conversation will gather the details we need to assess the user’s risk profile and the suitability of their preferred method. The data captured will then be passed to our clinical teams to process the request, resulting either in self-administered methods such as the pill or the contraceptive patch being sent by post, or in a link to book an appointment for clinician-administered methods.
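For readers who like to see how such a flow might hang together, here is a minimal, purely illustrative sketch in Python of the routing step at the end of the conversation. All field names, methods and rules below are hypothetical, and in practice the clinical assessment is made by Brook’s clinicians, not by code.

```python
# A minimal sketch of the routing logic described above, using entirely
# hypothetical field names and rules. The real clinical assessment is made by
# Brook's clinicians, not by code like this.
from dataclasses import dataclass

@dataclass
class ContraceptionRequest:
    preferred_method: str   # e.g. "pill", "patch", "implant"
    answers: dict           # responses gathered during the conversation

SELF_ADMINISTERED = {"pill", "patch"}  # assumed set of methods that can be posted

def route_request(request: ContraceptionRequest) -> str:
    """Decide the next step once the conversation has gathered the details."""
    if request.answers.get("needs_clinical_review"):
        return "Refer to clinical team for review"
    if request.preferred_method in SELF_ADMINISTERED:
        return f"Send {request.preferred_method} by post after clinician sign-off"
    return "Send link to book an appointment for a clinician-administered method"

# Example: a straightforward patch request is routed to postal provision.
print(route_request(ContraceptionRequest("patch", {"needs_clinical_review": False})))
```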
Another challenge we have observed both through our frontline staff and our social profiles is a growing resistance to the use of hormonal contraception. We know that people’s decisions about contraception are being shaped by influencers eschewing hormones and talking about apps that help you to track when in your menstrual cycle it is safe to have unprotected sex. Unsurprisingly, there are lots of potential pitfalls when it comes to that method.
Our goal is to use AI-powered solutions to improve knowledge, address misconceptions and allay fears.
Through Machine Learning (ML) and Natural Language Processing (NLP), we will use AI to ingest our own website content to ‘learn’ about topics and provide information in a way that emulates a human conversation.
ML is a subset of AI that enables machines to learn from data without being explicitly programmed to perform specific tasks. It’s the underlying technology that allows our AI systems to ‘learn’ from the content they ingest, progressively improving their performance and making our services more effective and tailored.
And thanks to NLP, these conversations can respond with something closer to human empathy, while staying sensitive to the context of what the user has asked.
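To make the idea of ‘ingesting’ content a little more concrete, here is a minimal, purely illustrative sketch in Python. It matches a user’s question against a handful of made-up snippets standing in for our website content, using simple keyword similarity. The real EBO platform uses far more sophisticated ML and NLP than this; every snippet and function below is a hypothetical stand-in.

```python
# Illustrative only: match a question to the most relevant piece of ingested
# content using TF-IDF keyword similarity. The snippets below are invented
# stand-ins for Brook website content.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

pages = [
    "The contraceptive patch is a small sticky patch that releases hormones to prevent pregnancy.",
    "The combined pill contains oestrogen and progestogen and is taken daily.",
    "Fertility awareness methods involve tracking your menstrual cycle and are less reliable without expert teaching.",
]

vectoriser = TfidfVectorizer(stop_words="english")
page_vectors = vectoriser.fit_transform(pages)

def best_match(question: str) -> str:
    """Return the ingested snippet most relevant to the user's question."""
    question_vector = vectoriser.transform([question])
    scores = cosine_similarity(question_vector, page_vectors)[0]
    return pages[scores.argmax()]

# Example: a question about cycle-tracking apps surfaces the relevant content.
print(best_match("Is it safe to rely on a cycle-tracking app?"))
```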
Brook plans to launch Virtual Agents to extend the services we provide to users. The Virtual Agents will be deployed not only on the Brook website but also via Instagram and TikTok. That way, we can tackle misinformation at its source, and be in the spaces that young people inhabit and that shape their decision making.
The nature of Machine Learning algorithms means we can continually refine these Virtual Agents, making them more responsive and adaptive to our users’ needs. This adds a layer of intelligence that keeps our services in step with emerging information in sexual health. It will also help us spot new or emerging issues by analysing the questions our Virtual Agents can’t answer.
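As a purely illustrative example of that last point, the sketch below (in Python, with an entirely hypothetical confidence score, threshold and file name) shows how questions a Virtual Agent cannot answer confidently might be logged for the team to review, so patterns and emerging issues can be spotted.

```python
# A minimal sketch, assuming a hypothetical confidence score from the Virtual
# Agent. Questions the agent can't answer confidently are logged so the team
# can review them and spot new or emerging issues.
import csv
from datetime import datetime, timezone

CONFIDENCE_THRESHOLD = 0.6  # assumed cut-off, not a real setting

def handle_query(question: str, answer: str, confidence: float) -> str:
    """Return the answer if confident; otherwise log the question for review."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer
    with open("unanswered_queries.csv", "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), question, confidence]
        )
    return "I'm not sure about that yet - let me point you to a member of the team."

# Example: a low-confidence response is logged rather than guessed at.
print(handle_query("Does the new X app replace contraception?", "", 0.2))
```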
Going forward, Brook is committed to taking an approach of ‘Open Intelligence’, capturing the essence of an open-source approach to share what we learn. In digital terms, ‘open source’ means making something you’ve created freely available, so that others can use it and modify it.
We believe in the value of transparency in AI development, especially for sensitive topics like sexual health.
We also want to ensure that our AI project is socially responsible and compliant with existing and any emerging guidelines. Our hope is that this approach translates into broader stakeholder trust, and that our openness encourages wider collaboration.
We recognise that charities are at a disadvantage when it comes to AI adoption. This is evidenced in the 2023 Charity Digital Skills Report, which reveals that only around 1 in 4 charities (27%) are using AI. Chronicling our journey should also enable other charities to benefit from our lessons, bypass early pitfalls and accelerate their own AI adoption.
Cost, fear and general firefighting are likely to be barriers.
Our hope is that by open sourcing our own AI journey, we will help in the democratisation of AI technology. AI is here to stay and the sooner we can harness its potential to solve real-world problems, the faster it can scale and the greater the social impact it can have.
In the following video, Dr Gege Gatt, the founder of EBO, shares his own fascinating insights on how both our charitable values, and those things that make us human – empathy and creativity – must guide our use of AI.
If you have any questions, or want to collaborate, contact the Digital Team on website@brook.org.uk