In this blog, Education and Wellbeing Specialists Katie Cicco and Juliette Graham explore how AI can help young people learn about sex and relationships and explain the key risks parents, carers and professionals need to understand in order to support them to use these tools safely.
Artificial intelligence (AI) tools like ChatGPT are quickly becoming part of everyday life. For many young people it can feel easier—and far less embarrassing—to type a question into a chatbot than to ask a teacher, parent or healthcare professional about sex and relationships.
That reaction is understandable. Talking about sexual health can feel awkward or even shameful. AI chatbots can seem like a private, non-judgemental space. They can also be very convenient: rather than searching through long articles or whole websites, a chatbot can produce a quick, tailored answer.
For parents, carers and anyone who works with young people, this is an important moment. AI can be a useful starting point for learning about sex and relationships, but it is not a substitute for reliable sexual health education or trusted advice. Supporting young people to use AI wisely means understanding both the benefits and the risks.
ChatGPT and similar tools do not have personal experiences or opinions. They generate responses by spotting patterns in huge amounts of text from books, websites and other written material. This means:
• It is not giving “the truth”, but its best guess at what a good answer might look like.
• It does not know the young person asking the question, even if personal details are shared.
• It can give information that is inaccurate, out of date or misleading.
AI responses can also include what researchers call hallucinations—answers that sound convincing but are simply wrong. Because AI works like very advanced predictive text, it cannot tell when its own output is incorrect.
AI is trained on vast data sets drawn from the internet. These include reliable sources but also biased or misleading material. On topics where opinions differ, such as sexual health or gender, a chatbot may reproduce stereotypes or stigmatising language.
Young people from particular backgrounds—for example LGBTQ+ communities or minoritised ethnic groups—may find that AI answers are not inclusive or do not reflect their experiences.
When you speak to a doctor or nurse, strict rules protect your personal information. AI chatbots are different. They do not currently offer the same level of data protection, and anything typed into a chatbot may be stored or analysed by third parties.
This raises concerns highlighted by the World Health Organisation: in some countries where sexual and reproductive rights are restricted, information shared with an AI service could, in theory, be used to identify or even prosecute someone.
ChatGPT can help a young person look up definitions—for example “What does STI stand for?” or “What does consent mean?”—but it cannot replace talking to a trusted adult or a qualified health professional.
• AI cannot judge whether a situation is safe or whether consent is fully present.
• It cannot give personalised advice about contraception, pregnancy or STIs.
• It cannot sense emotions or provide the reassurance that comes from human interaction.
Sexual health professionals, such as those at Brook clinics or NHS services, can provide accurate information and a confidential, supportive environment—something no chatbot can offer.
Parents, carers and professionals can guide young people to:
• Use AI as a first step, not the final word.
• Double-check any information with reliable sources such as Brook or the NHS.
• Avoid sharing personal or identifying details when using chatbots.
• Seek confidential advice from a healthcare professional or sexual health clinic for anything personal or urgent.
AI can be a helpful tool for quick facts or to spark curiosity, but sexual health and relationships are too important to leave entirely in the hands of a chatbot. By talking openly and signposting to trusted services, parents and professionals can help young people use AI with confidence while staying safe, informed and protected.
Want Brook to teach in your school?
Brook’s Life Online sessions are a series of lessons delivered by Brook Specialists to support pupils to build digital literacy, protect their wellbeing, and make informed choices about sexual health and relationships.