Little Talks, Big Impact: AI and Well-Being

We are pleased to present the second in our new parent education series - Little Talks, Big Impact - designed to support meaningful connections at home. Each edition provides gentle conversation prompts you can explore with your daughter: at the dinner table, on the way to school, or during a walk together. These prompts are grounded in the core themes of our College’s well-being framework: being, becoming and belonging.


Recently, I caught up with two young people in their early twenties. Both had degrees, excellent jobs and a good network of friends. Both admitted to me that, given some recent emotional distress, they had turned to AI for insight into their own states of mind and for advice on how to proceed with a difficult relationship. My conversation with these two young people was mirrored by our Pamela Nutt Address speaker, Associate Professor Sarah Irving-Stonebraker, who told us of young people who increasingly report feeling “disconnected”.

The advantage of reaching out to ChatGPT or the like is that the bot is constantly available, night or day, and provides instant, positive, affirming feedback at a moment of psychological or emotional distress. Dr Burgis recently met with NSW Police, whose statistics tell us that 72% of young people have used some sort of AI companion. In an uncanny way, the 2013 film Her - where a man falls in love with a chatbot - predicted the strange situation in which we find ourselves in 2025.

And yet, if the smartphone revolution has taught us anything, it is this: what is supposed to connect us can, in the end, disconnect us. The famous line from All the President’s Men is a useful one here: “follow the money.” The imperative for smartphone or AI companies is an economic one; it is in the interest of the makers of AI that we keep clicking. A person feeling disconnected from other humans will be more likely to keep clicking, and an AI bot will be more likely to provide answers that increase dependency rather than decrease it. That 24/7 well-being advice, then, may in the end be a ruse. As reported recently in The Guardian, psychologist Sahra O’Doherty says that AI chatbots are not designed to bring therapeutic benefit but to “mirror” our wants and needs: “What it is going to do is take you further down the rabbit hole, and that becomes incredibly dangerous when the person is already at risk and then seeking support from an AI.”

Social media and girls

In a recent interview, Jonathan Haidt, author of The Anxious Generation, discussed the impact of social media on young girls, referencing insights from Sarah Wynn-Williams’ book Careless People. A former executive at Meta, Wynn-Williams now shares her concerns about the deep, and often invisible, influence of social media companies on the self-worth of young users.

One alarming insight: if a girl deletes a number of selfies from Instagram, the algorithm detects a pattern of self-consciousness or self-loathing. Instead of offering supportive or affirming content, the algorithm responds by promoting beauty products. The implicit message becomes: “You’re not beautiful. Let us help you become beautiful.”

This troubling cycle, as Naomi Wolf observed decades ago, commercialises a girl’s pain rather than addressing it as a social concern. Her vulnerability becomes a source of profit.

We are already seeing this play out locally. Mrs Watters (Head of Junior School) recently wrote about children as young as 10 asking for beauty products. The founder of skincare company Go-To has also commented on this phenomenon, highlighting a booming market driven by fear rather than self-expression.

What does this mean for the young people of today?

A teenager’s brain is far more elastic than an adult’s and has an immature prefrontal cortex - the part of the brain that processes risk and consequence. Our teenagers are therefore at particular risk of replacing real human connection with artificial connection, thereby amplifying the disconnection that caused pain in the first place. AI add-ons have been a part of social media for some time now. In Snapchat, it is called “My AI”. TikTok has “Tako” or “Genie”. Instagram has an AI chatbot built into its DMs. ChatGPT’s terms of use bar anyone under the age of 13, but this still means that many of our teenagers could have the app installed on a phone. As adults come to terms with what AI might mean for our world, teenagers and young people are taking up its use in real time.

It is tempting to think that one way around this might be to limit distress in general, so that our girls have no need to reach for a bot for friendship and support. This will prove impossible. While we cannot (and should not) construct worlds for our children free of psychological distress or difficulty, we can teach them what to do when these big emotions inevitably surface. We can show them that all humans experience distress and difficulty, and then demonstrate to them wise ways of responding. In teaching and mentoring, we will establish the human connection whose absence drives “friendship” with AI in the first place.

Here are some ideas for navigating this new territory of AI and growing daughters:

  • Keep conversations with your daughter open. Use the car ride home - or what I have called the “teenage pram” - to chat with your daughter about her day. Teenagers habitually take the communal gaze straight ahead and the movement of the car as a sign that it is safe to talk. Let her lead, and offer her gentle questions to prompt her to keep talking. 
  • Explain to your daughter that she should always come to you or to our Well-Being staff at school if she is having trouble handling those “big emotions”. We will work together to give her language and frameworks for those feelings. 
  • Use light-hearted humour to show your daughter what to do with big emotions. Avoid sarcasm, but show her how humour helps to get a handle on emotions with no name. The films Inside Out and Inside Out 2 are fabulous on this point. 
  • Check your daughter’s devices for AI bots. If you find access to them, talk to her about AI - what it is, what it isn’t - and the limitations of non-human connection. 
  • Remind her of resources like Kids Helpline (1800 551 800) and Beyond Blue (1300 224 636), and explain how they differ from AI bots. Please refer to page 23 of the Senior Student Handbook and page 11 of the Junior Student Handbook for more suggestions on Who Can Help?
  • If necessary, arrange for your daughter to see an external psychologist to ensure she has human connection in this crucial time in her life. 
  • Educate yourself on some of the more alarming “connections” with AI companions, so that you are aware of the invitations made to young people. The eSafety Commissioner has put together a 45-minute webinar, “Understanding AI Companions: What parents and carers need to know”, on Thursday, 28 August. You can register here. 
  • If you see anything alarming, please reach out to us or directly to the eSafety Commissioner.  

For further support or information, please contact the Senior School Well-being Team via Ms Liz D’Arbon: edarbon@plc.nsw.edu.au 

Dr Sarah Golsby-Smith

Head of Learning and Teaching at PLC Sydney

Sarah has also taught in both government and independent schools, and across co-educational and single-sex settings, in both girls’ and boys’ schools.