We are pleased to present the second in our new parent education series - Little Talks, Big Impact - designed to support meaningful connections at home. Each edition provides gentle conversation prompts you can explore with your daughter: at the dinner table, on the way to school, or during a walk together. These prompts are grounded in the core themes of our College’s well-being framework: being, becoming and belonging.
Recently, I caught up with two young people in their early twenties. Both had degrees, excellent jobs and a good network of friends. Both admitted to me that, during some recent emotional distress, they had turned to AI for insight into their own states of mind and for advice on how to proceed with a difficult relationship. My conversation with these two young people was mirrored by our Pamela Nutt Address speaker, Associate Professor Sarah Irving-Stonebraker, who told us of young people who increasingly report feeling “disconnected”.

The advantage of reaching out to ChatGPT or the like is that the bot is constantly available, night or day, and provides instant, positive, affirming feedback at a moment of psychological or emotional distress. Dr Burgis recently met with NSW Police, whose statistics tell us that 72% of young people have used some sort of AI companion. In an uncanny way, the 2013 film Her, in which a man falls in love with a chatbot, predicted the strange situation in which we find ourselves in 2025.
And yet, if the smartphone revolution has taught us anything, it is this: what is supposed to connect us can, in the end, disconnect us. The famous line from All the President’s Men is a useful one here: “follow the money.” The imperative for smartphone and AI companies is an economic one; it is in the interest of the makers of AI that we keep clicking. A person feeling disconnected from other humans will be more likely to keep clicking, and an AI bot will be more likely to provide answers that increase dependency rather than decrease it. That 24/7 well-being advice, then, may in the end be a ruse. As reported recently in The Guardian, psychologist Sahra O’Doherty says that AI chatbots are not designed to bring therapeutic benefit but to “mirror” our wants and needs: “What it is going to do is take you further down the rabbit hole, and that becomes incredibly dangerous when the person is already at risk and then seeking support from an AI.”
In a recent interview, Jonathan Haidt, author of The Anxious Generation, discussed the impact of social media on young girls, referencing insights from Sarah Wynn-Williams’ book Careless People. A former executive at Meta, Wynn-Williams now shares her concerns about the deep, and often invisible, influence of social media companies on the self-worth of young users.
One alarming insight: if a girl deletes a number of selfies from Instagram, the algorithm detects a pattern of self-consciousness or self-loathing. Instead of offering supportive or affirming content, the algorithm responds by promoting beauty products. The implicit message becomes: “You’re not beautiful. Let us help you become beautiful.”
This troubling cycle, as Naomi Wolf observed decades ago, commercialises a girl’s pain rather than addressing it as a social concern. Her vulnerability becomes a source of profit.
We are already seeing this play out locally. Mrs Watters (Head of Junior School) recently wrote about children as young as 10 asking for beauty products. The founder of skincare company Go-To has also commented on this phenomenon, highlighting a booming market driven by fear rather than self-expression.
Bearing in mind that a teenager’s brain is far more plastic than an adult’s, and that the prefrontal cortex (the part of the brain that processes risk and consequence) is still immature, our teenagers are at risk of replacing real human connection with artificial connection, thereby amplifying the disconnection that caused pain in the first place. AI add-ons have been a part of social media for some time now. In Snapchat, it is called “My AI”. TikTok has “Tako” or “Genie”. Instagram has an AI chatbot built into its DMs. ChatGPT’s terms of use prohibit anyone under the age of 13, but many of our teenagers could nevertheless have the app installed on a phone. As adults come to terms with what AI might mean for our world, teenagers and young people are taking up its use in real time.
It is tempting to think that one way around this might be to limit distress in general, so that our girls have no need to reach for a bot for friendship and support. This will prove impossible. While we cannot (and should not) construct worlds for our children free of psychological distress or difficulty, we can teach them what to do when these big emotions inevitably surface. We can show them that all humans experience distress and difficulty, and then demonstrate to them wise ways of responding. In teaching and mentoring, we establish the very human connection whose absence drives “friendship” with AI in the first place.
Here are some ideas for navigating this new territory of AI and growing daughters:
For further support or information, please contact the Senior School Well-being Team via Ms Liz D’Arbon: edarbon@plc.nsw.edu.au
Sarah has also taught in both government and independent schools, in co-educational as well as single-sex settings, including both girls’ and boys’ schools.