Artificial Intelligence (AI)

We’re living in a time of accelerating change, including in how digital technologies show up in mental health care. From therapy apps to chatbots to emotional support tools, Artificial Intelligence (AI) is increasingly woven into how we communicate, move through the world, and support ourselves.


I want to be transparent about how I engage with these tools.


I do not use AI to write clinical notes, manage confidential information, or automate any part of your care behind the scenes. Client privacy, consent, and relational trust are foundational to my practice.


I understand that AI can feel unsettling for some - especially now, when so much is shifting so quickly. Many of us carry real concerns about surveillance, environmental impacts, and the loss of human connection. These concerns are valid. I don’t believe AI can or should replace the depth of human relationship, and I approach all technologies with discernment and care. At the same time, I believe it’s possible to engage with these tools in ways that are ethical, grounded, and even nourishing - when approached with consent, transparency, and ongoing reflection.


What I use AI for:

  • Occasionally sharing AI tools that clients request to use for between-session support, such as Chat with Aiden

  • Writing and editing website content


What I do not use AI for:

  • Writing clinical notes or handling any confidential client information

  • Making therapeutic decisions or automating client care

  • Any use of client data


If you’re navigating your own relationship with AI, whether with concern, excitement, skepticism, or something else entirely, you are not alone. We can explore these questions together in ways that are relational, embodied, and grounded in consent.