536: Exploring AI and Mental Health with Sara Wilder
In this podcast episode of "Giant Robots On Tour," hosts Sami Birnbaum and Rémy Hannequin explore mental health in the age of artificial intelligence with Sara Wilder, a Therapeutic Life Consultant and Licensed Clinical Social Worker. Sami shares his own brief foray into psychotherapy before transitioning to tech, highlighting the relevance of mental health in today's rapidly evolving technological landscape. Sara, whose path to therapy was influenced by her personal struggles and a desire to help others, discusses her unique approach as a Therapeutic Life Consultant, which blends traditional therapy with direct coaching and consulting.
Sara elaborates on her journey and how the COVID-19 pandemic pushed her towards integrating technology into her practice. She transitioned from in-person sessions to virtual consultations, emphasizing the impact of this shift on mental health and brain function. Sara's interest in AI stemmed from her need to scale her business and her desire to use technology to aid her clients. She discusses her experience with AI tools like ChatGPT, both the benefits and challenges, such as generating relatable content and addressing AI "hallucinations." Sara highlights the importance of using AI ethically and maintaining human oversight to ensure the authenticity and accuracy of AI-generated outputs.
The conversation also delves into broader concerns about the impact of AI and technology on mental health. Sami and Rémy discuss the addictive nature of technology and its parallels with substance addiction, emphasizing the need for self-imposed boundaries and emotional intelligence. Sara shares insights into how AI can be a valuable tool in therapy, such as using AI for social anxiety role-playing or to generate conversation prompts. The episode concludes with a discussion on the balance between leveraging AI for efficiency and maintaining human connection, stressing the need for ongoing education and ethical considerations in AI development and deployment.
- Follow Sara Wilder on LinkedIn. Visit her website: sarawilderlcsw.com.
- Follow thoughtbot on X or LinkedIn.
Transcript:
SAMI: Yes, and we are back. And this is the Giant Robots Smashing Into Other Giant Robots podcast, the Giant Robots on Tour series coming to you from Europe, West Asia, and Africa, where we explore the design, development, and business of great products. I'm your host, Sami Birnbaum.
RÉMY: And I'm your other host, Rémy Hannequin.
SAMI: Okay, if you're wondering where Jared is, we finally got rid of him. No, that's a joke, Jared, if you're listening. He was my previous co-host. You can go back to our other podcasts. But we've got Rémy on board today. And you could take a look at our previous podcast, where we introduce the Giant Robots on Tour series, where you'll find out about all the different co-hosts. And you can learn more about Rémy's sourdough bread.
Joining us today is Sara Wilder, a Therapeutic Life Consultant, Licensed Clinical Social Worker, and Clinical Addictions Specialist.
Okay, Sara, this is going to sound a little bit strange, but, actually, once upon a time in my own life, I kind of wanted to be you, not exactly you because that would be even more strange.
SARA: [chuckles]
SAMI: But before I got into coding and tech, I was interested in psychotherapy. And I started a course and, for different reasons, it didn't work out, and I never pursued that career. But what's really interested us about you is the work and research you're doing around mental health in this new world of AI, artificial intelligence. You have a really interesting talk coming up at the CreativeVerse Conference in North Carolina. And we actually have Fatima from thoughtbot who's going to be presenting at the same conference.
And you're specifically talking about prioritizing mental health in the age of AI. And there is so much we want to ask you about this. But before we do, I always like to go back to the start with my guests. Everyone has a story, and I'm interested in your journey. What led you into the world of therapy?
SARA: Well, to unpack that, it's, like, probably way too long for this podcast, but in a nutshell, I had no idea what...I did not want to be a therapist when I grew up, so thank you for wanting that more than me. But I landed here, I think, partly just because of, you know, I always wanted to help people. I never really knew what that was going to look like. I thought it maybe was going into nursing or more of the medical side. But really what landed me here and made me stay here and really choose to stay in my profession...because, at one point, I was like, no, I'm not sure I could do this for the rest of my life; this is a lot. But it was really my own suffering.
I had to take a really hard look at where I came from, what I had gone through, and why I wanted to just, you know, like, help people, but then try to keep changing how I did that. And I'm glad I chose to stay put in this kind of therapeutic, you know, life. Therapeutic life consultant is a term that I kind of formulated myself because I'm not quite a traditional therapist anymore. I'm not sitting in an office with the couch, where we talk a lot about our relationship with our mothers.
But I have more of a personality that's direct and kind of coaching. And I want to go more into consulting and help people understand how to do their own healing work using my clinical background of being in diagnostics in different hospital settings, stuff like that. And because I had to do my own work, and I had to understand how to make sense of how my pain and my suffering was holding me back, and how I could turn that really into something that could help me thrive.
SAMI: Yeah, I think that's really powerful. I think that's a really powerful place to be able to come from, you know, to be able to kind of take your own challenges and the things that you've struggled with. And it's kind of like almost sometimes you have...the best teachers are the people who've gone through it themselves. And I can imagine that's been quite a journey. If only we had a longer podcast, right?
SARA: [chuckles]
SAMI: We could go into all our journeys. But it's super interesting. And, specifically, what has then kind of propelled you more towards looking into the tech aspect of it, right? So, I'm assuming...well, AI, at least, is relatively recent. And so, I'm assuming when you started out, it was more, like you're saying, a therapeutic setting, a life coaching setting, and now there's this sort of other angle, which is kind of coming into it. So, how did you end up getting involved or interested in the tech or the AI side?
SARA: I am an entrepreneur myself. When we go into what we call private practice, there is an element of business that most of us don't know. They don't really teach you business in social work school, and I kind of had to figure it out. But what really pushed me off that ledge to just figure it out and fly was COVID. And I, you know, went from a traditional office with the couch to being virtual. And it was going to be temporary, but I made the decision, and it was quite a difficult decision, given what I had already experienced in helping people through that transition, you know, going from traditional office spaces to at-home working.
But it was, yeah, I really had to understand the impact of technology on my practice, let alone my life. Working from home is a very different lifestyle when it comes to understanding what mental health means. You know, working from home and brain health is a big focus of, you know, what I discuss with my clients and educate them on. But more recently, and this is kind of how I got into the conference, when I started realizing that a lot of my own mental health was...I needed an outlet of creativity of something to be able to help me cope. I realized my business, and my content, and my passion could be that. So, I had to figure out how to scale myself.
And I'm still learning AI. I have an assistant, and she helps me. I have to use her to help me use ChatGPT because it is a beast if you don't know not only just learning the program but learning how to use it and also for it to really be authentic and not necessarily something that, you know, the bot just develops content for you, and you don't make it your own. So, it's a big old brain twister. And the concept of perception is very delicate, let alone with AI. But when you bring it into the tech world, it's a completely different type of language.
RÉMY: Since you started working with AI, you mentioned ChatGPT, have you noticed answers or generated content that is either incredibly useful and accurate, and, on the other side, other content that might be, I won't say disturbing, but at least not exactly what you would expect from a human?
SARA: Yeah, absolutely. It kind of weirds me out to, like...because I use it to kind of help my creative flow, like, if I have a blog post that I need to write. And it's very important for me to, you know, bring myself into my writing. So, when I started with ChatGPT, and it brings up something, and I'm like, who ever says that? Like, no one says that. Like, that's completely maybe like, you know, just it's a little bit unrelatable and a little stiff, I guess, is the best word I can use.
And then, I go through the processing of like, okay, let me figure out how I would write this. I feel like it does help me. It does prolong the process a little bit more. But I have also, yeah, so just kind of relatability factor, for me, is the first thing that sticks out.
But the other thing that I've learned a little bit more about listening to, you know, other podcasts and just trying to educate myself, which is a funny term because we use this, you know, in my field of mental health all the time, is that it comes up with hallucinations. So, it will fill in gaps of, you know, whether it be data, or in a statistic, or whether it's just a concept that it kind of makes up to kind of fill and have fillers in what it produces, which I'm still new to understanding what that really means. Like, yeah, it definitely can be some...and it needs to be something that we fact-check as well since it's just pulling from the general abyss of the internet, and that's not always the most accurate, you know, place of reference in general.
SAMI: Yeah, I can vouch for the abyss of the internet not always being the best place to find yourself [laughs]. There's some rabbit holes we've probably all been down. But it's so interesting because I find that the world has woken up to the impact that social media has had on everyone's mental health. And it almost feels like that was our first experiment with how tech can really impact us as a society and as individuals. And so, we've kind of seen that experiment and how that's played out, and I would argue we've probably failed. We've probably had this social media wave. And whether you'd look at it from a government perspective or a healthcare perspective, I don't actually think we've handled it well.
And it's almost, like, now we're on the cusp of our second experiment, right? This is now, okay, no longer social media. I mean, that is still relevant but put that to the side for a second. And you've got AI coming out with all these chatbots, generative AI, whether that's across images, text, and the impact that is going to have. So, I feel like the space that you're in is huge. I think you spoke before we started recording about, like, there's a mental health crisis. What do you see, or what concerns do you have given what we've seen with social media, the impact AI can have on our mental health?
SARA: You know, there's a lot of different points here, but I think I'll just go with the first thing that comes to my mind is the limits. There are not many limits, let alone...so, tech in itself, but just in our own natural human world, as individuals, we have to learn what boundaries are. We have to learn self-imposed limitations or else someone else is going to impose them on us, and that just doesn't feel as good when someone puts their own limitations on our reality.
So, when we bring this into tech,...and I also include...since my background is in addictions, I started realizing that correlation between, like, technology, the boom of access to information is really...it's a pleasure concept, is that when we have a thought and we can just go get information about the answer and it's immediate, that immediate gratification teaches the brain like, oh, I can do this. I can handle this myself. We're not looking at the by-product of that anymore. And I think because we're dealing with it, we can't really...we're so in it now. We can't see that like, oh, this could potentially be a problem, because it is.
We have become an immediate access world. I mean, even in rural...like, kids in Africa have a TikTok dancing. And they don't have running water in their communities, but they have a cell phone where they can get support. Like, I'm glad they can because that's great access. But they're not necessarily realizing the addictive aspect of what just being interconnected this way has on the brain, let alone the foundational understanding of what boundaries, and self-discipline, or just mental discipline would look like.
So, then when you bring this, I think, into the, you know, the AI world, we're already on a shaky ground of abuse of information and having too much information and not knowing how to process it. And I think that's probably been...I know an issue, for me, is that when I have too much information, I can't necessarily ask questions very well because I'm like, what is the question? Like, I know my brain is oversaturated, essentially, with information as well as potentially chemicals at this point because I'm just working so fast, so fast, so fast.
And I'm in my mid-thirties at this point. So, a teenager who's already dealing with impulse control issues because they're naturally developing, that gets really complicated very quickly. And that's what, in turn, we call attention deficit disorder, anxiety, autism spectrum. That's a little bit more complicated, but a lot of that intersects to be like, well, what are we dealing with? We're dealing with immediate gratification and a sensory processing issue because we're looking at screens, and our brains don't know how to adapt to that let alone regulate that.
SAMI: That makes so much sense. I guess it's because it's kind of a world that we all inhabit, right? As much as we talk about this and sometimes we like to think of the other like we're talking about someone else, I've found this in my own life as well. I'm addicted to my phone in ways, and I'm also seeking that immediate gratification. And it's almost, like you said, that dopamine hit, right? If there's a piece of information I want or there's a video that I want to see, it's there, and it's immediate.
And when you say these things, I guess it's kind of...it's a bit scary. And then, I wonder, on a more macro level, why, as a society, do we do this to ourselves? I don't expect anyone on this podcast [chuckles] to have the answer, right? But I'm always interested, like, if we're aware of this and we're cognizant of what's going on, and, Rémy, feel free to jump in on this as well, like, as a society, why are we doing this to ourselves?
SARA: Now, by no means is this...like, this is just my answer, and I don't have the answer for everything. But I've had...sometimes as a therapist, you have to fill space and come up with an answer. So, my hypothesis is that it's natural human behavior. I think our brains...we are, you know, survival of the fittest. That's natural. Like, at the end of the day, we're going to fight for our life. And life really comes down, in my perspective, it comes down to, like, we have suffering, and we have pleasure.
However, we've learned now that as an evolving, you know, species, that we are one of the only species that can build executive functioning skills in our brains and have different parts of that that we have to kind of understand the baseline. Survival has gotten us so far, and we've made a lot of great headway with that. But pleasure is not sustainable. Pleasure is a beautiful concept to have in life. But when we talk about what's the goal of life, we want to be happy. Happy and pleasure are actually two very different things to the brain. And a lot of it is just a matter of space being used.
Pleasure and dopamine is actually a very small part of the brain, whereas happiness expands and is able to circulate chemicals, and synapses, and energy throughout the rest of the brain but that it has to be a conscious choice. And I think a lot of people don't realize, yeah, you're making choices. I'm not saying, like, no one doesn't have, you know, some degree of free will, but if you're dealing with any degree of stress, emotions, cognitive bias in general, you're not making an actual, like, expansive choice about what options you have to expand your consciousness and your brain capacity.
RÉMY: I like the way that today we realized that a lot of things related to this are chemicals that we all have, which removes a little bit of the guilt when you are addicted, you know, because it can happen to anyone. But also, it's a reminder that it can happen to anyone. So, nobody is immune to that because that's how we're built. And I really like this approach. It's just natural, which means it's okay to feel it. But it's also dangerous to anyone, so anyone should address it. And, again, if you feel like you're losing it and losing to addiction, it feels good to just know that everybody is, unfortunately, entitled to feel that at some point in their life.
SARA: I love that you mentioned that, and that's absolutely one of my goals is to break down the stigma of...when I use the word addiction...and I don't do small talk that well because I'm just like, let's talk about some real things here. This is what's going on. And it's scary to think of, like, addiction and what that means because of how we've seen it. And I don't know what it looks like particularly in the countries that you're from...a little bit. But I know, here in America, it's messy. It's hurtful. It's a lot of suffering.
It doesn't make us feel good to even think about that, which is why I try to teach my clients how to manage and regulate that because it does not discriminate. It's your brain. It's doing its natural thing and how you have to train and just learn how to train that. And it can get better, for sure. But yeah, I really try to break through, like, it's not something that we need to keep being scared about because that is actually what gives it its power. It gives that restrictiveness and that isolation and breaks that connection from each other. And that's ultimately what brings us out of an active addictive cycle is connection.
SAMI: Yeah, it's really interesting because technology it almost masks that by making you feel really connected. Like, I'm connected to all these people and all these things, but I don't feel that connection. And that really resonates with me when you talk about the difference between pleasure and happiness.
So, I hope my parents don't listen to this. But when I was in university, I'm pretty sure I had a gaming addiction. So, I used to live in the loft in my house. I don't know what you'd call it in America. Maybe it's called the attic. I was at the top floor. So, essentially, I had...oh, back then, it would have been a PS3, and I was seriously addicted to Call of Duty, playing online.
And I remember doing just all-nighters, like, really often. I remember it got to a point where I would almost have to reset my whole sleep cycle because I ended up in a situation where I'd be awake in the night kind of always playing all night because I couldn't put the game down, and then sleeping during the day. And to get myself back into a normal rhythm, I'd have to force myself to stay awake for 24 hours. And I would even consider myself someone who doesn't have an addictive personality.
But when you were saying about the difference between pleasure and happiness, like, it was definitely hitting that dopamine, and it was pleasurable, but I didn't feel happy. Like, once I stopped, then there was all those feelings that Rémy described, which is, oh no, what have I done? I've wasted so much time and all that guilt that comes with it. So, it's really interesting.
And I guess it's also a bit like a codependency, which is something I've seen that you've touched on in your work as well, which I understand to be an unhealthy reliance on a human relationship. But I'm guessing we're probably seeing more of that and unhealthy reliance on tech software products and AI. Is that something you're seeing in your therapeutic work as well?
SARA: Oh, absolutely. Codependency it's a big topic to unpack. And I'll say it's a balance. We're never going to not be codependent on something because it literally...we're supposed to work together. We need each other to survive and to grow. But the unhealthy parts of it is, I think, because...I'll just speak from my own experience. I was never taught what emotional intelligence was when I was a kid. I grew up in a very middle-class, non-diverse part of the United States, where I didn't understand the foundational, like, what are boundaries? What are emotions? They try to teach you.
And I think that's been something that is going to take people a while to understand. But there is an unhealthy part of it because it's just mixed with...and confuse people of what do we actually need to need other people for. And it naturally sends us...I think this is primarily where relationships become a point of the discussion is relationships are necessary. But they're less successful if you don't have a relationship with yourself as a foundation because that's naturally going to help you realize that you don't need this one person. And you don't attach to a person out of necessity and out of survival or else, yeah, you're going to lose a huge aspect of your identity because you didn't have much of one to begin with.
And so, that's ultimately what I teach and educate people on when I work with them in session is just what codependency really is. We're going to be codependent on something. I'd rather you be aware of it. Denial is just dangerous in general. But being aware of how these things show up, you have a better choice now. And free will comes back to really being in your control, with less consequence over time, or less negative consequence over time. [inaudible 20:44] my brother, though, Rémy. Call of Duty...[inaudible 20:48] the attic, it was the basement, but yeah. It doesn't discriminate against gender, but for men...he's also in the military. So, it was a very good outlet for him before he went, you know, active duty or [inaudible 21:02] and just self-expression. You don't have to talk about things.
I don't think this discriminates against country by any means, but I know for America, I try to stay in my lane with just speaking about Americans, is that men have been put in a very tough position when it comes to mental health because society reinforces: keep it together; be the provider; just deal with it, and painted this picture of, like, you don't have and can't express emotions. And then, we wonder why guns are an issue. We wonder why drinking and alcoholism is an issue and, you know, in the male population.
MID-ROLL AD:
Are you an entrepreneur or start-up founder looking to gain confidence in the way forward for your idea? At thoughtbot, we know you’re tight on time and investment, which is why we’ve created targeted 1-hour remote workshops to help you develop a concrete plan for your product’s next steps.
Over four interactive sessions, we work with you on research, product design sprint, critical path, and presentation prep so that you and your team are better equipped with the skills and knowledge for success.
Find out how we can help you move the needle at: tbot.io/entrepreneurs.
SAMI: So, there's a lot of concerns. There's a lot of worries, and there's, I guess, negativity around tech and AI. Is there any silver lining? You know, some things we're getting from tech as we already know it, perhaps social media but also AI. Is there anything that we can look at and be like, actually, that will enhance our mental health, improve our society? Are there any positive things that you see coming from it?
SARA: I love that question because it is a heavy conversation. I tell people, if you're considering therapy, like, you got to consider it's a full-time job to intentionally lean into the heaviness of the reality that we live in. There's a lot going on right now. I kind of surprise myself every day as to why I do have some degree of hope. But I think that's also just because I see people recover every day. I am grateful for that because not everyone gets that experience.
If you're working more of a tech job and you're looking more at coding, and data, and screens all day, you don't see change from the human perspective. And if anything, if you go outside these days, it's just tense. We're extremely inflamed. I don't care what country you live in. We're all experiencing the sensory, like, I see it as...I was not good at chemistry, but it's like, when you heat up molecules and they move really fast, like, that's combustion. And it's about to be summer, so it's about to even be literally hotter. I'm not going to say it's going to get worse.
But I say that to say I do believe there is a degree of hope because not only do I see it, but I also see...I'm connected to communities who are doing work from what I kind of...is stealth and, like, covert. You're not really going to see goodness and kindness in the abundance of negativity and darkness, but it is there. And I also like to say that I educate people every day where it changes...maybe not everyone's going to change in that regard. But as an individual in a network and a system, if one person's changed, the system automatically changes.
And little by little, over time, I think the pendulum will swing back in a place where it's like, oh yeah, no, it really is happening. And I also kind of see it in this mental health crisis. Change comes out of crisis. It's unfortunate, but if we don't have a big enough reason to look at something, then we tend not to fix it, you know, be proactive. I mean, one of my goals is to get from this reactive place into a proactive and preemptive, you know, wellness space for people, but that also does have to be choice.
But I think it really has to start with people understanding and committing to themselves and taking care of themselves, which is why I also am hopeful is because that's a lot easier than trying to get other people to change for you. If you can commit and commit to yourself, and taking care of yourself, and prioritizing you being self-focused, not necessarily selfish because a lot of that gets a bad rap of like, oh, I'm being selfish. We do it a lot out of defense, which is why I think it's not that effective.
And so, like, oh, I'm just going to be selfish. I'm going to do what's best for me. You're also locking...you're doing that out of reaction typically because you're not realizing like, oh, I feel hurt because this person didn't prioritize me, so now I'm not going to prioritize them. I'm going to prioritize me. And what I mean intention and recovery comes down to like, when people hurt you, you still have to choose not to hurt them and not pull away. And so, I think if we can all understand...and it's a tough concept to stay in your lane. It really is. But if we can all try to stay in our lane and focus on taking care of ourselves, that is what I believe is going to make the most impact.
SAMI: And do you feel AI could help with that? Could we use AI? I'm interested how, like, specifically with regards to the tech, could that be part of this?
SARA: Absolutely. I think there's some foundational knowledge that needs to be done and work that needs to be done in each individual before AI can just kind of come in without creating, like, more intense dependency on it. But I know there are agencies here locally. I can't remember the name of...I was trying to remember this earlier. I met them at a networking event recently, but an agency who uses AI to help with social anxiety and role-playing when it comes to situational circumstances and exposure.
So, me as a therapist, I love doing exposure. I, for the most part, am an exposure exercise for some people is, you know, we open up and talk about, like, these things that people don't feel safe to talk about with their general networks. But AI, I've started kind of dabbling in, you know, I have some clients who deal with, you know, some, like, delusional disorders, schizoaffective disorder, where they didn't grow up in families where any of the, like, really important foundational concepts were discussed, or they were shut down. So, they're naturally trained to just stay in their head. And, in turn, you build a distrust with all the thoughts you have in your brain.
And I encourage my clients to have conversations with ChatGPT just to be like, "Hey, what's up? What's going on?" And telling it what it is that they need, to just normalize the communication of being like, okay, I'm a little nervous to go on a date. I don't know what to say. Can you help me with some ideas of what questions to ask to get to know someone? That, I think, is a lot less intimidating sometimes talking with me because my energy is easily transferable. And that can scare some people because I can get quite excited about, like, "Yo, you did that. That's great." And they're like, "Whoa, that's a lot."
SAMI: I'm loving your vibes.
SARA: Oh [laughs].
SAMI: It's good energy. I'm enjoying it.
SARA: Well, thank you. But I have a couple of clients that...talk about an investment, and I've told them this, and I was like, "I am going to pour into you." Because they just never had certain experiences at the right time to build a degree of confidence that would get them to the next place in life, where they realized like, oh, I can do that. Failure is not that bad, and it's different for everyone. But I do think AI can help in that regard.
It also can become a little bit challenging. I had a discussion on this with a colleague of mine who works in cybersecurity, and we were talking about AI and the intersection in relationships and the impact on intimacy in relationships, mostly with heterosexual relationships. But there, yeah, it can go a very different direction than hopeful. And it can cause harm or conflict in some relationships because it's easier to talk to a very structured computer bot than it is to a woman per se. But I think it can help as well to build a foundation for people to get to those points where you can be assertive and reflective in your experiences, build emotional intelligence over time to help relationships.
RÉMY: At thoughtbot, we have worked on projects that implement AI, and we are becoming more familiar with training models. One thing that concerns us is doing this in an ethical and safe way. What tips would you have for people who are actually creating models and driving change in this space?
SARA: I'd say the first thing that comes to my mind, though, and this is kind of going to go into my talk during the conference, how do you know you're connected with your own reality? I think that's the hardest part about the tech world is like, it's the boundary. Your brain does not know the difference between a computer screen and your reality. The biggest difference is your senses. And that's kind of been the...it's what's caused a lot of the problem with tech is that, you know, here we're having this conversation. I can see y'all. I can generally take into account what your environment is like, but I can't experience it the same way as if I was not sitting in the room with you.
And I think that is when you teach people how to activate their own realities, you know, teach them about their body and the somatic work, especially with trauma. When trauma is involved, is you have to know how to activate the here and now and train your brain to know what your reality is or else you're going to get lost in the sauce of, like, everyone else's reality, let alone opinions, but especially in the virtual world.
So, being able to know your sensory activation, how mindfulness is, that's a huge term, honestly. We could unpack that for 30 minutes itself. But that sensory activation is a huge part of mindfulness is being able to experience a thought that can trigger something of a reaction and being able to effectively detach from it without judgment, you know, it's training. It takes a lot of training, but senses are huge, and being able to, I think, ethically venture into that world of, you know, using the virtual space, using AI to train and be effective.
SAMI: Yeah, I want to pick up on something you said before because it kind of scared me [laughs], which I don't mind [laughs] saying that to you, right? Because I've got this fear that probably other people have also considered as well is people say about AI taking jobs. So, as a coder, we know AI is becoming more proficient at coding. Maybe other designers, other people in the tech world have this fear as well so much so I actually mentioned this in a previous podcast.
I taught myself some, like, real physical skills because I thought when AI takes my job completely, well, at least I'll be able to do something. I actually taught myself to silicone a bathroom. And if you know, you have those silicone beads that kind of go around a bathroom, so the water doesn't get in between the grating and the tiles. So, I remember when I was learning it, thinking, well, if AI does take our jobs, at least I'll be able to do this. But that's where my brain goes sometimes.
And then, when you were mentioning about using it in a therapeutic setting, like, oh, well, it can actually be helpful to chat to an AI bot about certain scenarios that you might be trying to work through in therapy. So, I guess the question is twofold. Number one, do you see AI having a big impact in a therapeutic setting and coming in and almost disrupting that industry? And also, what tips do you have for the majority of people who are now concerned that what is life going to look like, and what is it going to be? And will we all have jobs?
SARA: I think what's important is to understand what happens to the individual when fear is at play before we can even get to the bigger question of like, will AI take our jobs? But I'll start from the end. There will be some jobs that are taken by AI. But what you're talking about Rémy is, yes, there is a huge power to know how you can connect with your own life and AI. Even if you have a job that is in tech and can be overrun by AI, you still have value as a human being. However, you're not going to feel that way, one, if you have a lot of fear because we have to understand why you can't connect with that.
But because value is an invariant, to value something, you have to be quite intentional with training your brain to understand value, and you can do that if you know what fear does to your brain, and it's...quite simply, we've all heard it. It's the stress response: fight, flight, freeze, or fawn. So, when you're having a perceived threat, it doesn't mean...external is not the only threat. We threaten ourselves all the time with our own thought process based off of the experiences that we've had and trigger our own fear.
And your brain essentially is like, hold up, no, we're not going any further. There's a risk here. We're going to stop you. So, this is where a lot of people, like, have those moments. I could stare at a wall for, like, 10 minutes, but it's actually, like, almost two hours if I'm stressed out to the point...because I'm processing too much information, but it's also triggering a stress response for my brain. And we just get saturated and stuck in that moment. So, being able to know, okay, this is happening, then we can actually come back online.
So, I use the brain as a computer metaphor quite often. And when we know that we're in that fight or flight response mode, we can in turn engage so that...I actually have an acronym for fear that I'm going to be debuting at this conference. I'll just go ahead and debut it for y'all, a little sneak peek since you guys may or may not be able to be there. So, that fear is...we usually as an acronym, if you've ever heard this, is F everything and run.
And I'm going to define it as the F would be the fight response, fight, flight, freeze, and fawn. So, if you know that's going on, you can address it. The E would be engage. So, engage with your present moment, and that's where your body is, the one thing that tends to be in that present moment. And then, the A is accept. A lot of times we have to accept that we're maybe stuck. We may be at a problem. We may need to take a break. Accepting the things that we can't change in that moment is going to make a big difference on how we come back online on our brains and be able to understand, like, AI is a threat, but it's not going to take over everything, right? And then, the R is redirect. So, redirect to something that is going to change your perception of what the original trigger was.
And so, I think if people can understand how they work and how the brain is actually self-protecting, it's very, like, it's like, whoa, we're not going to let you do something completely destructive. But it cannot distinguish the severity of the threat, let alone can it not...actually, there's a lot of people who've trained their brain to not experience fear. Fear is what is supposed to keep us safe. So, it is just perceiving like, hey, AI could take our jobs, but it's also not giving you the context that you brought up, Rémy, about it's not going to take everything from us. It's actually supposed to be here to help us in [inaudible 36:23]. And it's also dependent on us.
So, if we're creating fear in the AI, then yeah, it's going to learn that, and it, could, I don't know, I can't tell the future in that regard. But we have plenty of things that don't have to be tech-related that AI won't take from us. And a lot of that is the natural world if we can keep it alive and value it enough.
RÉMY: I have one question for you. It might not be very in sync with the train of thought we're having because it's more related to the beginning of the episode. But you mentioned sometimes rebuilding confidence with people and building confidence and building the ability to trust yourself and to make your own good decisions. It feels like, to some extent, it can be rebuilding yourself. How do you deal with such a big action, such a big project? I mean, it's something that could take life to do so.
SARA: That's a great question. And sometimes it will take people's lives. I don't handle the whole rest of their life. I tell people like, "I'm going to give you some foundational things." And I do a lot of training. I'm very direct, which is why I have that therapeutic life consultant of like, I'm going to take my vast amount of experience, things that people are probably not going to experience and help them build a security in themselves and, over time, prepare them for when they deviate from that.
I tell people, like, especially if you have loved ones still living, depression is never going to just leave. The concept...there's no cure. It's being able to be prepared for when things happen in life versus feel completely unprepared. I just came out of a season of grief of, you know, I walked away from a relationship, as well as then trying to still maintain my business, still trying to maintain my clients and those relationships, let alone the relationship with myself, and then put my cat down, you know, like, you know, he was a child. I had him for 14 years. So, like, life is going to continuously happen. So, I'm not trying to figure it all out, but I'm trying to get people back to a point where they can understand how to find security for themselves.
Since mental health has been such a taboo topic for a long time, there is quite a bit of backlog, and that's what we're seeing. I don't know what it looks like in y'alls countries, but here in America, there is this rush of people. I need a therapist. I need to go to therapy. And we're at a shortage. Therapists can't necessarily help all, like, at once. And we also have to maintain our own mental health, or we're not going to be very helpful to people.
But really, it comes down to how you build that security with yourself and know and not anticipate, but be prepared for when there's something else that happens that disturbs your own peace. Because if you have an understanding of what peace looks like for you, and you can't necessarily control it, but you can influence it, and facilitate it in your life, then you have a stronger foundation to be able to endure, you know, potential loss of a loved one, hopefully, no time soon for anyone here, or out, or listening, but it's just the reality.
And that's part of, you know, my story of, I experienced a lot of loss from a young age, and it worked against me for a long time because I had no idea how to process and regulate energy and emotion in my body. And so, what it looked like was me holding on to repressing anger, not having a relationship with the natural emotions that we can't get rid of. You can't get rid of emotions. I wish I could just, you know, vomit them out and just be done with it and be like, okay, cool. We can all be stable. That's just not...that's not going to happen. I think that also is what makes us, you know, a great species and building, you know, great things in this world is emotion.
Tech was built off of passion and emotion. Did it cause some disarray and probably hurt some people in the process? Yeah. But I think we can reduce that from happening if people understand emotional intelligence and not just work, work, work, work, work. It's a new age coming to that. And I've, hopefully, been working on myself enough to be able to sustain helping people understand and shifting over to that new type of perspective of we can't do things the way that we've been doing them. We just can't. It's not sustainable. The human species will suffer from it and the earth will as well.
SAMI: Yeah, thank you so much, and just for bringing that level of transparency and honesty. It resonates with myself, and I'm sure it will help so many other people who are listening. We could talk to you for hours. I mean, there is so much. And some things we just did not have time to get into. But thank you so much for the time that you've given us. And it's been really insightful to look at AI and tech that we work with as consultants at thoughtbot on a daily basis from this perspective and look at it from this angle. If people want to get a hold of you, where would be the best place?
SARA: Finding my website is a big thing. That's just, you know, kind of the portal. So, that's sarawilderlcsw.com. Sara without an H. And then, also, venturing into this tech world, I have an app interface now that I have put together to kind of be a centerpiece for mental health resources, not only just, like, hotlines. That information is on my website as well.
But if you want to start doing your own work little by little, you know, having a centralized spot as well as not too much information. There's plenty of stuff you can Google about mental health. But this is vetted by me and organized to a point where they can, you know, one worksheet can make a difference, where you're just reflecting and taking, you know, 10-15 minutes to read through it and see how you can apply it in your life. It's called Power in Perspective.
SAMI: That's great. Definitely, I recommend you go and check it out and check out Sara on her website. And if you can get down to that conference, that is, again, CreativeVerse in North Carolina, you'll have the opportunity to hear Sara in person as well as Fatima from thoughtbot, who's also presenting.
If you learned nothing else from today, then just remember: FEAR is an acronym for F everything and run. I guess that's my big takeaway. You also got a chance to hear about my gaming addiction. No one tell my parents.
And you can find notes and a complete transcript for this episode at giantrobots.fm. If you have questions or comments, you can email us at hosts@giantrobots.fm. I always leave you the same challenge, and that challenge is to subscribe. We've got some great guests lined up, and you'll hear about it first if you subscribe. And feel free to leave any comments on Spotify or Apple Podcasts. We do check them all, and they're really helpful.
This podcast is brought to you by thoughtbot and produced and edited by Mandy Moore. Check her out at mandymoore.tech. Thanks for listening. See ya.
AD:
Did you know thoughtbot has a referral program? If you introduce us to someone looking for a design or development partner, we will compensate you if they decide to work with us.
More info on our website at: tbot.io/referral. Or you can email us at: referrals@thoughtbot.com with any questions.
Sponsored By:
- thoughtbot: Are you an entrepreneur or start-up founder looking to gain confidence in the way forward for your idea? At thoughtbot, we know you’re tight on time and investment, which is why we’ve created targeted 1-hour remote workshops to help you develop a concrete plan for your product’s next steps. Over four interactive sessions, we work with you on research, product design sprint, critical path, and presentation prep so that you and your team are better equipped with the skills and knowledge for success. Find out how we can help you move the needle at: tbot.io/entrepreneurs
559 قسمت
Manage episode 431826534 series 2576605
In this podcast episode of "Giant Robots On Tour," hosts Sami Birnbaum and Rémy Hannequin explore mental health in the age of artificial intelligence with Sara Wilder, a Therapeutic Life Consultant and Licensed Clinical Social Worker. Sami shares his own brief foray into psychotherapy before transitioning to tech, highlighting the relevance of mental health in today's rapidly evolving technological landscape. Sara, whose path to therapy was influenced by her personal struggles and a desire to help others, discusses her unique approach as a Therapeutic Life Consultant, which blends traditional therapy with direct coaching and consulting.
Sara elaborates on her journey and how the COVID-19 pandemic pushed her towards integrating technology into her practice. She transitioned from in-person sessions to virtual consultations, emphasizing the impact of this shift on mental health and brain function. Sara's interest in AI stemmed from her need to scale her business and her desire to use technology to aid her clients. She discusses her experience with AI tools like ChatGPT, both the benefits and challenges, such as generating relatable content and addressing AI "hallucinations." Sara highlights the importance of using AI ethically and maintaining human oversight to ensure the authenticity and accuracy of AI-generated outputs.
The conversation also delves into broader concerns about the impact of AI and technology on mental health. Sami and Rémy discuss the addictive nature of technology and its parallels with substance addiction, emphasizing the need for self-imposed boundaries and emotional intelligence. Sara shares insights into how AI can be a valuable tool in therapy, such as using AI for social anxiety role-playing or to generate conversation prompts. The episode concludes with a discussion on the balance between leveraging AI for efficiency and maintaining human connection, stressing the need for ongoing education and ethical considerations in AI development and deployment.
- Follow Sara Wilder on LinkedIn. Visit her website: sarawilderlcsw.com.
- Follow thoughtbot on X or LinkedIn.
Transcript:
SAMI: Yes, and we are back. And this is the Giant Robots Smashing Into Other Giant Robots podcast, the Giant Robots on Tour series coming to you from Europe, West Asia, and Africa, where we explore the design, development, and business of great products. I'm your host, Sami Birnbaum.
RÉMY: And I'm your other host, Rémy Hannequin.
SAMI: Okay, if you're wondering where Jared is, we finally got rid of him. No, that's a joke, Jared, if you're listening. He was my previous co-host. You can go back to our other podcasts. But we've got Rémy on board today. And you could take a look at our previous podcast, where we introduce the Giant Robots on Tour series, where you'll find out about all the different co-hosts. And you can learn more about Rémy's sourdough bread.
Joining us today is Sara Wilder, a Therapeutic Life Consultant, Licensed Clinical Social Worker, and Clinical Addictions Specialist.
Okay, Sara, this is going to sound a little bit strange, but, actually, once upon a time in my own life, I kind of wanted to be you, not exactly you because that would be even more strange.
SARA: [chuckles]
SAMI: But before I got into coding and tech, I was interested in psychotherapy. And I started a course and, for different reasons, it didn't work out, and I never pursued that career. But what's really interested us about you is the work and research you're doing around mental health in this new world of AI, artificial intelligence. You have a really interesting talk coming up at the CreativeVerse Conference in North Carolina. And we actually have Fatima from thoughtbot who's going to be presenting at the same conference.
And you're specifically talking about prioritizing mental health in the age of AI. And there is so much we want to ask you about this. But before we do, I always like to go back to the start with my guests. Everyone has a story, and I'm interested in your journey. What led you into the world of therapy?
SARA: Well, to unpack that, it's, like, probably way too long for this podcast, but in a nutshell, I had no idea what...I did not want to be a therapist when I grew up, so thank you for wanting that more than me. But I landed here, I think, partly just because of, you know, I always wanted to help people. I never really knew what that was going to look like. I thought it maybe was going into nursing or more of the medical side. But really what landed me here and made me stay here and really choose to stay in my profession...because, at one point, I was like, no, I'm not sure I could do this for the rest of my life; this is a lot. But it was really my own suffering.
I had to take a really hard look at where I came from, what I had gone through, and why I wanted to just, you know, like, help people, but then try to keep changing how I did that. And I'm glad I chose to stay put in this kind of therapeutic, you know, life. Therapeutic life consultant is a term that I kind of formulated myself because I'm not quite a traditional therapist anymore. I'm not sitting in an office with the couch. We talk a lot about our relationship with our mothers.
But I have more of a personality that's direct and kind of coaching. And I want to go more into consulting and help people understand how to do their own healing work using my clinical background of being in diagnostics in different hospital settings, stuff like that. And because I had to do my own work, and I had to understand how to make sense of how my pain and my suffering was holding me back, and how I could turn that really into something that could help me thrive.
SAMI: Yeah, I think that's really powerful. I think that's a really powerful place to be able to come from, you know, to be able to kind of take your own challenges and the things that you've struggled with. And it's kind of like almost sometimes you have...the best teachers are the people who've gone through it themselves. And I can imagine that's been quite a journey. If only we had a longer podcast, right?
SARA: [chuckles]
SAMI: We could go into all our journeys. But it's super interesting. And, specifically, what has then kind of propelled you more towards looking into the tech aspect of it, right? So, I'm assuming...well, AI, at least, is relatively recent. And so, I'm assuming when you started out, it was more, like you're saying, a therapeutic setting, a life coaching setting, and now there's this sort of other angle, which is kind of coming into it. So, how did you end up getting involved or interested in the tech or the AI side?
SARA: I am an entrepreneur myself. When we go into what we call private practice, there is an element of business that most of us don't know. They don't really teach you business in social work school, and I kind of had to figure it out. But what really pushed me off that ledge to just figure it out and fly was COVID. And I, you know, went from a traditional office with the couch to being virtual. And it was going to be temporary, but I made the decision, and it was quite a difficult decision, given what I had already experienced in helping people through that transition, you know, going from traditional office spaces to at-home working.
But it was, yeah, I really had to understand the impact of technology on my practice, let alone my life. Working from home is a very different lifestyle when it comes to understanding what mental health means. You know, working from home and brain health is a big focus of, you know, what I discuss with my clients and educate them on. But more recently, and this is kind of how I got into the conference, when I started realizing that a lot of my own mental health was...I needed an outlet of creativity of something to be able to help me cope. I realized my business, and my content, and my passion could be that. So, I had to figure out how to scale myself.
And I'm still learning AI. I have an assistant, and she helps me. I have to use her to help me use ChatGPT because it is a beast if you don't know not only just learning the program but learning how to use it and also for it to really be authentic and not necessarily something that, you know, the bot just develops content for you, and you don't make it your own. So, it's a big old brain twister. And the concept of perception is very delicate, let alone with AI. But when you bring it into the tech world, it's a completely different type of language.
RÉMY: Since you started working with AI, you mentioned ChatGPT, have you noticed answers or generated content that is either incredibly useful and accurate, and, on the other side, other content that might be, I won't say disturbing, but at least not exactly what you would expect from a human?
SARA: Yeah, absolutely. It kind of weirds me out to, like...because I use it to kind of help my creative flow, like, if I have a blog post that I need to write. And it's very important for me to, you know, bring myself into my writing. So, when I started with ChatGPT, and it brings up something, and I'm like, who ever says that? Like, no one says that. Like, that's completely maybe like, you know, just it's a little bit unrelatable and a little stiff, I guess, is the best word I can use.
And then, I go through the processing of like, okay, let me figure out how I would write this. I feel like it does help me. It does prolong the process a little bit more. But I have also, yeah, so just kind of relatability factor, for me, is the first thing that sticks out.
But the other thing that I've learned a little bit more about listening to, you know, other podcasts and just trying to educate myself, which is a funny term because we use this, you know, in my field of mental health all the time, is it comes up with hallucination. So, it will fill in gaps of, you know, whether it be data, or in a statistic, or whether it's just a concept that it kind of makes up to kind of fill and have fillers in what it produces, which I'm still new to understanding what that really means. Like, yeah, it definitely can be some...and it needs to be something that we fact-check as well since it's just pulling from the general abyss of the internet, and that's not always the most accurate, you know, place of reference in general.
SAMI: Yeah, I can vouch for the abyss of the internet not always being the best place to find yourself [laughs]. There's some rabbit holes we've probably all been down. But it's so interesting because I find that the world has woken up to the impact that social media has had on everyone's mental health. And it almost feels like that was our first experiment with how tech can really impact us as a society and as individuals. And so, we've kind of seen that experiment and how that's played out, and I would argue we've probably failed. We've probably had this social media wave. And whether you'd look at it from a government perspective or a healthcare perspective, I don't actually think we've handled it well.
And it's almost, like, now we're on the cusp of our second experiment, right? This is now, okay, no longer social media. I mean, that is still relevant but put that to the side for a second. And you've got AI coming out with all these chatbots, generative AI, whether that's across images, text, and the impact that is going to have. So, I feel like the space that you're in is huge. I think you spoke before we started recording about, like, there's a mental health crisis. What do you see, or what concerns do you have given what we've seen with social media, the impact AI can have on our mental health?
SARA: You know, there's a lot of different points here, but I think I'll just go with the first thing that comes to my mind is the limits. There are not many limits, let alone...so, tech in itself, but just in our own natural human world, as individuals, we have to learn what boundaries are. We have to learn self-imposed limitations or else someone else is going to impose them on us, and that just doesn't feel as good when someone puts their own limitations on our reality.
So, when we bring this into tech...and I also include, since my background is in addictions...I started realizing that correlation between, like, technology and the boom of access to information. It's really a pleasure concept: when we have a thought and we can just go get information about the answer and it's immediate, that immediate gratification teaches the brain like, oh, I can do this. I can handle this myself. We're not looking at the by-product of that anymore. And I think because we're dealing with it, we can't really...we're so in it now. We can't see that like, oh, this could potentially be a problem, because it is.
We have become an immediate-access world. I mean, even in rural areas...like, kids in Africa are dancing on TikTok. And they don't have running water in their communities, but they have a cell phone where they can get support. Like, I'm glad they can because that's great access. But they're not necessarily realizing the addictive effect that just being interconnected this way has on the brain, let alone the foundational understanding of what boundaries, and self-discipline, or just mental discipline would look like.
So, then when you bring this, I think, into the, you know, the AI world, we're already on a shaky ground of abuse of information and having too much information and not knowing how to process it. And I think that's probably been...I know an issue, for me, is that when I have too much information, I can't necessarily ask questions very well because I'm like, what is the question? Like, I know my brain is oversaturated, essentially, with information as well as potentially chemicals at this point because I'm just working so fast, so fast, so fast.
And I'm in my mid-thirties at this point. So, a teenager who's already dealing with impulse control issues because they're naturally developing, that gets really complicated very quickly. And that's what, in turn, we call attention deficit disorder, anxiety, autism spectrum. That's a little bit more complicated, but a lot of that intersects to be like, well, what are we dealing with? We're dealing with immediate gratification and a sensory processing issue because we're looking at screens, and our brains don't know how to adapt to that let alone regulate that.
SAMI: That makes so much sense. I guess it's because it's kind of a world that we all inhabit, right? As much as we talk about this and sometimes we like to think of the other like we're talking about someone else, I've found this in my own life as well. I'm addicted to my phone in ways, and I'm also seeking that immediate gratification. And it's almost, like you said, that dopamine hit, right? If there's a piece of information I want or there's a video that I want to see, it's there, and it's immediate.
And when you say these things, I guess it's kind of...it's a bit scary. And then, I wonder, on a more macro level, why, as a society, do we do this to ourselves? I don't expect anyone on this podcast [chuckles] to have the answer, right? But I'm always interested, like, if we're aware of this and we're cognizant of what's going on, and, Rémy, feel free to jump in on this as well, like, as a society, why are we doing this to ourselves?
SARA: Now, by no means is this...like, this is just my answer, and I don't have the answer for everything. But I've had...sometimes as a therapist, you have to fill space and come up with an answer. So, my hypothesis is that it's natural human behavior. I think our brains...we are, you know, survival of the fittest. That's natural. Like, at the end of the day, we're going to fight for our life. And life really comes down, in my perspective, it comes down to, like, we have suffering, and we have pleasure.
However, we've learned now that as an evolving, you know, species, that we are one of the only species that can build executive functioning skills in our brains and have different parts of that that we have to kind of understand the baseline. Survival has gotten us so far, and we've made a lot of great headway with that. But pleasure is not sustainable. Pleasure is a beautiful concept to have in life. But when we talk about what's the goal of life, we want to be happy. Happy and pleasure are actually two very different things to the brain. And a lot of it is just a matter of space being used.
Pleasure and dopamine are actually a very small part of the brain, whereas happiness expands and is able to circulate chemicals, and synapses, and energy throughout the rest of the brain, but it has to be a conscious choice. And I think a lot of people don't realize, yeah, you're making choices. I'm not saying, like, people don't have, you know, some degree of free will, but if you're dealing with any degree of stress, emotions, cognitive bias in general, you're not making an actual, like, expansive choice about what options you have to expand your consciousness and your brain capacity.
RÉMY: I like the way that today we realized that a lot of things related to this is chemicals that we all have, which removes a little bit of the guilt when you are addicted, you know, because it can happen to anyone. But also, it's a reminder that it can happen to anyone. So, nobody is immune to that because that's how we're built. And I really like this approach. It's just natural, which means it's okay to feel it. But it's also dangerous to anyone, so anyone should address it. And, again, if you feel like you're losing it and losing to addiction, it feels good to just know that everybody is entitled, unfortunately, to feel that at some point in their life.
SARA: I love that you mentioned that, and that's absolutely one of my goals is to break down the stigma of...when I use the word addiction...and I don't do small talk that well because I'm just like, let's talk about some real things here. This is what's going on. And it's scary to think of, like, addiction and what that means because of how we've seen it. And I don't know what it looks like particularly in the countries that you're from...a little bit. But I know, here in America, it's messy. It's hurtful. It's a lot of suffering.
It doesn't make us feel good to even think about that, which is why I try to teach my clients how to manage and regulate that, because it does not discriminate. It's your brain. It's doing its natural thing, and you just have to learn how to train that. And it can get better, for sure. But yeah, I really try to break through, like, it's not something that we need to keep being scared about, because that is actually what gives it its power. It gives that restrictiveness and that isolation and breaks that connection from each other. And ultimately, what brings us out of an active addictive cycle is connection.
SAMI: Yeah, it's really interesting because technology almost masks that by making you feel really connected. Like, I'm connected to all these people and all these things, but I don't feel that connection. And that really resonates with me when you talk about the difference between pleasure and happiness.
So, I hope my parents don't listen to this. But when I was in university, I'm pretty sure I had a gaming addiction. So, I used to live in the loft in my house. I don't know what you'd call it in America. Maybe it's called the attic. I was at the top floor. So, essentially, I had...oh, back then, it would have been a PS3, and I was seriously addicted to Call of Duty, playing online.
And I remember doing just all-nighters, like, really often. I remember it got to a point where I would almost have to reset my whole sleep cycle because I ended up in a situation where I'd be awake in the night kind of always playing all night because I couldn't put the game down, and then sleeping during the day. And to get myself back into a normal rhythm, I'd have to force myself to stay awake for 24 hours. And I would even consider myself someone who doesn't have an addictive personality.
But when you were saying about the difference between pleasure and happiness, like, it was definitely hitting that dopamine, and it was pleasurable, but I didn't feel happy. Like, once I stopped, then there was all those feelings that Rémy described, which is, oh no, what have I done? I've wasted so much time and all that guilt that comes with it. So, it's really interesting.
And I guess it's also a bit like a codependency, which is something I've seen that you've touched on in your work as well, which I understand to be an unhealthy reliance on a human relationship. But I'm guessing we're probably seeing more of that, an unhealthy reliance on tech software products and AI. Is that something you're seeing in your therapeutic work as well?
SARA: Oh, absolutely. Codependency is a big topic to unpack. And I'll say it's a balance. We're never going to not be codependent on something because it literally...we're supposed to work together. We need each other to survive and to grow. But the unhealthy part of it is, I think, because...I'll just speak from my own experience. I was never taught what emotional intelligence was when I was a kid. I grew up in a very middle-class, non-diverse part of the United States, where I didn't understand the foundational things, like, what are boundaries? What are emotions? They try to teach you.
And I think that's been something that is going to take people a while to understand. But there is an unhealthy part of it because it's just mixed with...and it confuses people about what we actually need other people for. And it naturally sends us...I think this is primarily where relationships become a point of the discussion: relationships are necessary. But they're less successful if you don't have a relationship with yourself as a foundation because that's naturally going to help you realize that you don't need this one person. And you don't attach to a person out of necessity and out of survival or else, yeah, you're going to lose a huge aspect of your identity because you didn't have much of one to begin with.
And so, that's ultimately what I teach and educate people on when I work with them in session, just what codependency really is. We're going to be codependent on something. I'd rather you be aware of it. Denial is just dangerous in general. But being aware of how these things show up, you have a better choice now. And free will comes back to really being in your control, with less negative consequence over time. [inaudible 20:44] my brother, though, Rémy. Call of Duty...[inaudible 20:48] the attic, it was the basement, but yeah. It doesn't discriminate against gender, but for men...he's also in the military. So, it was a very good outlet for him before he went, you know, active duty or [inaudible 21:02] and just self-expression. You don't have to talk about things.
I don't think this discriminates by country by any means, but I know, for America, I try to stay in my lane with just speaking about Americans, and men have been put in a very tough position when it comes to mental health because society reinforces: keep it together; be the provider; just deal with it, and painted this picture of, like, you don't have and can't express emotions. And then, we wonder why guns are an issue. We wonder why drinking and alcoholism is an issue, you know, in the male population.
MID-ROLL AD:
Are you an entrepreneur or start-up founder looking to gain confidence in the way forward for your idea? At thoughtbot, we know you’re tight on time and investment, which is why we’ve created targeted 1-hour remote workshops to help you develop a concrete plan for your product’s next steps.
Over four interactive sessions, we work with you on research, product design sprint, critical path, and presentation prep so that you and your team are better equipped with the skills and knowledge for success.
Find out how we can help you move the needle at: tbot.io/entrepreneurs.
SAMI: So, there's a lot of concerns. There's a lot of worries, and there's, I guess, negativity around tech and AI. Is there any silver lining? You know, some things we're getting from tech as we already know it, perhaps social media but also AI. Is there anything that we can look at and be like, actually, that will enhance our mental health, improve our society? Are there any positive things that you see coming from it?
SARA: I love that question because it is a heavy conversation. I tell people, if you're considering therapy, like, you got to consider it's a full-time job to intentionally lean into the heaviness of the reality that we live in. There's a lot going on right now. I kind of surprise myself every day as to why I do have some degree of hope. But I think that's also just because I see people recover every day. I am grateful for that because not everyone gets that experience.
If you're working more of a tech job and you're looking more at coding, and data, and screens all day, you don't see change from the human perspective. And if anything, if you go outside these days, it's just tense. We're extremely inflamed. I don't care what country you live in. We're all experiencing the sensory, like, I see it as...I was not good at chemistry, but it's like when you heat up molecules and they move really fast, like, that's combustion. And it's about to be summer, so it's about to even be literally hotter. I'm not going to say it's going to get worse.
But I say that to say I do believe there is a degree of hope because not only do I see it, but I also see...I'm connected to communities who are doing work that is kind of stealth and, like, covert. You're not really going to see goodness and kindness in the abundance of negativity and darkness, but it is there. And I also like to say that I educate people every day, where it changes...maybe not everyone's going to change in that regard. But as an individual in a network and a system, if one person's changed, the system automatically changes.
And little by little, over time, I think the pendulum will swing back in a place where it's like, oh yeah, no, it really is happening. And I also kind of see it in this mental health crisis. Change comes out of crisis. It's unfortunate, but if we don't have a big enough reason to look at something, then we tend not to fix it, you know, be proactive. I mean, one of my goals is to get from this reactive place into a proactive and preemptive, you know, wellness space for people, but that also does have to be choice.
But I think it really has to start with people understanding and committing to themselves and taking care of themselves, which is also why I am hopeful, because that's a lot easier than trying to get other people to change for you. If you can commit to yourself, and taking care of yourself, and prioritizing being self-focused, not necessarily selfish, because a lot of that gets a bad rap of, like, oh, I'm being selfish. We do it a lot out of defense, which is why I think it's not that effective.
And so, like, oh, I'm just going to be selfish. I'm going to do what's best for me. You're also locking...you're doing that out of reaction, typically, because you're not realizing, like, oh, I feel hurt because this person didn't prioritize me, so now I'm not going to prioritize them. I'm going to prioritize me. And what I mean is that intention and recovery come down to, like, when people hurt you, you still have to choose not to hurt them and not pull away. And so, I think if we can all understand...and it's a tough concept to stay in your lane. It really is. But if we can all try to stay in our lane and focus on taking care of ourselves, that is what I believe is going to make the most impact.
SAMI: And do you feel AI could help with that? Could we use AI? I'm interested how, like, specifically with regards to the tech, could that be part of this?
SARA: Absolutely. I think there's some foundational knowledge that needs to be built and work that needs to be done in each individual before AI can just kind of come in without creating, like, more intense dependency on it. But I know there are agencies here locally. I can't remember the name of it...I was trying to remember this earlier. I met them at a networking event recently, but it's an agency that uses AI to help with social anxiety and role-playing when it comes to situational circumstances and exposure.
So, me as a therapist, I love doing exposure. I, for the most part, am an exposure exercise for some people: you know, we open up and talk about, like, these things that people don't feel safe to talk about with their general networks. But AI, I've started kind of dabbling in, you know, I have some clients who deal with, you know, some, like, delusional disorders, schizoaffective disorder, where they didn't grow up in families where any of the, like, really important foundational concepts were discussed, or they were shut down. So, they're naturally trained to just stay in their head. And, in turn, you build a distrust of all the thoughts you have in your brain.
And I encourage my clients to have conversations with ChatGPT just to be like, "Hey, what's up? What's going on?" And telling it what it is that they need, to just normalize the communication of being like, okay, I'm a little nervous to go on a date. I don't know what to say. Can you help me with some ideas of what questions to ask to get to know someone? That, I think, is sometimes a lot less intimidating than talking with me because my energy is easily transferable. And that can scare some people because I can get quite excited about, like, "Yo, you did that. That's great." And they're like, "Whoa, that's a lot."
SAMI: I'm loving your vibes.
SARA: Oh [laughs].
SAMI: It's good energy. I'm enjoying it.
SARA: Well, thank you. But I have a couple of clients that...talk about an investment, and I've told them this, and I was like, "I am going to pour into you." Because they just never had certain experiences at the right time to build a degree of confidence that would get them to the next place in life, where they realized like, oh, I can do that. Failure is not that bad, and it's different for everyone. But I do think AI can help in that regard.
It also can become a little bit challenging. I had a discussion on this with a colleague of mine who works in cybersecurity, and we were talking about AI and the intersection in relationships and the impact on intimacy in relationships, mostly with heterosexual relationships. But there, yeah, it can go a very different direction than hopeful. And it can cause harm or conflict in some relationships because it's easier to talk to a very structured computer bot than it is to a woman per se. But I think it can help as well to build a foundation for people to get to those points where you can be assertive and reflective in your experiences, build emotional intelligence over time to help relationships.
RÉMY: At thoughtbot, we have worked on projects that implement AI, and we are becoming more familiar with training models. One thing that concerns us is doing this in an ethical and safe way. What tips would you have for people who are actually creating models and driving change in this space?
SARA: I'd say the first thing that comes to my mind, though, and this is kind of going to go into my talk during the conference, is how do you know you're connected with your own reality? I think that's the hardest part about the tech world is like, it's the boundary. Your brain does not know the difference between a computer screen and your reality. The biggest difference is your senses. And that's kind of been the...it's what's caused a lot of the problem with tech is that, you know, here we're having this conversation. I can see y'all. I can generally take into account what your environment is like, but I can't experience it the same way as I would if I was sitting in the room with you.
And I think that is when you teach people how to activate their own realities, you know, teach them about their body and the somatic work, especially with trauma. When trauma is involved, you have to know how to activate the here and now and train your brain to know what your reality is or else you're going to get lost in the sauce of, like, everyone else's reality, let alone opinions, but especially in the virtual world.
So, being able to know your sensory activation, how mindfulness works...that's a huge term, honestly. We could unpack that for 30 minutes itself. But that sensory activation is a huge part of mindfulness: being able to experience a thought that can trigger something of a reaction and being able to effectively detach from it without judgment. You know, it's training. It takes a lot of training, but senses are huge, and so is being able to, I think, ethically venture into that world of, you know, using the virtual space, using AI to train and be effective.
SAMI: Yeah, I want to pick up on something you said before because it kind of scared me [laughs], which I don't mind [laughs] saying that to you, right? Because I've got this fear that probably other people have also considered as well is people say about AI taking jobs. So, as a coder, we know AI is becoming more proficient at coding. Maybe other designers, other people in the tech world have this fear as well so much so I actually mentioned this in a previous podcast.
I taught myself some, like, real physical skills because I thought when AI takes my job completely, well, at least I'll be able to do something. I actually taught myself to silicone a bathroom. And if you know, you have those silicone beads that kind of go around a bathroom, so the water doesn't get in between the grouting and the tiles. So, I remember when I was learning it, thinking, well, if AI does take our jobs, at least I'll be able to do this. But that's where my brain goes sometimes.
And then, when you were mentioning about using it in a therapeutic setting, like, oh, well, it can actually be helpful to chat to an AI bot about certain scenarios that you might be trying to work through in therapy. So, I guess the question is twofold. Number one, do you see AI having a big impact in a therapeutic setting and coming in and almost disrupting that industry? And also, what tips do you have for the majority of people who are now concerned that what is life going to look like, and what is it going to be? And will we all have jobs?
SARA: I think what's important is to understand what happens to the individual when fear is at play before we can even get to the bigger question of like, will AI take our jobs? But I'll start from the end. There will be some jobs that are taken by AI. But what you're talking about Rémy is, yes, there is a huge power to know how you can connect with your own life and AI. Even if you have a job that is in tech and can be overrun by AI, you still have value as a human being. However, you're not going to feel that way, one, if you have a lot of fear because we have to understand why you can't connect with that.
But because value is an invariant, to value something, you have to be quite intentional with training your brain to understand value, and you can do that if you know what fear does to your brain, and it's...quite simply, we've all heard it. It's the stress response: fight, flight, freeze, or fawn. So, when you're having a perceived threat, it doesn't mean...external is not the only threat. We threaten ourselves all the time with our own thought process based off of the experiences that we've had and trigger our own fear.
And your brain essentially is like, hold up, no, we're not going any further. There's a risk here. We're going to stop you. So, this is where a lot of people, like, have those moments. I could stare at a wall for, like, 10 minutes, but it's actually, like, almost two hours if I'm stressed out to the point...because I'm processing too much information, but it's also triggering a stress response for my brain. And we just get saturated and stuck in that moment. So, being able to know, okay, this is happening, then we can actually come back online.
So, I use the brain as a computer metaphor quite often. And when we know that we're in that fight or flight response mode, we can in turn engage so that...I actually have an acronym for fear that I'm going to be debuting at this conference. I'll just go ahead and debut it for y'all, a little sneak peek since you guys may or may not be able to be there. So, fear as an acronym, if you've ever heard this, is usually F everything and run.
And I'm going to define it as the F would be the fight response, fight, flight, freeze, and fawn. So, if you know that's going on, you can address it. The E would be engage. So, engage with your present moment, and that's where your body is, the one thing that tends to be in that present moment. And then, the A is accept. A lot of times we have to accept that we're maybe stuck. We may be at a problem. We may need to take a break. Accepting the things that we can't change in that moment is going to make a big difference on how we come back online on our brains and be able to understand, like, AI is a threat, but it's not going to take over everything, right? And then, the R is redirect. So, redirect to something that is going to change your perception of what the original trigger was.
And so, I think if people can understand how they work and how the brain is actually self-protecting, it's very, like, it's like, whoa, we're not going to let you do something completely destructive. But it cannot distinguish the severity of the threat, let alone can it not...actually, there's a lot of people who've trained their brain to not experience fear. Fear is what is supposed to keep us safe. So, it is just perceiving like, hey, AI could take our jobs, but it's also not giving you the context that you brought up, Rémy, about it's not going to take everything from us. It's actually supposed to be here to help us in [inaudible 36:23]. And it's also dependent on us.
So, if we're creating fear in the AI, then yeah, it's going to learn that, and it, could, I don't know, I can't tell the future in that regard. But we have plenty of things that don't have to be tech-related that AI won't take from us. And a lot of that is the natural world if we can keep it alive and value it enough.
RÉMY: I have one question for you. It might not be very in sync with the train of thought we're having because it's more related to the beginning of the episode. But you mentioned sometimes rebuilding confidence with people, building confidence and the ability to trust yourself and to make your own good decisions. It feels like, to some extent, it can be rebuilding yourself. How do you deal with such a big action, such a big project? I mean, it's something that could take a lifetime to do.
SARA: That's a great question. And sometimes it will take people their whole lives. I don't handle the whole rest of their life. I tell people, like, "I'm going to give you some foundational things." And I do a lot of training. I'm very direct, which is why I have that therapeutic life consultant title, like, I'm going to take my vast amount of experience, things that people are probably not going to experience, and help them build a security in themselves and, over time, prepare them for when they deviate from that.
I tell people, like, especially if you have loved ones still living, depression is never going to just leave. The concept...there's no cure. It's being able to be prepared for when things happen in life versus feel completely unprepared. I just came out of a season of grief of, you know, I walked away from a relationship, as well as then trying to still maintain my business, still trying to maintain my clients and those relationships, let alone the relationship with myself, and then put my cat down, you know, like, you know, he was a child. I had him for 14 years. So, like, life is going to continuously happen. So, I'm not trying to figure it all out, but I'm trying to get people back to a point where they can understand how to find security for themselves.
Since mental health has been such a taboo topic for a long time, there is quite a bit of backlog, and that's what we're seeing. I don't know what it looks like in y'alls countries, but here in America, there is this rush of people. I need a therapist. I need to go to therapy. And we're at a shortage. Therapists can't necessarily help all, like, at once. And we also have to maintain our own mental health, or we're not going to be very helpful to people.
But really, it comes down to how you build that security with yourself and know and not anticipate, but be prepared for when there's something else that happens that disturbs your own peace. Because if you have an understanding of what peace looks like for you, and you can't necessarily control it, but you can influence it, and facilitate it in your life, then you have a stronger foundation to be able to endure, you know, potential loss of a loved one, hopefully, no time soon for anyone here, or out, or listening, but it's just the reality.
And that's part of, you know, my story of, I experienced a lot of loss from a young age, and it worked against me for a long time because I had no idea how to process and regulate energy and emotion in my body. And so, what it looked like was me holding on to repressing anger, not having a relationship with the natural emotions that we can't get rid of. You can't get rid of emotions. I wish I could just, you know, vomit them out and just be done with it and be like, okay, cool. We can all be stable. That's just not...that's not going to happen. I think that also is what makes us, you know, a great species and building, you know, great things in this world is emotion.
Tech was built off of passion and emotion. Did it cause some disarray and probably hurt some people in the process? Yeah. But I think we can reduce that from happening if people understand emotional intelligence and not just work, work, work, work, work. It's a new age coming to that. And I've, hopefully, been working on myself enough to be able to sustain helping people understand and shifting over to that new type of perspective of we can't do things the way that we've been doing them. We just can't. It's not sustainable. The human species will suffer from it and the earth will as well.
SAMI: Yeah, thank you so much, and just for bringing that level of transparency and honesty. It resonates with myself, and I'm sure it will help so many other people who are listening. We could talk to you for hours. I mean, there is so much. And some things we just did not have time to get into. But thank you so much for the time that you've given us. And it's been really insightful to look at AI and tech that we work with as consultants at thoughtbot on a daily basis from this perspective and look at it from this angle. If people want to get a hold of you, where would be the best place?
SARA: Finding my website is a big thing. That's just, you know, kind of the portal. So, that's sarawilderlcsw.com. Sara without an H. And then, also, venturing into this tech world, I have an app interface now that I have put together to kind of be a centerpiece for mental health resources, not just, like, hotlines. That information is on my website as well.
But if you want to start doing your own work little by little, you know, it's a centralized spot without too much information. There's plenty of stuff you can Google about mental health. But this is vetted by me and organized to a point where, you know, one worksheet can make a difference, where you're just reflecting and taking, you know, 10-15 minutes to read through it and see how you can apply it in your life. It's called Power in Perspective.
SAMI: That's great. I definitely recommend you go and check it out and check out Sara on her website. And if you can get down to that conference, that is, again, in North Carolina, called CreativeVerse, you'll have the opportunity to hear Sara in person as well as Fatima from thoughtbot, who's also presenting.
If you learned nothing else from today, then just remember: fear has an acronym for F everything and run. I guess that's my big takeaway. You also got a chance to hear about my gaming addiction. No one tell my parents.
And you can find notes and a complete transcript for this episode at giantrobots.fm. If you have questions or comments, you can email us at hosts@giantrobots.fm. I always leave you the same challenge, and that challenge is to subscribe. We've got some great guests lined up, and you'll hear about it first if you subscribe. And feel free to leave any comments on Spotify or Apple Podcasts. We do check them all, and they're really helpful.
This podcast is brought to you by thoughtbot and produced and edited by Mandy Moore. Check her out at mandymoore.tech. Thanks for listening. See ya.
AD:
Did you know thoughtbot has a referral program? If you introduce us to someone looking for a design or development partner, we will compensate you if they decide to work with us.
More info on our website at: tbot.io/referral. Or you can email us at: referrals@thoughtbot.com with any questions.