My older sister introduced me to the Finch app 537 days ago (in case you haven't heard of it, it's like a personalised wellbeing tamagotchi that you can connect with friends on), and it's been a really positive experience for helping me remember the habits I'm trying to build and stick to. P.S. I only know it's 537 days because the app counts your "streak" if you use it every day, and I've really surprised myself by managing to stick with it daily for that long.
Goblin.Tools I came across much earlier in my journey, as a sole trader researching websites and apps to support the neurodivergent youths I work with or provide services to. To my surprise, I found it (personally) immensely helpful for reducing overwhelm and lowering the difficulty of tasks that require more thinking than my brain can sometimes manage.
But the controversial one I'm really interested in is the use of AI large language models (like ChatGPT).
The vast majority of people I've been brave enough to ask or speak to about this are staunchly against using AI for anything related to mental health. It's known to be a poor substitute for speaking to a real person (much less a professional); over-reliance on such tech is widely believed to weaken one's natural "thinking muscles"; and if that's not bad enough, the ethical considerations and very concerning privacy issues are valid things to worry about too.
But, with that said, it's now been around long enough that it's become almost normalised for "neurotypicals" to use AI as a "smart assistant" to improve productivity, polish draft thoughts or ideas, and generally compensate for all kinds of personal weaknesses or limitations (helping them think, write, calculate, plan, learn, and create things more quickly and easily).
I feel like use of AI by neurodivergent minds for the exact same purposes (for example, someone with CPTSD trying to find easier ways to function day-to-day or to compensate for their symptoms) inherently attracts a lot more judgement than the identical scenario involving someone who doesn't identify with a past and/or present experience of trauma or complex trauma.
Very curious what this community thinks of this topic. The crossover between AI and mental health is a very recent development (in the greater scheme of things), and it's definitely a very broad and very polarising subject...
Personally, I can't make up my mind whether I'm more for or against such technologies/tools being used for this purpose. I see so many risks and downsides, but also so many positives when they're used thoughtfully and with discernment and caution. On the one hand these tools can be genuinely dangerous, but on the other they can be a real game changer for some of us too.
So yeah, curious whether anyone on here uses any of the tools/apps I've mentioned (or others I maybe haven't heard of?) to manage and/or cope with your own unique experience. I'd love to hear your opinions whether or not you use tech like this, and if you do (or have), I'd love to learn what you found helpful (or not), what you like (or don't), and where you draw the line/boundaries to keep yourself (and others) safe when using AI or phone apps.
Thanks