Introduction
Author Mark Manson, known for "The Subtle Art of Not Giving a F*ck," has developed an AI self-help application called Purpose. Manson sees potential for AI in the self-help sector, noting that individuals increasingly turn to chatbots for personal guidance.
Context: AI and Self-Help
Recent surveys indicate that 8% of U.S. adults who use chatbots use them to discuss personal issues. Psychology experts suggest AI can provide an avenue for individuals lacking close social connections, particularly those in marginalized communities where discussing mental health may be culturally sensitive.
Research on the mental health benefits of AI chatbots has yielded mixed results. While some studies show positive outcomes, others report negative incidents. Experts note that despite advancements in making chatbots helpful, they may lack the subtlety that genuine therapeutic interaction requires, potentially isolating users who need firm guidance.
About the Purpose App
Purpose is positioned as a personalized, enhanced version of a self-help book, explicitly not a replacement for professional therapy. Manson and co-founder Raj Singh aim for the app to provide a "self-reflecting, almost healing kind of experience" at a significantly reduced cost compared to traditional therapy or coaching. The developers intend for Purpose to occupy the space between expensive therapy and independent journaling.
Purpose incorporates specific user memory and robust data privacy measures, distinguishing it from general chatbots. Manson states that the app is designed for depth, challenging users rather than simply affirming them, addressing concerns about reinforcing unhelpful behaviors.
Reviewer's Experience
The reviewer, with limited prior experience of self-help or AI, tested the Purpose app for one week. The app generated a user profile based on responses to questions about values, life satisfaction, and personal experiences, subsequently creating a descriptive user identity.
Identified Limitations of Purpose
- Repetitiveness: The app frequently repeated the user's statements, sometimes accompanied by expletives or emphatic phrases.
- Addictive Design: The app's structure, involving constant questioning and affirmations, was perceived as designed to encourage continuous engagement.
- Therapy-Speak: Purpose frequently used clinical terminology.
- Overdramatic Language: The app often employed exaggerated phrases to describe user insights.
- Questionable Moral Compass: The app suggested donating to a disliked cause as a potential consequence for failing to meet a goal, which the reviewer rejected as a motivator.
Identified Benefits of Purpose
- Challenges Users: The app engaged in persistent questioning and pushed back on the reviewer's statements and suggestions, drawing on previous conversational patterns.
- Self-Correction: Purpose demonstrated the ability to recall prior interactions and adjust its approach based on new user input, such as canceling a scheduled check-in after the reviewer expressed discomfort with the term "accountability."
- Non-Human Interaction: The reviewer noted a greater willingness to be unfiltered and assertive with the AI compared to a human therapist, facilitating breakthrough insights.
- Action-Oriented Approach: While the app did not uncover new psychological obstacles, it assisted the reviewer in developing actionable strategies for motivation.
Conclusion
Experts advise approaching AI self-help tools with a critical perspective, advocating for users to challenge the AI and expect it to do the same. The reviewer concluded that, when used with this mindset, Purpose can be a valuable tool for generating initial steps toward problem resolution.