AMT Lab @ CMU


#SelfCare: A Nontraditional App for Managing Stress

Cover photo: Screenshot of #SelfCare app from TRU LUV.

Introduction

#SelfCare is an application designed with a gamification approach to self-care. It was named one of Apple’s best self-care apps of 2018, when self-care was also named one of Apple’s top trends. Unlike most apps, #SelfCare has no advertisements, does not ask for a subscription when signing up, and sends out few notifications, which can also be turned off easily. It is difficult to obtain the number of downloads from the App Store, but as of 2019, it had more than two million downloads. This review looks at the application in a post-pandemic context wherein being alone is the new norm, at least for the time being.

The Basics

The app helps build self-care rituals by transmuting emotional responses and extending time spent with oneself through micro-tasks that let the body find balance and the mind step away from the source of a troubling stimulus: a bad day at work, family or relationship troubles, an existential crisis, or an exciting or overwhelming incident. Using gamification, the app lets users accomplish common self-care tasks in a digital room.

The app becomes a companion in itself: in the middle of a rough day at work, one can sneak away to spend time with this digital self, which seems relaxed and less overwhelmed, in effect becoming a totem. The AI comes into play in the transmutation of emotional responses, an area of increasing interest, by opening a broad emotive channel that intercedes between a trigger and the immediate response.

The app was developed by Brie Code, who has decades of experience in the more intensive, action-oriented gaming industry. She brings that experience to bear on systems thinking, systems design, and AI architecture in the interactive space. Her personal experience with marginalization and post-traumatic growth informs the app’s development and positions it to help others heal.

The Premise

The developers’ intention was to introduce an AI companion—a friend, not a boss or an assistant. The friend has not shown up in the app yet, but considering that the app is now more than three years old, I am guessing that the friend is still being developed. The latest updates on the website reveal plans for deepened rituals that cultivate deeper resilience, as well as an AI companion who may help you better know yourself and your talents. A simplified representation follows.

Figure 1: Model of emotional responses cultivated by the app. Source: Author.

Transmuting emotional responses to boredom, anxiety, PTSD, and related mental states into a relaxed state requires a great deal of work and a layering of rituals, along with nudges that divert the mind from its triggers. The app achieves this in an interesting way. At first the room feels very small and the desire is to escape it, but within a few minutes you become aware of this body (supposedly yours) that cannot leave. So for now, the people who have downloaded the app and integrated it into their day have been alone in the room for years, trying to heal and waiting for a friend.
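To make the layering idea concrete, here is a toy sketch (my own illustration, not the app’s actual implementation or any disclosed algorithm) of how repeated small rituals could nudge an emotional state toward calm, with each micro-task closing only a fraction of the remaining distance rather than demanding a single decisive “win”:

```python
# Hypothetical model: "arousal" is a made-up score where 1.0 is highly
# anxious and 0.0 is the calm baseline the rituals nudge toward.
CALM_BASELINE = 0.0

def complete_ritual(arousal: float, nudge: float = 0.3) -> float:
    """Move the current arousal level a fraction of the way toward calm."""
    return arousal + (CALM_BASELINE - arousal) * nudge

# Layering rituals: each micro-task (laundry, journaling, breathing)
# settles the state a little further, with no overshoot and no "victory".
state = 1.0  # an anxious start to the day
for _ in range(5):
    state = complete_ritual(state)
print(round(state, 3))  # → 0.168
```

The design point the sketch captures is that no single ritual resolves the feeling; the cumulative, asymptotic drift toward baseline is what makes the experience gentle rather than goal-driven.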

Inside the app

The app puts the user in a room all by themselves, on a bed, under a duvet, with just an arm extending out. Most textures in the room (including the duvet covers) can be customized from an existing set of icons and patterns. Each time I open the app, the room begins littered with things that seem very personal: clothes, shoes, a journal, framed photos, a cat, and trinkets of all kinds. The user can engage with anything they want, or with nothing at all; the app never forces engagement.

Within a few seconds, the room evokes a sense of mess, and I am drawn to clear the clutter, starting with the laundry (a mini-game within the game). One can also play with the cat or write in the journal; I also pulled out tarot cards. While every activity has an element of gamification and reward, at no point does it escalate into a sense of victory or defeat. The rituals are personal: things that bring a sense of organization into a disorganized, anxious life. There is something for everyone, including breathing rituals, which help make every moment within the app engaging.

Reviews & Ratings

The app receives largely positive reviews on the App Store. People find resonance with the many small things that help them overcome overwhelming feelings and make peace with the space they are in. The app has also received significant criticism: users were sometimes offended or triggered by the contents of the room, some found it difficult to delete once installed, and others could not find meaning in everything and raised concerns about the lack of freedom to leave the room, almost feeling trapped.

User Interface

Designed as an antithesis to a traditional game, the app is difficult to classify. Its structure seems to mimic the relationship we already have with our phones in order to extend the time we spend away from triggering sources on social media or in the news. The colors are muted, and the room seems filled with late-morning sunshine filtering through blinds.

Figure 2: Screenshots from within the #SelfCare app. Source: Author.

Privacy & User Rights

The app is available from both the App Store for iPhones and the Google Play store for Android phones. The terms-of-service page on TRU LUV’s (the developer’s) website reveals that the app is not intended for children under the age of 16. It also reveals that user data (such as health, heart rate, and data from onboard device sensors) is used by the app to refine the experience; it does not, however, specify how this data is used in its algorithm. TRU LUV prohibits anyone from reverse engineering or hacking its app or service, which also means that no one is legally allowed to probe or test for vulnerabilities. While this may protect system integrity, it raises questions for ethical hackers who might want to probe and help establish the service’s security.

TRU LUV carefully limits its liability to exclude any loss of data, revenue, profits, business opportunities, time, or litigation expenses. In doing so, the app positions itself for users who understand these technicalities. For a service that seeks to build trust (across its tweets, in-app prompts, and in-app purchases), however, it does not seem to place much trust in its community of users.

Social Features

The organization maintains a social presence on Instagram, Facebook, and Twitter, and on all of these platforms it keeps a consistent voice, with mostly calming and affirming messages. The activity is sporadic and branches into micro-conversations. I suspect the intention to give people an isolated, comfortable self-care experience may fundamentally stand in the way of building a community, which would be the antithesis of that experience.

Accessibility

Since the app requires a smartphone, it can be used by anyone who has access to one. The app does not communicate any device or platform requirements, which could be frustrating for people with slightly older phones or those outside the Apple and Google ecosystems. Within the app, I did not find any accessibility issues. However, as pointed out in some App Store reviews, the items in view could either calm someone or trigger them, depending on the associations individual users have with them.

Final Thoughts

#SelfCare does not feature a closed community either in the app or outside of it, so while it has a captive audience, it does not really aim to form a communication network. Its follower growth has been stagnant for some time, but it looks like the team is regrouping to bring a more refined experience soon. I am intrigued by the transmutation capability of the AI: the ability to insert inputs that could change the outcome of a process or emotional phase is perhaps a significant frontier for AI systems to achieve.

While TRU LUV, the organization behind the app, presents itself as “a network of dreamers, witches, and technologists who believe in caring technology,” it is hard to make that argument while being less than transparent about how data is collected, exactly what it is being used for, and how it impacts the larger world. The app thus reduces itself to fantasy: a cool game with no real intention to make an impact on mental health. Instead, it is an entertaining experience for some and a triggering nightmare for others.

Post-pandemic, I would love for the on-screen window blinds to be pulled up to reveal the outside, just enough to know that the world out there still exists. After so much time locked in our homes during the pandemic, the window has suddenly become important, perhaps even more than the door. I have yet to be joined in the room by the promised AI companion, hopefully one with stories from faraway places.


+ Additional Resources

Western Governors University. “Ethical Hacking And How It Fits With Cybersecurity.” Accessed March 24, 2021. https://www.wgu.edu/blog/ethical-hacking-how-fits-with-cybersecurity1908.html.

TRU LUV. “Press Kit.” Accessed March 24, 2021. https://www.truluv.ai/press-kit/#assets.

TRU LUV. “Privacy Policy.” Accessed March 24, 2021. https://www.truluv.ai/terms-of-service/.

TRU LUV. “We’re an Emanant Organization.” Accessed March 24, 2021. https://www.truluv.ai/our-organization/our-ecosystem.

“‎#SelfCare - Ratings and Reviews.” Accessed March 24, 2021. https://apps.apple.com/us/app/selfcare/id1378384555.

Potamianos, Alex, and Shri Narayanan. “Why Emotion AI Is the Key to Mental Health Treatment | Transforming Data with Intelligence.” Accessed March 24, 2021. https://tdwi.org/articles/2020/04/07/adv-all-why-emotion-ai-key-to-mental-health-treatment.aspx.

Sikaneta, Pumulo. “AI Won’t Go Anywhere Unless It Has Empathy.” VentureBeat (blog). September 18, 2017. https://venturebeat.com/2017/09/18/ai-wont-go-anywhere-unless-it-has-empathy/.

Cook, Jonathan. “It Isn’t Emotional AI. It’s Psychopathic AI.” Medium. August 14, 2018. https://medium.com/@JonathanCCook/it-isnt-emotional-ai-it-s-psychopathic-ai-63a910282985.

Kleber, Sophie. “3 Ways AI Is Getting More Emotional.” Harvard Business Review, July 31, 2018. https://hbr.org/2018/07/3-ways-ai-is-getting-more-emotional.

MIT Sloan. “Emotion AI, Explained.” Accessed April 6, 2021. https://mitsloan.mit.edu/ideas-made-to-matter/emotion-ai-explained.