Or, maybe I don't. I'm not sure. It feels exciting, but sometimes really bad ideas feel that way for the first few hours. The idea is a new treatment for people with mental health problems who could benefit from therapy. It would be kind of like a drug, but more like additional behavioral therapy outside the therapist's office.
The story would have to take place in the near future, where augmented reality is becoming truly viable but isn't yet widespread. The protagonist would be offered a trial therapy for her intrusive thoughts and self-destructive habits: maybe a chip or a brain implant, or maybe something as simple as an app. I imagine it would probably be something semi-invasive, but it could be an add-on to the protagonist's pre-existing augmentations. (That could explain why she's a good candidate for the trial.)
The treatment would be a sort of helpful psychological buddy -- a voice, a visual entity, or both, possibly more, tuned to whatever tends to work for the patient -- that monitors the environment and the patient's physiological and emotional state, and gives advice on how to respond in a healthy way.
It would be a better AI than the obvious first-level problems I could come up with as a writer. It wouldn't just try to make her happy all the time. It wouldn't, for example, undermine a grieving process to make her feel superficially better, but it would discourage unhealthy expressions of that grief, like drinking heavily or lashing out at other people.
It would remind her to drink water, to take a shower and brush her teeth, to do laundry -- but it would be more conscious than a to-do list of what she needs to hear in the moment. When she's perfectly motivated to get up on time, eat a good breakfast, and wash up, it wouldn't nag her about it. And it wouldn't issue commands. It'd just remind her, with stuff like "You'll feel better today if you take a shower," or "I know you're feeling down, so now's the time to take extra care of yourself, to help get through this rough patch."
This story would be hard to write, because every instinct I have when I think about adding conflict is to have the AI screw things up, but that approach defeats the purpose of the story -- then it's just "Psychiatric technology is bad, mmkay?"
The best conflicts I've come up with so far are these. One: the device starts to influence her in ways that make obvious sense to program but that conflict with her personal goals for treatment -- and not just in an "I identify with my illness in a self-destructive way" kind of way. Two: the company goes live with the commercial version of the treatment, and she notices herself being nudged toward questionable decisions that clearly benefit the pharmaceutical company and its affiliates. The voice might say, "That hat looks really good on you; it's okay to treat yourself once in a while." And she might notice, later on, that the company that makes the hat is owned by the same umbrella company that makes the AI.
Or it could just be conflict with people who think augmented-reality therapy is inherently bad -- they don't have to be right. Or it could be about one or more toxic relationships disintegrating as she starts to notice how often the AI points out that a friend is making her feel shitty or saying hurtful things on purpose.
I can't write this story right now, but I probably will eventually. In the meantime, if anybody wants to steal it, go for it. I've got no problem writing about stuff that other people also write about.