Common questions, honestly answered.
Is this therapy?
No. Stay is not therapy, and the AI is not a therapist. It does not diagnose, prescribe, or provide professional mental-health treatment. If what you're working through needs ongoing professional support, please see a real therapist. Stay can be useful before, between, or alongside therapy — but not instead of it.
How is this different from ChatGPT?
ChatGPT, like most general-purpose AI assistants, is designed to be maximally useful and engaging: to give you good answers and keep you coming back. Stay is the opposite. It runs on a system prompt designed specifically for emotional presence, communication, and crisis safety. It is built not to keep you. If anything, it tries to point you back toward the real people in your life.
Can I trust what the AI says?
Mostly, yes — and sometimes, no. The AI is a pattern-matcher trained on human writing about emotions, relationships, and healing. It can be wrong about your particular situation, and you should treat it like a thoughtful but imperfect friend. If something it says doesn't fit your reality, trust yourself first. We've designed it to challenge you when you're spiraling and to defer to you when the question is actually yours to answer.
Will my data be sold or used to train AI models?
No. We don't have a database of your conversations; they live encrypted on your device. The model that powers Stay (Anthropic's Claude) does not train on your conversations, per Anthropic's policy. Even if we wanted to read what you wrote, we structurally cannot. See the privacy page for the full architecture.
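For the technically curious, here is a minimal sketch of what "encrypted on your device" can look like in a browser, using the standard Web Crypto API. It illustrates the general technique under assumed names (makeLocalKey, encryptForLocalStorage are ours for this example), not Stay's actual code.

```ts
// A minimal sketch of client-side encryption with the Web Crypto API.
// Names are illustrative; this shows the technique, not Stay's code.

// The key is generated on the device and marked non-extractable,
// so it can be used locally but never exported or uploaded.
async function makeLocalKey(): Promise<CryptoKey> {
  return crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false, // non-extractable
    ["encrypt", "decrypt"],
  );
}

// Each save encrypts the conversation with a fresh random nonce.
// Only the ciphertext and nonce are written to local storage.
async function encryptForLocalStorage(
  plaintext: string,
  key: CryptoKey,
): Promise<{ iv: Uint8Array; ciphertext: ArrayBuffer }> {
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext),
  );
  return { iv, ciphertext };
}
```

In a design like this, the key never leaves the device, so there is no server-side copy of your words to read. That is what "structurally cannot" means.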
Why is this free? What's the catch?
There is no catch. Stay is free because the people who most need it can't pay for it. The plan to make it sustainable is to find institutional funding — foundations, mission-aligned sponsorships, possibly opt-in donations — never to charge people who came here for help.
Will the AI remember me when I come back?
Yes, on the same device. Your conversation is encrypted and stored locally for 90 days by default. When you return, you can pick up where you left off or start fresh. Cross-device sync (with a recovery phrase only you hold) is coming.
What if I'm in crisis right now?
If you're in immediate physical danger, call 911. If you're having thoughts of suicide or self-harm, please call or text 988, the Suicide & Crisis Lifeline: real, trained humans, available 24/7, free and confidential. Stay can be a bridge to those resources, but it is not a substitute for them.
Can I delete my conversations?
Yes. Anytime, with one button. Go to settings to delete a single conversation or everything stored on your device.
Why does the AI sometimes disagree with me?
Because a friend who only ever agrees with you is not actually being your friend. The system prompt explicitly instructs the AI to push back gently when you might be telling yourself a story that's not quite true. It will always be warm about it. If you ever feel it pushed back wrongly, trust yourself — the AI doesn't know your life.
Is Stay watching me or listening through the microphone?
No. Stay only sees what you type. It has no access to your microphone, camera, location, contacts, or anything outside the conversation you choose to have with it.
What if the AI says something harmful or wrong?
Please email hello@thestay.app and tell us what happened. We take this seriously: the system is designed to prevent exactly that, but no system is perfect. Your feedback is how we make it better.
Are conversations shared with anyone else?
No. Not with us. Not with researchers. Not with anyone. Your conversations exist between you and the AI on your device. To generate each response, your messages pass through Anthropic's servers (this is unavoidable for any cloud-based AI product); they are retained there for at most 30 days and are never used for training. We never see them.
Didn't find what you needed? hello@thestay.app