What we know about you. What we don't.

In one paragraph:

No account. No tracking. Conversations live encrypted on your device and auto-delete after 90 days. We can never read them. The AI runs on Anthropic's Claude — your message goes through their servers to generate a reply, then is deleted within 30 days and never trains the model.

The technical architecture below exists so that the promises above are not just promises: they are things we are structurally unable to break.

You don't need an account

You don't need to give us your name, email, or any identifier to talk to Stay. Open the page, type. That's it.

Where your conversations live

By default, your conversations live only on your device, encrypted, and delete themselves after 90 days. Nothing is sent to our servers for storage.

You can change this in settings (coming soon). Choose to keep nothing, keep them longer, or back them up across devices with a recovery phrase that only you hold.
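To make the default concrete, here is a minimal sketch of on-device storage with a 90-day expiry, using the browser's Web Crypto API. The storage backend (localStorage) and the device-local AES-GCM key are illustrative assumptions for the sketch, not a description of Stay's actual implementation.

```typescript
// Sketch: on-device encrypted storage with a 90-day expiry.
// The device-local AES-GCM key and localStorage backend are illustrative
// stand-ins, not Stay's actual storage layer.

const NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

interface StoredConversation {
  savedAt: number;        // epoch ms, used for auto-delete
  iv: number[];           // AES-GCM nonce
  ciphertext: number[];   // encrypted conversation bytes
}

async function deviceKey(): Promise<CryptoKey> {
  // In practice the key would be generated once and kept on the device;
  // generating it here keeps the sketch self-contained.
  return crypto.subtle.generateKey({ name: "AES-GCM", length: 256 }, false, [
    "encrypt",
    "decrypt",
  ]);
}

async function saveConversation(id: string, text: string, key: CryptoKey) {
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(text)
  );
  const record: StoredConversation = {
    savedAt: Date.now(),
    iv: Array.from(iv),
    ciphertext: Array.from(new Uint8Array(ciphertext)),
  };
  localStorage.setItem(`conv:${id}`, JSON.stringify(record));
}

function pruneExpired() {
  // Delete anything older than 90 days; would run on every app launch.
  for (const key of Object.keys(localStorage)) {
    if (!key.startsWith("conv:")) continue;
    const record: StoredConversation = JSON.parse(localStorage.getItem(key)!);
    if (Date.now() - record.savedAt > NINETY_DAYS_MS) {
      localStorage.removeItem(key);
    }
  }
}
```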

What we can never see

We do not have a server-side database of your conversations. Even if we wanted to read what you wrote, we structurally cannot. Even if a court ordered us to disclose your data, we have nothing to give them.

What goes through Anthropic

The AI you're talking to runs on Anthropic's Claude. Your message is sent to Anthropic's servers in order to generate a response. This is the only path that exists for you to talk to Stay.

Anthropic's policy: they do not train their models on your conversations, and they delete API request logs within 30 days (except where they suspect abuse). Your messages are encrypted in transit with TLS.

We do not store the API request or response on our side. We never see the content.
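As a sketch of what that stateless relay looks like: the message is forwarded to Anthropic's Messages API and the reply is handed straight back, with nothing written to disk or a database. The endpoint and headers below are Anthropic's documented public API; the handler shape and model name are illustrative assumptions.

```typescript
// Sketch: a stateless relay to Anthropic's Messages API.
// Nothing is written to disk or a database; the content only exists in
// memory for the lifetime of the call. Handler shape and model name are
// illustrative; the endpoint and headers are Anthropic's public API.

export async function relayMessage(userMessage: string): Promise<string> {
  const response = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "x-api-key": process.env.ANTHROPIC_API_KEY!, // server-side secret
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model: "claude-sonnet-4-5", // illustrative model name
      max_tokens: 1024,
      messages: [{ role: "user", content: userMessage }],
    }),
  });

  const data = await response.json();
  // Return the reply to the client without logging userMessage or data.
  return data.content[0].text;
}
```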

What you can delete

Anything, any time, with one tap. No verification, no second-guessing.

If you share a device

Choose "don't save anything" mode (coming in settings). Use the quick-exit button in the top right of any page — it takes you to a neutral website and clears your screen.
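For the curious, this is roughly what a quick-exit handler does. The destination URL and what counts as on-screen state are illustrative assumptions; the detail that matters is location.replace(), which swaps the current history entry so the back button does not return to the conversation.

```typescript
// Sketch: quick-exit handler. Destination and cleared state are
// illustrative; location.replace() is what keeps the conversation
// out of the browser's back history.

function quickExit(): void {
  // Clear anything visible or recoverable in the current tab.
  document.body.innerHTML = "";
  sessionStorage.clear();

  // Navigate to a neutral site, replacing the current history entry.
  window.location.replace("https://www.google.com");
}
```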

What we don't track

What happens if a court asks

We will comply with valid legal process. By design, what we can produce is minimal: there is no encrypted blob of your conversation on our servers (unless you've explicitly enabled cloud backup, and even then, the blob requires your recovery phrase to decrypt).
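A sketch of why that backup blob is useless without the recovery phrase, assuming a PBKDF2-based key derivation (the parameters and salt handling here are illustrative): the key is derived on your device and never uploaded, so what reaches our servers is opaque ciphertext.

```typescript
// Sketch: deriving the backup encryption key from a recovery phrase.
// PBKDF2 parameters and salt handling are illustrative assumptions;
// the property that matters is that the key never leaves the device,
// so the uploaded blob cannot be decrypted by us or by anyone we could
// be compelled to hand it to.

async function keyFromRecoveryPhrase(
  phrase: string,
  salt: Uint8Array
): Promise<CryptoKey> {
  const material = await crypto.subtle.importKey(
    "raw",
    new TextEncoder().encode(phrase),
    "PBKDF2",
    false,
    ["deriveKey"]
  );
  return crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 600_000, hash: "SHA-256" },
    material,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"]
  );
}

async function encryptBackup(phrase: string, conversations: string) {
  const salt = crypto.getRandomValues(new Uint8Array(16));
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const key = await keyFromRecoveryPhrase(phrase, salt);
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(conversations)
  );
  // Only salt, iv, and ciphertext would be uploaded; none of them are
  // usable without the recovery phrase.
  return { salt, iv, ciphertext: new Uint8Array(ciphertext) };
}
```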

We will never weaken our encryption or add backdoors. If pressured to do so, we would shut down the affected service before complying.

Will this ever change?

If we ever change anything that affects how your conversations are stored or who can see them, we will tell you before the change takes effect, in plain English, on this page.


Trust is the precondition for what we do. We design for the version of trust that doesn't depend on us being good people.