Emotional Memory for AI

The AI finally knows
how you feel.

Resonance reads the emotion behind your words, remembers who you are, and gives any AI the awareness to respond to how you actually feel — not just what you said.

# Windows (PowerShell)
irm https://install.resonance-layer.com/install-win.ps1 | iex

# macOS
curl -fsSL https://install.resonance-layer.com/install-mac.sh | sh

# Linux
curl -fsSL https://install.resonance-layer.com/install-linux.sh | sh

# Python (pip)
pip install resonance-layer

Named after Jody — she walks into a room and just knows.


What it does

Everything happens
invisibly.

You write. Resonance reads — not just the words, but the weight behind them. It detects emotion, tracks how you are doing over time, and tells the AI everything it needs to know before it responds. You do nothing differently. The conversation just feels different.

🔍 Reads underneath

Detects the exhaustion behind a short reply. The anxiety inside a polite question. The things you feel but would never think to say out loud.

🧠 Remembers you

Builds a living profile over time. Notices patterns. Learns that you process difficult things through humour, or that your energy drops in the evenings.

Works with any AI

Runs fully embedded. No external server. No setup. Three lines of code and any LLM gets emotional awareness — Claude, GPT, Mistral, anything.

🔬 Grounded in science

Eight psychology frameworks baked in — DBT, Window of Tolerance, PERMA, Wise Mind and more. Not guessing. Research.


"Text doesn't carry emotion. You know what you meant when you wrote it — but the AI only sees the words. Resonance changes that."
Resonance is not a therapist. It is a mirror.
For developers

Three lines.
Any LLM.

Add emotional awareness to any AI in minutes. Resonance handles detection, profiling, and context injection. You handle the conversation.

from resonance import Resonance

# Initialise once per user
r = Resonance(user_id="your-user-id")

# Process each message before your LLM sees it
message = "I've been so anxious about this"
context = r.process(message)

# Pass emotional context into your LLM
llm.chat(system=context.to_prompt(), message=message)
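To see where the injection step fits in a full conversation loop, here is a self-contained sketch. Note that the `Resonance` and `EmotionalContext` classes below are simplified stand-ins that only mimic the shape of the API shown above (`process()`, `to_prompt()`); the real library's detection and profiling are far richer, and `chat_turn` is an illustrative helper, not part of the package.

```python
class EmotionalContext:
    """Stand-in for the context object returned by process()."""

    def __init__(self, summary: str):
        self.summary = summary

    def to_prompt(self) -> str:
        # The real library would emit a much richer system prompt here.
        return f"Emotional context for this user: {self.summary}"


class Resonance:
    """Stand-in mimicking the constructor shown above."""

    def __init__(self, user_id: str):
        self.user_id = user_id

    def process(self, message: str) -> EmotionalContext:
        # Toy detection for illustration only; the real library
        # analyses tone and tracks the user's history over time.
        summary = "anxious" if "anxious" in message.lower() else "neutral"
        return EmotionalContext(summary)


def chat_turn(r: Resonance, llm_chat, message: str) -> str:
    # 1. Analyse the message before the LLM sees it.
    context = r.process(message)
    # 2. Inject the emotional context as the system prompt,
    #    then hand the original message to the LLM unchanged.
    return llm_chat(system=context.to_prompt(), message=message)
```

The key design point is the ordering: every user message passes through `process()` first, so the LLM receives the emotional context as system-level instructions before it ever generates a reply.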