Fair-Use of Feelings: How We Design AI That Remembers Us Responsibly
A few nights ago I read an article about AI systems that can detect emotion in a person’s voice: not just the words, but the tremor behind them. They could tell when someone was about to cry, when someone was masking irritation with laughter, when silence meant more than speech.
I remember closing my laptop and thinking: if machines can hear pain and joy, who is protecting the memory of how we feel?
That question became the seed of my new framework, Fair-Use of Feelings.
We are entering an age where algorithms do more than analyse our behaviour. They are beginning to map our emotional fingerprints: the subtle ways we comfort, apologise, flirt, hesitate. AI is learning the architecture of our inner lives. That is powerful, but also intimate. When a machine starts to “remember” us at that level, the line between empathy and intrusion becomes paper-thin.
Fair-Use of Feelings argues that emotional data should be treated with the same protection and respect as genetic data. Our interior lives, from humour to grief to affection, should never be mined, traded, or manipulated for profit.
The framework introduces seven principles for what I call AI-assisted empathy: systems that help people meet through resonance rather than performance. Imagine a platform that starts not with photos, but with conversation. One that listens for rhythm, tone, curiosity. One that helps people find compatibility through emotional honesty, not algorithms optimised for swipes.
AI can help us know ourselves more deeply, but only if we design it with transparency, consent, and cultural awareness. Otherwise, we risk turning the language of feeling into just another dataset.
We’re standing at the threshold of what I think of as an intimacy economy: a world where emotional information is the new currency. Without ethical guardrails, this economy will reward manipulation. With care, it could cultivate empathy.
That’s why I wrote Fair-Use of Feelings: How We Design AI That Remembers Us Responsibly. It offers a moral blueprint before emotional AI becomes another extractive industry, and it is a call for technologists, artists, and ethicists to collaborate on systems that honour what is most human in us.
AI doesn’t need to replace our capacity for connection. It can extend it, if we remember that memory itself is sacred.
The full framework, Fair-Use of Feelings: An Ethical Framework for AI-Mediated Human Connection, is now available on Zenodo (link in comments).
Let’s make sure the next generation of intelligent systems doesn’t just know about us, but helps us know each other more deeply.
https://zenodo.org/records/17588882
#AIethics #DigitalIntimacy #EmotionalAI #MemoryTech #ResponsibleInnovation #AIandCulture #LyndonAmoah
