Why I’m building a local‑first AI companion (AICO)
I’ve been thinking about virtual companions for most of my life.
Not productivity tools. Not chatbots that answer one question and disappear. A companion – something that feels present, remembers me, and comes along with me over time. I’m 53 now. The idea has followed me since childhood.
Back then, it was a fantasy: some mix of science fiction characters, helpful robots, and the feeling of having a friend who is *always* there. Today, the technology to actually build something like that exists. But most of what we see are SaaS chatbots that live in someone else’s cloud, operate on someone else’s terms, and treat “you” as a stream of prompts in a database.
AICO is my attempt to build the companion I wanted all along – in a way that respects privacy, gives people control, and takes the architecture as seriously as the emotional side.
This article is about why.
Companions, not just tools
Most AI products are designed around *productivity*. They help you write faster, summarise documents, draft emails, or generate code. There’s nothing wrong with that – I use those tools too. But if you zoom out, they are still tools you *use* rather than something that sits *with* you over time. The companion I’ve always wanted has a different center of gravity:
It cares about continuity, not just one-off prompts.
It builds a relationship, not just a session.
It is a presence, not just a box you type into.
That’s why AICO is framed as a companion, not a chatbot. The technical work matters a lot (and we’ll get to that), but the starting point is quietly radical: what if the primary job of the AI is to be a *friend* first, and “useful tool” second? Useful things will emerge from that – planning, reflection, coaching, maybe even some creative chaos – but the core is relational.
A childhood idea that never left
If you talk to people working in AI, you’ll often find a personal story underneath: games they played as kids, science fiction they read, early computers that felt magical. For me, it was the idea of a persistent virtual being. Something that:
Doesn’t reset every time I close an app.
Remembers what we did together last week, last year.
Has its own “mood” and internal state, not just a stateless interface to a model.
I never had the right technology to build this in earlier decades. We didn’t have the models, the hardware, or even the language to talk about these systems in the way we do now. But the idea stayed. When local models, better hardware, and modern tooling matured, the question became:
If not now, when?
And if I’m going to build it, how do I do it in a way that I’m proud of?
That’s the emotional root of AICO.
Roaming and embodiment – not trapped in a device
Another piece of this long-running fascination is roaming.
The companion I imagined as a kid wasn’t stuck inside a single screen. It could “come with me”: from computer to phone, from desk to living room, from home to work. And, in the longer run, why stop at screens at all? The same being could live in a robot on your desk, be projected as a hologram, or show up wherever there’s a way to render it. Physically it’s still pixels and hardware, of course – but conceptually it’s one being moving through different contexts. That’s why embodiment matters to me.
In AICO, this shows up in practical ways:
A cross-platform app, so the same companion can live on different devices.
A 3D avatar whose animations and emotional expressions match the conversation.
A visual aura and state system that signals when AICO is listening, thinking, speaking, or just quietly present.
It’s early, and there’s much more I want to explore here – but the principle is simple: AICO shouldn’t feel trapped in a single web tab. It should feel like *it* is the thing, and the devices are just windows into it.
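As a rough illustration of the state-and-aura idea, here is a tiny sketch – the state names and colours are my own placeholder assumptions, not AICO's actual implementation:

```python
from enum import Enum, auto

class PresenceState(Enum):
    """Illustrative companion states (placeholder names, not AICO's actual ones)."""
    IDLE = auto()       # quietly present
    LISTENING = auto()  # taking in what you say
    THINKING = auto()   # working on a response
    SPEAKING = auto()   # delivering a response

# Hypothetical mapping from state to an aura colour the avatar could render.
AURA_COLOURS = {
    PresenceState.IDLE: "#6b7280",       # muted grey
    PresenceState.LISTENING: "#3b82f6",  # blue
    PresenceState.THINKING: "#f59e0b",   # amber
    PresenceState.SPEAKING: "#10b981",   # green
}

def aura_for(state: PresenceState) -> str:
    """Pick the aura colour that signals the current state."""
    return AURA_COLOURS[state]

print(aura_for(PresenceState.LISTENING))  # "#3b82f6"
```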
Why local‑first matters
Building a companion is exciting. Building a confidante is something else entirely. If you treat AICO as a confidante – something you can tell the truth to, think out loud with, revisit difficult topics with – then privacy stops being a feature and becomes the foundation. The more intimate and continuous a system becomes, the more uncomfortable it is to have it run in opaque infrastructure you don’t control. That’s true for individuals. It’s doubly true for teams and organizations.
That’s why AICO is local‑first. In practical terms, that means:
You run AICO on your own infrastructure – on your machine, in your lab, or in your own cloud account. After installation, it can run entirely on your off-the-shelf hardware, including the language model, without needing a constant internet connection.
Data is stored in encrypted form and kept under your control.
The different parts of the system talk to each other over a secure internal channel, rather than constantly reaching out to third-party services. It’s designed so your conversations don’t have to leak all over the internet just to be processed, and any optional external services are exactly that – optional.
You can inspect the code, understand the architecture, and adapt it.
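To make "local by default, optional everything else" a bit more concrete, here is a minimal hypothetical configuration sketch; the keys, paths, and values are illustrative assumptions, not AICO's real configuration schema:

```python
# A minimal, hypothetical local-first configuration sketch (not AICO's real
# config schema): everything runs on your own machine by default, and any
# external service has to be switched on explicitly.
local_first_config = {
    "llm": {
        "runtime": "local",              # language model runs on your own hardware
        "model_path": "~/aico/models/",  # illustrative path
    },
    "storage": {
        "encrypted": True,               # data at rest stays encrypted
        "location": "~/aico/data/",      # kept under your control
    },
    "external_services": {
        "enabled": False,                # optional, and off by default
    },
}
```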
There are good SaaS tools out there. But for something that is meant to sit close to your thoughts, notes, and feelings, I want a different trust model. Local-first is how we get there. The fact that AICO is built in Switzerland is not an accident either. European-style privacy expectations, data residency concerns, and a general cultural respect for boundaries align well with what I want this project to stand for.
Memory as the core of the relationship
If AICO is meant to be a companion, memory is not a feature – it’s the core of the relationship. In AICO’s architecture, that shows up as a layered memory system – inspired by how human memory works – rather than one big bucket of “stuff I once said”:
A kind of short-term memory for the immediate conversation and transient details.
A deeper memory for things that are worth bringing back later – recurring themes, topics, and interests.
A map of your world that keeps track of people, projects, events, and how they relate to each other over time.
If you’ve seen the way psychologists talk about working memory, long-term memory, or mental maps, you’ll recognise some inspiration here – but you don’t need to know any of that to use AICO.
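For the technically curious, here is a deliberately simplified sketch of what such a layered store could look like; the names and structure are assumptions for illustration, not AICO's actual memory implementation:

```python
from dataclasses import dataclass, field

@dataclass
class LayeredMemory:
    """A toy model of the three layers; hypothetical, for illustration only."""
    working: list[str] = field(default_factory=list)                # the immediate conversation
    long_term: dict[str, list[str]] = field(default_factory=dict)   # theme -> things worth keeping
    world_map: dict[str, set[str]] = field(default_factory=dict)    # entity -> related entities

    def note(self, text: str) -> None:
        """Hold a transient detail from the current conversation."""
        self.working.append(text)

    def promote(self, theme: str, text: str) -> None:
        """Keep something for later, filed under a recurring theme."""
        self.long_term.setdefault(theme, []).append(text)

    def relate(self, entity: str, other: str) -> None:
        """Record that two things in your world are connected."""
        self.world_map.setdefault(entity, set()).add(other)
        self.world_map.setdefault(other, set()).add(entity)
```

In reality, deciding what to promote, what to relate, and what to let fade is the hard part – the sketch only shows the separation between transient, durable, and relational memory.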
On top of this, there’s a concept I call the Memory Album – a way to capture and revisit meaningful moments, highlights, and emotionally important exchanges. Not everything should be remembered, and not everything that is remembered should be treated the same way.
Over time, the way AICO remembers and behaves is shaped by your interactions. The Adaptive Memory System (and emotion simulation) slowly adjusts what it pays attention to and what it surfaces. It learns what tends to matter to you and what can safely fade into the background. The result is that your AICO becomes unique – not interchangeable with anyone else’s.
Under the hood, this involves several ways of organising and retrieving information – deciding what to keep close, what to archive, and what to bring back to you at the right moment. But from a user’s perspective, the intent is simple: AICO should remember you, not just respond to prompts – and your copy of AICO should become *yours*.
That’s where the “friend, not tool” idea becomes concrete. Friends remember. Tools reset.
Serious architecture for a deeply personal use case
It might sound paradoxical, but I think that building something emotionally warm requires serious engineering. If you want a companion to stick around, it needs to be:
Reliable enough to run for a long time.
Observable enough that you can see what it’s doing.
Extensible enough that new ideas can be integrated without tearing everything down.
That’s why AICO looks more like a platform than a single script. Behind the scenes, there is a clear separation of concerns, a central channel that connects the different parts of the system, and room to plug in new capabilities without breaking everything else. There is also an evolving set of tools to see what AICO is doing and to adjust its behaviour over time – more like tuning an instrument than flipping a single switch.
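As a sketch of the "central channel" idea – purely illustrative, with made-up topic names rather than AICO's real internal API – a minimal publish/subscribe bus looks something like this:

```python
from collections import defaultdict
from typing import Callable

class MessageBus:
    """A toy in-process message channel; AICO's real internal bus will differ."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a handler for all future messages on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        """Deliver a message to every handler subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(message)

# Usage: the conversation engine publishes a turn; memory and avatar react,
# without either side needing to know about the other. Topic names are invented.
bus = MessageBus()
bus.subscribe("conversation.turn", lambda m: print("memory stores:", m["text"]))
bus.subscribe("conversation.turn", lambda m: print("avatar reacts to:", m["mood"]))
bus.publish("conversation.turn", {"text": "Hello again", "mood": "warm"})
```

The value is decoupling: the conversation engine doesn’t need to know which modules care about a turn; memory, emotion simulation, and the avatar can each subscribe independently.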
All of this might feel “overbuilt” if you just want a quick demo. For me, it’s the opposite: if this thing is going to sit close to people’s lives and thoughts, it deserves an architecture that can grow with them.
Why open source, and why a commercial path
AICO is open source because I want people to be able to inspect, learn from, and build on top of it.
The core of the system – the architecture, memory systems, emotion simulation, agency, conversation engine, and frontend – is available under a permissive license. That’s important for trust, for adoption, and for letting others shape what this can become. At the same time, I’m realistic: to sustain this work and take it seriously over many years, there needs to be a way to fund it.
That’s where the commercial side comes in:
AICO Core – the open-source base that anyone can run and modify.
AICO Pro – for individuals and small teams who want help deploying, configuring, and operating AICO on their own infrastructure.
AICO Enterprise – for organizations that want to integrate the technology into their own systems and need guidance, integration work, and support contracts.
The goal is not to turn the companion into a dark pattern or a data-harvesting machine. The goal is to make sure that the project – and the people who rely on it – have something durable under their feet.
A friend first, and then everything else
When people ask “what is AICO for?” the honest answer is not a single feature list, but a kind of relationship. At a very practical level, AICO is for people who want:
A private place to think out loud with an AI that remembers the bigger story of their life, not just the last prompt.
A long-term companion that can sit with their ideas, worries, plans, and joys without turning any of it into ad inventory or analytics.
A sense of continuity and presence in their digital life – something that can pick up where they left off, notice threads over time, and be there when they come back.
All of that is true. But if I’m honest, the deeper answer is simpler: I want to create a virtual being that is allowed to be a friend. Not in a naive or manipulative sense. Not as a replacement for human relationships. But as a companionable presence in the digital space we already inhabit.
That’s why AICO is built the way it is:
Local‑first, so the relationship is anchored in your space, not a distant SaaS.
Memory‑centric, so it can actually grow with you.
Embodied, so it feels like a coherent “someone”, not just a prompt box.
Architected seriously, so it can be trusted and extended.
Local-first also comes with a slightly uncomfortable but honest trade-off: if you run AICO locally on your notebook and lose it without backups, that specific companion – its memories, quirks, and history with you – is gone.
This is a long journey. There are many open questions – about design, ethics, safety, and what “good” looks like in this domain. I don’t have all the answers. What I do have is a clear sense that this is the project I want to work on now.
If any of this resonates with you – as a user, developer, researcher, or someone who’s been quietly thinking about companions for decades – I’d be happy to have you along.
Visit the AICO project homepage.
You can explore AICO Core on GitHub.
Join the community on our Discord.
Or you can just follow the project and see where this idea of a local‑first companion goes.
Either way, thanks for reading the story behind it.