Save the traces!
Why?
I have multiple personal and work accounts across many AI services and tools. ChatGPT knows about my NAS and home office setup. Claude helps me on various projects across 5 different computers. I have explored half-done side projects and research ideas with Cursor, and some implementations with Codex. I have asked Pi, Hugging Chat and ChatGPT about running 🏃‍♂️.
Two years ago these were one-off, disposable questions. I realize they are now a growing corpus of scattered, fragmented knowledge about topics I care about. I have already found myself remembering that I once bounced ideas off Cursor, only to realize it happened on an iMac I had already decommissioned 🤦‍♂️.
No matter how advanced your "memory" solutions are, conversation traces are the new "file" abstraction. I want to preserve them all. I'm not fully sure how I will end up using them (some ideas later), but I do know I don't want to lose them anymore.
How?
I started with something really simple:
$ for f in .claude .codex .cursor .pi; do uvx hf sync ~/${f} hf://buckets/pcuenq/traces/$(hostname -s)/${f}; done
This syncs traces and additional metadata to a private Hugging Face bucket. Ask your agent to set it up as a crontab for you, and to make sure it includes the services you use. I haven't exported my online conversations yet; that will be the next step 💪.
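If you'd rather write the crontab entry yourself, a sketch along these lines would do it. The 02:30 schedule and the log path are my own illustrative choices, and the bucket path reuses the example from the command above; adjust both to your setup:

```shell
# m h dom mon dow   command
# Sync agent trace directories nightly at 02:30 (schedule, log path and
# bucket name are examples; pick the dot-directories your tools actually use)
30 2 * * * for f in .claude .codex .cursor .pi; do uvx hf sync "$HOME/$f" "hf://buckets/pcuenq/traces/$(hostname -s)/$f"; done >> "$HOME/traces-sync.log" 2>&1
```

Cron runs this through `/bin/sh`, so the same `for` loop works unchanged; the `>> … 2>&1` redirection keeps a local log you can check if a sync fails.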
If you want fancier stuff, you can explore other ideas:
- Export to a dataset instead. This allows you to run queries against the data, and to see cool visualizations like this one (not one of mine, this is @mishig's). I'm pretty sure browsing, replaying and other visualizations will come to buckets at some point too.
- Use `hf mount` so that agents write to the bucket directly. Periodic sync is enough for me for now.
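Once everything lives in one place, even a small script can query the corpus before you bother with a full dataset export. A sketch, assuming (formats vary a lot per tool) that traces are JSONL files whose lines are JSON objects with `role` and `content` fields; the function names and field names here are mine, not any tool's:

```python
import json
from collections import Counter
from pathlib import Path

def iter_messages(root: Path):
    """Yield (path, role, content) from JSONL trace files under root.
    Assumes each line may be a JSON object carrying 'role' and 'content'
    keys (field names differ per tool); anything unparseable is skipped."""
    for path in sorted(root.rglob("*.jsonl")):
        for line in path.read_text(errors="ignore").splitlines():
            try:
                obj = json.loads(line)
            except json.JSONDecodeError:
                continue
            if isinstance(obj, dict):
                yield path, obj.get("role"), obj.get("content")

def role_counts(root: Path) -> Counter:
    """Count how many messages each role contributed across all traces."""
    return Counter(role for _, role, _ in iter_messages(root) if role)
```

From rows like these, pushing to an actual Hugging Face dataset (e.g. via `datasets.Dataset.from_list`) is a short step.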
What for?
- Avoid vendor lock-in.
- Continue a project on a different computer.
- Ask your agent to summarize or consolidate various conversations.
- Extract conclusions from repeated tasks and create a Skill.
- Donate your traces so the community can train better open models.
- Reminisce with your grandchildren and their robot friends about how you steered LLMs to your liking in the good old days 👵👴🤖