Event Sourcing as a Creative Tool for Developers
Event sourcing is an architecture where you capture every change in your system as an immutable event, rather than just storing the latest state. Instead of only knowing what your data looks like now, you keep a full history of how it got there. In a simple CRUD app, that means every created, updated, and deleted entry is stored in your event source, so that when you replay your events you can recreate the state the application was in at any given time.
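To make that concrete, here is a minimal in-memory sketch: the event kinds (`created`/`updated`/`deleted`) and the item shape are illustrative assumptions, not from the article, but folding over the log to rebuild state is exactly the replay idea described above.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    kind: str                              # "created" | "updated" | "deleted"
    item_id: str
    data: dict = field(default_factory=dict)

def replay(events: list[Event]) -> dict[str, dict]:
    """Rebuild current state by folding over the full event history."""
    state: dict[str, dict] = {}
    for e in events:
        if e.kind == "created":
            state[e.item_id] = dict(e.data)
        elif e.kind == "updated":
            state[e.item_id].update(e.data)
        elif e.kind == "deleted":
            state.pop(e.item_id, None)
    return state

events = [
    Event("created", "42", {"title": "Draft"}),
    Event("updated", "42", {"title": "Final"}),
    Event("created", "43", {"title": "Other"}),
    Event("deleted", "43"),
]

print(replay(events))       # full history -> current state
print(replay(events[:1]))   # truncated log -> state at an earlier point in time
```

Replaying a prefix of the log is what gives you the state "at any given time": the store is append-only, so time travel is just a slice.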
Most developers see event sourcing as a kind of technical safety net:
- Recovering from failures
- Rebuilding corrupted read models
- Auditability
- Surviving schema changes without too much pain
And fair enough: replaying your event stream often feels like a stressful situation. Something broke, you need to fix it, and you're crossing your fingers hoping everything rebuilds cleanly.
What if replaying your event history wasn't just for emergencies? What if it were a normal, everyday part of building your system?
Instead of treating replay as a recovery mechanism, you treat it as a development tool: something you use to evolve your data models, improve your logic, and shape new views of your data over time. More excitingly, it means you can derive entirely new schemas from your event history whenever your needs change.
Your database stops being the single source of truth and becomes what it was always meant to be: a fast, convenient cache for your data, not the place where all your logic and assumptions are locked in.
With a full event history, you're free to experiment with new read models, adapt your data structures without fear, and shape your data to fit new purposes exactly, such as enriching fields, backfilling values, or building dedicated models for AI consumption. Replay becomes not about fixing what broke, but about continuously improving what you've built.
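A sketch of what "deriving a new read model" looks like in practice: two different projections over the same event history. The order events and both projections are illustrative assumptions; the point is that the second model is added later with no migration, just another replay.

```python
# One shared event history (illustrative event shapes).
events = [
    {"kind": "order_placed",   "order_id": 1, "amount": 30},
    {"kind": "order_placed",   "order_id": 2, "amount": 50},
    {"kind": "order_refunded", "order_id": 1, "amount": 30},
]

def project_balances(events):
    """Read model #1: net revenue per order."""
    balances = {}
    for e in events:
        sign = 1 if e["kind"] == "order_placed" else -1
        balances[e["order_id"]] = balances.get(e["order_id"], 0) + sign * e["amount"]
    return balances

def project_refund_rate(events):
    """Read model #2, added later: refunded orders / placed orders.
    No schema migration needed; it is computed by replaying the same log."""
    placed = sum(1 for e in events if e["kind"] == "order_placed")
    refunded = sum(1 for e in events if e["kind"] == "order_refunded")
    return refunded / placed if placed else 0.0

print(project_balances(events))     # {1: 0, 2: 50}
print(project_refund_rate(events))  # 0.5
```

Each projection is just a fold over the log, so adding, fixing, or discarding a read model never touches the source of truth.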
And this has big implications, especially when it comes to AI and MCP servers.
Most application databases aren't built for natural-language querying or AI-powered insights. Their schemas are designed for transactions, not for understanding. Data is spread across normalized tables, with relationships and assumptions baked deeply into the structure.
But when you treat your event history as the source of truth, you can replay your events into purpose-built read models, structured specifically for AI consumption.
Need flat, denormalized tables for efficient semantic search? Done.
Want a user-centric view with pre-joined context for better prompts? Easy.
You're no longer limited by your application's schema: you shape your data to fit exactly how your AI needs to consume it.
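As a sketch of that user-centric view, here is a projection that replays events into one flat, prompt-friendly row per user, with support context pre-joined inline. The event names and fields are illustrative assumptions.

```python
# Illustrative event history mixing two entity types.
events = [
    {"kind": "user_registered", "user": "ada", "plan": "pro"},
    {"kind": "ticket_opened",   "user": "ada", "text": "Login fails on mobile"},
    {"kind": "ticket_opened",   "user": "ada", "text": "Export button greyed out"},
    {"kind": "user_registered", "user": "bob", "plan": "free"},
]

def project_ai_view(events):
    """One denormalized row per user: plan plus all support context inline,
    so no joins are needed at prompt-building or embedding time."""
    view = {}
    for e in events:
        if e["kind"] == "user_registered":
            view[e["user"]] = {"plan": e["plan"], "tickets": []}
        elif e["kind"] == "ticket_opened":
            view[e["user"]]["tickets"].append(e["text"])
    return {
        u: f"plan={row['plan']}; tickets: {'; '.join(row['tickets']) or 'none'}"
        for u, row in view.items()
    }

for user, context in project_ai_view(events).items():
    print(user, "->", context)
```

The same log could just as easily feed a ticket-centric table for semantic search; the projection, not the schema, decides the shape.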
And here's where it gets really interesting: AI itself can help you explore your data history and discover what's valuable.
Instead of guessing which fields to include, you can use AI to interrogate your raw events, spot gaps, surface patterns, and guide you in designing smarter read models. It's a feedback loop: your AI doesn't just query your data; it helps you shape it.
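A tiny sketch of "interrogating raw events to spot gaps": here a plain field-coverage scan stands in for the AI-driven exploration the article imagines, and the event shapes are illustrative assumptions.

```python
from collections import Counter

# Illustrative raw events; one is missing the "referrer" field.
events = [
    {"kind": "signup", "email": "a@x.io", "referrer": "ad"},
    {"kind": "signup", "email": "b@x.io"},
    {"kind": "signup", "email": "c@x.io", "referrer": "blog"},
]

def field_coverage(events):
    """Fraction of events carrying each field; low coverage flags a gap
    the read model must backfill or handle explicitly."""
    counts = Counter(k for e in events for k in e)
    return {k: v / len(events) for k, v in counts.items()}

print(field_coverage(events))
# "referrer" shows up in only 2 of 3 events: a gap worth surfacing
# before designing the read model around it.
```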
So instead of forcing your AI to wrestle with your transactional tables, you give it clean, dedicated models optimized for discovery, reasoning, and insight.
And the best part? You can keep iterating. As your AI use cases evolve, you simply adjust your flows and replay your events to reshape your models: no migrations, no backfills, no re-engineering.