Show HN: Meru OS – The First Sovereign AI Stack (<2MB, CPU-native)
Hi HN, I've spent the last few weeks building an experimental AI Operating System called Meru OS, based on the idea that intelligence should be verifiable, not probabilistic.

Most LLMs are "black boxes". I wanted to build a "glass box" where every output traces back to a certified source.

The Architecture:

The Kernel: Instead of vectors, state is represented as integers. We use the Fundamental Theorem of Arithmetic (unique prime factorization) to encode concepts.
Example: If Time = 3 and Space = 5, then Spacetime = 3 × 5 = 15. Because prime factorization is unique, a composite state can always be decomposed unambiguously into its parts, which allows reversible "time-travel debugging" by just dividing integers.
The Hypervisor (Bija): It doesn't just run cycles; it runs "Frequency Modulations".
The kernel itself is coded in specific Prime Frequencies (Hz).
Execution is effectively a "Resonance" state between the Instruction and the Data, rather than a fetch-decode-execute pipeline.
The Data (< 2MB): We compressed the entire Constitution of India, the Indian Penal Code (IPC), and an Indo-European etymology dictionary into a <2MB bundle using a custom schema-based compression scheme (Pingala).
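The post doesn't specify Pingala's actual format, so here is only a generic sketch of what "schema-based compression" typically means: factor the repeated field names out into a shared schema, then entropy-code the remainder. The sample records are invented for illustration:

```python
import json
import zlib

# Toy records standing in for structured legal data (invented sample).
records = [
    {"article": 14, "title": "Equality before law", "part": "III"},
    {"article": 19, "title": "Freedom of speech", "part": "III"},
]

# Schema-based step: store the keys once, keep only value rows per record.
schema = sorted({k for r in records for k in r})
rows = [[r[k] for k in schema] for r in records]

# Generic entropy-coding step on the deduplicated form.
blob = zlib.compress(json.dumps({"schema": schema, "rows": rows}).encode())

# Decompression reverses both steps losslessly.
doc = json.loads(zlib.decompress(blob))
restored = [dict(zip(doc["schema"], row)) for row in doc["rows"]]
assert restored == records
```

For thousands of uniform records, stripping the repeated keys before compressing is what lets a large corpus fit in a small bundle.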
Why?

Sovereignty: The logic and data are locally owned. No API calls.
Green AI: It runs on my MacBook's CPU with negligible heat/power.
Vedic Logic: It implements Panini's Grammar rules (Ashtadhyayi) as a graph traversal algorithm rather than just statistical attention.
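To make the "grammar rules as graph traversal" claim concrete, here is a hedged sketch: nodes are intermediate word forms, edges are rewrite rules, and a derivation is a path found by BFS rather than a statistical guess. The rules below are illustrative stand-ins, not actual Ashtadhyayi sutras:

```python
from collections import deque

# Each rule rewrites one form into another: (source, target, rule_name).
# These are placeholder rules, not real Paninian sutras.
RULES = [
    ("root+suffix", "stem", "r1"),
    ("stem", "stem+ending", "r2"),
    ("stem+ending", "word", "r3"),
]

def derive(start: str, goal: str):
    """BFS over rule applications; returns the rule path, or None."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        form, path = queue.popleft()
        if form == goal:
            return path
        for src, dst, rule in RULES:
            if src == form and dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [rule]))
    return None

assert derive("root+suffix", "word") == ["r1", "r2", "r3"]
```

The appeal of this framing is auditability: the returned path is itself the proof of how the output was derived, which is exactly the "glass box" property.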
It's definitely experimental, but it questions the "Scale Is All You Need" dogma. Would love feedback on the reversible prime state machine logic.

Links:

Code: https://github.com/akulasairohit/meru-os
Live Demo: https://huggingface.co/spaces/akulasairohit/panini-demo
The Manifesto: https://www.linkedin.com/pulse/introducing-meru-os-worlds-first-sovereign-ai-stack-akula-pf68e

Thanks,
Rohit