Comparison
DearTech-OS vs LLM Wiki
Andrej Karpathy's LLM Wiki is an open methodology: treat personal and team knowledge as an AI-readable wiki. DearTech-OS shares the same file-first, markdown-canonical principles and adds a typed knowledge graph, an MCP server, and a team layer on top, so a founder-operator team can adopt the methodology in weeks instead of building it from scratch.
Side by side
Where they differ
| Criterion | LLM Wiki | DearTech-OS |
|---|---|---|
| Approach | Open methodology, DIY | Maintained system built on the same principles |
| File format | Markdown files (your choice) | Markdown files (canonical, typed frontmatter) |
| Graph structure | Implicit (wiki-style links between files) | Explicit typed graph: typed nodes, status, confidence, relationships |
| Graph search & traversal | Full-text search across markdown, no typed traversal | AI searches typed nodes, traverses relationships, audits sources out of the box |
| Query interface | Whatever you build | MCP server out of the box, plus CLI and search |
| AI tool integration | DIY | Claude, ChatGPT, Cursor, Codex via MCP |
| Sharing model | Personal or single-team wiki | Multi-team, role-based access (viewer, maintainer, admin) |
| Time to team adoption | Weeks to months of building | 4-8 weeks of guided implementation |
| Best for | Solo developers and small teams who want full control and custom workflows | Founder-operator companies who want a maintained Context OS without building it themselves |
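To make "typed frontmatter" concrete: in a file-first setup like this, a node's type, status, and confidence live in a YAML frontmatter block at the top of an ordinary markdown file, while the body stays plain prose. This is an illustrative sketch, not the actual DearTech-OS schema; the field names and values here are hypothetical.

```markdown
---
type: pattern
status: active
confidence: high
related:
  - concept/context-os
---

# Weekly crystallization

The note body remains ordinary markdown, readable with or without the graph layer.
```

Because the typing lives in the file itself, the graph can always be rebuilt from disk, which is what keeps the markdown canonical.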
Where LLM Wiki excels
- Free, open methodology with no vendor
- Full control over schema, structure, and tooling
- Aligned with the developer mindset of file-first knowledge
- Strong starting point if you have time and engineering capacity
Where DearTech-OS excels
- Built-in typed knowledge graph (concepts, patterns, people, status, confidence)
- Built-in MCP server. Claude, ChatGPT, Cursor query the graph natively.
- Built-in role-based access (viewer, maintainer, admin) via SSO
- Audit, healing, and crystallization tools maintained for you
- Faster time to team adoption, measured in weeks not months
- Same file-first principles. Your markdown stays canonical and portable.
Decision
When to choose which
Choose LLM Wiki when
- You are a solo developer or small technical team
- You want full control over schema and tooling
- You have engineering time to build and maintain it yourself
- Role-based access and team scaling are not pressing concerns yet
Choose DearTech-OS when
- You are a founder-operator with a team that needs shared context
- You want to skip the build-it-yourself phase and adopt a maintained system
- You need role-based access for non-technical teammates
- You want a typed graph and MCP server out of the box, working with Claude and ChatGPT from day one

Use them together
DearTech-OS is the same methodology, shipped
DearTech-OS doesn't replace the LLM Wiki philosophy; it operationalizes it. Your markdown stays canonical. Your file-first principles stay intact. The graph, MCP server, and team layer sit on top, so the same methodology you would have built yourself works across Claude, ChatGPT, Cursor, and any MCP-compatible AI tool from week one.
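For a sense of what "MCP-compatible" means in practice: clients such as Claude Desktop register MCP servers in a JSON config, after which the client can call the server's tools natively. The shape below follows the standard `mcpServers` config format; the command name `deartech-mcp` and the `--vault` flag are hypothetical placeholders, not a documented DearTech-OS CLI.

```json
{
  "mcpServers": {
    "deartech-os": {
      "command": "deartech-mcp",
      "args": ["--vault", "~/knowledge"]
    }
  }
}
```

The same registration pattern applies to Cursor and other MCP clients, which is why one server covers all of them.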
FAQ
Common questions
Can I migrate from an LLM Wiki setup to DearTech-OS?
Yes. Both are markdown-first and file-canonical, so most LLM Wiki setups migrate cleanly. The DearTech-OS audit identifies what already maps to typed graph nodes and what needs structure added.
Why pay for a maintained system if the methodology is open?
Same reason teams pay for any maintained software: faster time to value, role-based access without DIY auth, an MCP server that already works with Claude and ChatGPT, and a typed graph schema that has been refined across multiple deployments.
Does DearTech-OS lock me in?
No. Your markdown files are canonical and portable. The graph is parsed from those files. If you ever stop using DearTech-OS, your knowledge stays exactly where it is: on your disk, in markdown, fully portable.
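The "graph is parsed from those files" claim can be sketched in a few lines: read a markdown file, pull the frontmatter block, and treat the flat key/value pairs as a typed node. This is a minimal illustration of the idea, assuming simple `key: value` frontmatter; it is not DearTech-OS code and does not handle full YAML.

```python
import re

# Frontmatter is the block between the opening and closing `---` fences.
FRONTMATTER = re.compile(r"\A---\n(.*?)\n---\n", re.DOTALL)

def parse_node(markdown: str) -> dict:
    """Parse a typed graph node from a markdown file's frontmatter.

    Only flat `key: value` pairs are handled -- a sketch, not full YAML.
    """
    match = FRONTMATTER.match(markdown)
    if not match:
        return {}
    node = {}
    for line in match.group(1).splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        node[key.strip()] = value.strip()
    return node

doc = """---
type: concept
status: active
confidence: high
---

# Retrieval-augmented generation
"""
node = parse_node(doc)
```

Because the node is derived from the file rather than the other way around, deleting the tooling leaves the knowledge itself untouched on disk.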
Is DearTech-OS open source?
DearTech-OS is shipped, licensable software. The methodology is open and we publish about it. The implementation, MCP server, role system, and tooling are the productized layer.
Want the same principles, shipped?
In a 30-minute call, we'll map where your company knowledge is scattered today and identify the first DearTech-OS layer worth building.
Let's chat