If you’re a heavy knowledge worker, you’ve probably been through these phases:
Phase one: you clipped hundreds of articles into Evernote, built an elaborate folder hierarchy, and felt like you had everything under control.
Phase two: you realized your Evernote had thousands of notes you almost never revisited. So you switched to Notion or Obsidian, started building bidirectional links, and watched your knowledge graph grow with excitement.
Phase three: you discovered a wave of “AI knowledge base” tools, hoping AI would finally help you put your knowledge to use — only to find the AI answers still disappointing.
Each phase brought efficiency gains from a new technology, and each eventually hit the wall of that technology's limitations.
Every generation of PKM tools solved the core pain of the previous generation — while accumulating new limitations of its own. Understanding this evolutionary arc is the key to knowing what knowledge management in the AI era truly needs.
Era 1.0: Tree Storage and Passive Management (2000s–2010s)
The defining tools of this era were Evernote, OneNote, and WizNote. They addressed a genuine problem of the early internet age: information preservation.
The core logic was essentially a “filing cabinet” — you saw valuable content, stored it, categorized it into folders, and added tags. The system’s philosophy: put information in the right place, and you’ll find it when you need it. The spirit is similar to knowledge management frameworks like PARA.
This system worked reasonably well at small scale. But it contained a fatal assumption: you can predict how a piece of information will be used in the future.
Reality rarely cooperates. An article about “AI applications in education” — does it go in the “Technology” folder or the “Education” folder? Whichever you choose, the next time you need it in a different context, you might not find it. Knowledge is inherently cross-domain, but tree structures force you to put it in a single box.
As your note count grows, the cognitive cost of maintaining a classification system rises exponentially. Around a few thousand notes, things usually start spiraling out of control. Users develop classification anxiety — every new note triggers a decision about where it goes and what tags it needs. Ultimately, most users fall into the same trap: collect but never use. Notes become a digital warehouse you check into but never check out of.
Era 1.0 preserved information, but its utilization efficiency was extremely low.
Era 2.0: Networked Connections and Active Linking (2019–Present)
In 2019, Roam Research launched with a disruptive concept: bidirectional links.
Obsidian followed in 2020, quickly becoming the new benchmark in the knowledge management community with its local-file-first approach and open plugin ecosystem. Notion, Feishu Docs, and others soon adopted similar features. Other key tools of this era include Logseq and Heptabase.
The philosophical foundation of Era 2.0 comes from German sociologist Niklas Luhmann’s Zettelkasten (slip-box) method. Luhmann built a knowledge network spanning decades using 90,000 handwritten index cards, producing a prolific body of work. His core methodology: don’t preset categories — let notes form natural associations.
Roam and Obsidian digitized this approach. You create links between notes using [[keyword]] syntax, and a knowledge graph visualizes automatically, revealing hidden connections you never noticed. A note on “deep work” can simultaneously link to “flow theory” and “Pomodoro technique” — no need to pick a side.
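Mechanically, this linking model is simple: scan each note for `[[...]]` markers and record an edge for every one. The sketch below is a minimal illustration in Python, using a simplified regex; real tools also resolve aliases (`[[target|label]]`), heading links (`[[target#section]]`), and unlinked mentions.

```python
import re
from collections import defaultdict

# Capture the link target, stopping at "]", "|" (alias), or "#" (heading link)
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def build_link_graph(notes: dict[str, str]) -> dict[str, set[str]]:
    """Map each note title to the set of notes it links to via [[...]] syntax."""
    graph = defaultdict(set)
    for title, body in notes.items():
        for match in WIKILINK.finditer(body):
            graph[title].add(match.group(1).strip())
    return dict(graph)

notes = {
    "deep work": "Related: [[flow theory]] and the [[Pomodoro technique]].",
    "flow theory": "See [[deep work]] for applied examples.",
}
graph = build_link_graph(notes)
# "deep work" now points at both topics without being filed under either one
```

Note that nothing here assigns a note to a single folder: a note participates in as many link relationships as its text mentions, which is exactly what the tree model of Era 1.0 could not express.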
This was a genuine paradigm shift: from “archiving” to “thinking,” from “finding information” to “discovering connections.” Era 2.0 tools were the first to make note-taking software participate in the creative process itself, not just serve as an information warehouse.
But Era 2.0 had its own limitations:
First, bidirectional links require manual maintenance. Every link requires a conscious act on your part. As your note count grows, you need to remember “what have I written about this before” in order to make meaningful connections — which actually demands more from your memory, not less.
Second, the knowledge graph becomes increasingly complex. Once there are enough nodes and connections, the visualization shifts from “helping you discover relationships” to “an incomprehensible tangle.” Many users find their graph unreadably dense once they cross a few thousand notes.
Third, the tooling itself becomes a burden. Obsidian has hundreds of plugins. Users spend enormous amounts of time configuring workflows, selecting plugins, and designing templates — the maintenance of the “note system” starts eating into the time meant for actual thinking and writing.
Era 2.0 “connected” knowledge, but as volume and complexity grew, new efficiency bottlenecks emerged.
Era 3.0: AI-Driven Intelligent Collaboration (2023–Present)
The explosion of large language models showed everyone a tantalizing possibility: let AI help you manage and use your knowledge.
So the past two years have seen a surge of “AI PKM” tools: Notion AI, Obsidian AI plugins, various AI note apps. Most products share a similar approach — upload your notes to the cloud, run RAG (Retrieval-Augmented Generation), and you can “chat with your notes.”
The direction is right. But there’s a widely overlooked problem:
These tools’ AI capabilities are confined within the application.
Notion AI only understands what’s inside Notion. Obsidian’s AI plugin only searches the current Vault. All the knowledge you’ve accumulated elsewhere — hundreds of PDF ebooks, historical notes exported from old platforms, research reports built up at work, academic papers downloaded locally — AI still can’t reach it.
You’d have to collect, migrate, copy, and rewrite content into these new products first. Anyone who’s gone through a note migration knows how much work that is.
This creates a paradox: when you want AI to work on “your complete knowledge base,” you first need to migrate everything into one tool. But no single tool can serve all your use cases.
The reality: most people’s knowledge remains fragmented, scattered across multiple tools, with AI only able to see one corner of it.
The Essence of Three Generations of Evolution
Looking back across these three shifts, the core question is which layer of the problem each generation addressed:
| Era | Core Problem | Core Answer |
|---|---|---|
| 1.0 | Information preservation | Don’t lose it |
| 2.0 | Knowledge connection | Be able to find it |
| 3.0 | Wisdom creation | Be able to apply it |
Each evolution brought enormous benefits. 1.0 solved the problem of information scattered and unfindable. 2.0 created meaningful connections between knowledge. 3.0’s goal is to let AI truly participate — helping you convert knowledge into insights and action.
But for 3.0 to truly deliver “being able to apply,” there’s a prerequisite: AI needs to see all your knowledge, not just what’s in one particular tool.
Where Linkly AI Fits: The Unified Retrieval Layer for Era 3.0
Linkly AI is not another PKM tool. It’s not here to replace Obsidian or Notion.
It addresses the structural problem described above: how to let AI work across all tool boundaries, on your complete knowledge base.
Think of it as a layered collaboration architecture:
- Obsidian is your writing and deep-thinking tool — essentially a local Markdown folder, inherently local-first.
- Notion is your collaboration and project management tool — exportable as Markdown or HTML.
- Roam Research is your slip-box note tool — supports exporting JSON and Markdown.
- Linkly AI is the unified interface that lets AI read and understand all of this data.
This is layered collaboration, not a replacement relationship. You keep using your preferred tools for writing and thinking. Linkly AI builds an index in the background, so your Claude, Cursor, or ChatGPT can cross all tool boundaries to retrieve and use the knowledge you’ve accumulated.
Many users already prefer general AI entry points like Claude and ChatGPT over in-app AI like Notion AI, and few want to pay a second subscription for a more limited AI interface.
Linkly AI’s technical core is Outlines Index technology: building an “AI profile card” for each document — containing metadata and a structured outline — so AI can progressively explore your documents like a researcher browsing a filing cabinet, rather than having documents chopped into fragments and fed to vector search.
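To make the “profile card” idea concrete, here is a hedged sketch of what such a card could contain: document metadata plus a structured outline of headings, rather than embedding-sized fragments. This is an illustration of the concept only; the field names and extraction logic are assumptions, not Linkly AI’s actual implementation.

```python
import re

def profile_card(path: str, text: str) -> dict:
    """Build a compact 'profile card' for a Markdown document: metadata
    plus a structured outline of its headings. An AI assistant can scan
    these cards first, then drill into only the documents that look
    relevant, instead of searching over pre-chopped text chunks."""
    headings = [
        {"level": len(m.group(1)), "title": m.group(2).strip()}
        for m in re.finditer(r"^(#{1,6})\s+(.+)$", text, re.MULTILINE)
    ]
    return {
        "path": path,
        "title": headings[0]["title"] if headings else path,
        "word_count": len(text.split()),
        "outline": headings,
    }

doc = "# Deep Work\n\n## Why it matters\n...\n\n## Techniques\n..."
card = profile_card("notes/deep-work.md", doc)  # hypothetical path
```

The design trade-off: an outline preserves the document’s own structure, so the AI reads it the way a researcher skims a table of contents, at the cost of an extra retrieval round-trip compared with one-shot vector search.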
In Practice: Building a Unified AI Knowledge Base
Enough theory. Here’s how to actually do it.
Step 1: Integrate Your Obsidian Vault
An Obsidian vault is just a folder of local Markdown files on your computer. Open Linkly AI and add your vault directory.
That’s it. All the notes, book summaries, journals, and research excerpts you’ve accumulated in Obsidian over the years instantly become a private knowledge base that Claude can actively search.
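Because a vault is plain files, enumerating its contents needs nothing proprietary. The sketch below shows what the first step of any vault indexer amounts to; the `.obsidian` exclusion and the example path are assumptions for illustration, not Linkly AI’s code.

```python
from pathlib import Path

def find_markdown_notes(vault: Path) -> list[Path]:
    """Enumerate every Markdown note in an Obsidian vault, skipping
    Obsidian's own config directory (.obsidian)."""
    return sorted(
        p for p in vault.rglob("*.md")
        if ".obsidian" not in p.parts
    )

# notes = find_markdown_notes(Path.home() / "Obsidian Vault")  # hypothetical path
```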
Step 2: Export Your Notion Data
In Notion, select the workspace or pages you want AI to access, export as Markdown + CSV format, and place them in a local folder. Add that folder to Linkly AI.
Step 3: PDFs, Papers, and Other Documents
Organize your research PDFs, academic papers, ebooks, and work reports into one or a few folders, then add them to Linkly AI.
Once configured, the knowledge you’ve accumulated across different tools over the years — notes, papers, documents — all becomes a unified AI-searchable knowledge base. New files added later are automatically indexed in the background, with no action required from you.
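For intuition about how background indexing can stay automatic, one common approach is an incremental pass that compares file modification times against the previous run, so only new or edited documents get re-indexed. This is a generic sketch of that pattern, not a description of Linkly AI’s internals; production indexers typically use filesystem event notifications instead of polling.

```python
import os

def new_or_changed(root: str, last_run: float) -> list[str]:
    """Return paths of files modified since the previous indexing pass
    (last_run is a Unix timestamp), so only those need re-indexing."""
    changed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_run:
                changed.append(path)
    return changed
```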
A Real Conversation Scenario
With setup complete, you can ask Claude things like:
“What have I written in Obsidian over the past few years about deep work? Can you pull together the key ideas?”
Claude will call Linkly AI’s search tools, find all relevant notes, and synthesize your own thinking into a coherent summary. The content is your own years of accumulated thought — not a generic answer AI pulled from the internet.
Or:
“Help me review the research papers I’ve collected on Transformer architectures and find the common conclusions.”
Claude will search your local PDF library, compare multiple papers, and produce a synthesized analysis.
This capability matters because AI is working on your context, not generic internet knowledge. The more you’ve accumulated, the more valuable AI can be.
On Privacy: Your Knowledge Doesn’t Need to Leave Your Computer
Many people worry about privacy when using AI to process personal knowledge.
All of Linkly AI’s processing — text extraction, index building, search matching — happens locally. Your files are never uploaded to any server. No cloud sync. No telemetry collection. Index files are stored locally, and original documents never leave your computer.
We cover this in more detail in Why Local Runs Better.
No Silver Bullet — But This Is the Most Pragmatic Path
Honestly: Linkly AI can’t solve everything.
If your Obsidian notes are low quality and logically scattered, AI won’t be able to produce good analysis from them either. No tool can substitute for your own thinking and accumulation.
But it solves a real infrastructure problem: knowledge you’ve built up over years, scattered across multiple tools and formats, permanently out of AI’s reach. Configure Linkly AI once, and that barrier disappears.
We think this is the most pragmatic starting point for PKM 3.0: no data migration, no tool switching, no reorganizing your file structure. Keep your existing workflow, and let AI work on the knowledge base you already have.
If you want to get started quickly, check out the Quickstart guide — configuration takes about 10 minutes.
The ultimate goal of PKM has never been “storage” — it’s always been “an extension of thinking.”
Every generation of tooling has shortened the distance between “information” and “insight.” This generation’s breakthrough isn’t about which tool is smarter. It’s about whether your AI assistant can cross all tool boundaries and work on your real, complete knowledge base.
Give AI a way in. Let it truly read you.
Learn more about Linkly AI, or head to linkly.ai to download and try it.
