This page answers some of the questions Linkly AI users ask most often.

Documentation Index
Fetch the complete documentation index at: https://linkly.ai/docs/llms.txt
Use this file to discover all available pages before exploring further.
How does Linkly AI protect user data privacy?
Linkly AI is built on a local-first architecture. Your documents, full-text index, vector index, and the embedding model itself all run on your device — nothing is uploaded to any server by default.

What stays on your device
- Document originals: stay in their original folders. Linkly AI only reads them — it does not copy or relocate them.
- Full-text index (BM25): built locally with Tantivy.
- Vector index: stored in a local database.
- Embedding model: Qwen3-Embedding-0.6B (GGUF quantized), running locally via llama.cpp. Apple Silicon Macs automatically use Metal GPU acceleration.
- App logs: written to local files only — never auto-uploaded.
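To make the "local vector index" idea concrete, here is a minimal sketch of how nearest-neighbor lookup over locally stored embeddings works in principle. This is an illustrative toy, not Linkly AI's actual implementation: the document IDs, the tiny 3-dimensional vectors, and the `search` helper are all invented for the example.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Tiny in-memory stand-in for a local vector index: doc id -> embedding.
index = {
    "notes.md":  [0.9, 0.1, 0.0],
    "plan.docx": [0.1, 0.8, 0.3],
}

def search(query_vec, top_k=1):
    # Rank documents by similarity to the query embedding, highest first.
    scored = sorted(index.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:top_k]]

print(search([1.0, 0.0, 0.0]))  # notes.md is the closest match
```

Because both the embeddings and this lookup live on disk and in memory locally, no document content needs to leave the machine to answer a semantic query.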
Where chat data goes
When the chatbot calls a large language model, where the request goes depends on the provider you choose:

- Local model: Ollama, LM Studio, or any other OpenAI-compatible local service. Data stays entirely on your machine.
- Linkly Official: forwarded to a third-party model provider via api.linkly.ai. Requests pass through Linkly’s servers.
- Third-party direct: connects directly to OpenAI, Anthropic, etc. Requests do not pass through Linkly’s servers.
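The routing difference between the three options comes down to which base URL the chat request is sent to. The sketch below illustrates that, assuming OpenAI-compatible endpoints; the local port (Ollama's default) and the exact paths are assumptions for illustration, not documented Linkly AI behavior.

```python
# Hypothetical mapping from provider choice to request destination.
PROVIDERS = {
    "local":    "http://localhost:11434/v1",   # e.g. Ollama's OpenAI-compatible API; never leaves the machine
    "official": "https://api.linkly.ai/v1",    # routed through Linkly's servers
    "direct":   "https://api.openai.com/v1",   # straight to the third-party vendor
}

def endpoint(provider: str, path: str = "/chat/completions") -> str:
    """Return the full URL a chat request would be sent to."""
    return PROVIDERS[provider] + path

print(endpoint("local"))  # http://localhost:11434/v1/chat/completions
```

The privacy implication follows directly: only the "official" route ever puts Linkly's own servers in the request path.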
User Experience Improvement Program (telemetry)
To help us understand which features are used and how the app is running, Linkly AI sends an anonymous usage report by default:

| What is reported | What is NOT reported |
|---|---|
| Feature usage counters (aggregated by action) | Document content, file names, paths |
| App version, OS, architecture | Chat content, search queries |
| A locally generated random device ID | API keys, custom URLs |
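The two key properties in the table — a random (not hardware-derived) device ID and counters aggregated by action name rather than content — can be sketched like this. The action names and report shape are invented for illustration; they are not Linkly AI's actual telemetry schema.

```python
import uuid
from collections import Counter

# A locally generated random device ID: purely random, ties to no
# hardware identifier, account, or personal data.
device_id = str(uuid.uuid4())

# Usage is counted per action name only; the documents, queries, and
# file paths involved are never part of the report.
usage = Counter()
usage["search.run"] += 1
usage["chat.send"] += 2

report = {"device_id": device_id, "usage": dict(usage)}
```

Because the counters carry only action names and totals, the report reveals that search ran once and chat sent two messages, but nothing about what was searched or said.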
Privacy commitments
- No third-party analytics SDKs (Google Analytics, etc.).
- We do not read your browser history or clipboard.
- No mandatory account login — core features work offline.
- App logs stay on your machine. They are sent to us only if you explicitly share them.
How long does Linkly AI take to finish indexing?
Indexing time depends on the number of files, file types, machine performance, and indexing mode. Linkly AI indexes in three stages:

Filename quick index (seconds)
As soon as files are discovered, their paths and names are written to the
full-text index, so you can search by filename even while content indexing
is still running.
Full-text extraction and BM25 index (minutes to hours)
Document contents (txt, md, html, docx, pdf, images) are parsed; outlines
and metadata are written to Tantivy. Multiple workers can run in parallel.
Vector embedding index (minutes to hours)
Document chunks are embedded locally with Qwen3-Embedding-0.6B and written
to the local vector database, enabling semantic search.
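The reason filename search becomes available so quickly is that the cheap stage (recording paths and names) is decoupled from the expensive stage (parsing content). A minimal sketch of that staging, with names invented for illustration rather than taken from Linkly AI's internals:

```python
# Staged indexing sketch: filenames first (fast), content later (slow).
filename_index = {}   # path -> file name, searchable within seconds
content_index = {}    # path -> extracted text, filled in by a later pass

def quick_index(paths):
    # Stage 1: record just paths and names; no file is opened or parsed.
    for p in paths:
        filename_index[p] = p.rsplit("/", 1)[-1]

def content_pass(paths, extract):
    # Stage 2: the expensive part; runs in the background.
    for p in paths:
        content_index[p] = extract(p)

quick_index(["/docs/roadmap.md", "/docs/spec.pdf"])

# Filename search already works even though content_index is still empty.
hits = [p for p, name in filename_index.items() if "roadmap" in name]
print(hits)  # ['/docs/roadmap.md']
```

This is why, as noted below, search is usable almost immediately while the heavier content passes keep running in the background.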
What affects speed
- File count: roughly linear with total time.
- File format: plain text is fastest; PDFs require page parsing.
- Machine performance: Apple Silicon Macs use Metal GPU acceleration for embeddings, which is significantly faster than CPU inference. On Windows / Linux, embedding currently runs on CPU.
- Indexing mode: pick Performance / Balanced / Auto in Settings → Indexing. Performance mode uses higher concurrency and more CPU; Auto upgrades to Performance when the system is idle.
Rough expectations
These are order-of-magnitude estimates only — actual times vary considerably with hardware and file mix:

| Scenario | Approximate time |
|---|---|
| Thousands of plain-text files (txt / md), M-series Mac | A few minutes |
| Tens of thousands of mixed formats (some PDFs), M-series Mac | Tens of minutes to a few hours |
| Many images requiring OCR | Significantly longer |
| Same workload on Windows / Linux pure CPU | Slower than Mac |
You can watch indexing progress live (indexed / pending) at the top of the
launcher. Indexing runs in the background and does not block search — once
filename indexing is done, search is immediately available.
How do I get Linkly AI’s application runtime logs?
While running, the app automatically writes its full log to a local app.log file (capped at 2 MB per file, with rotation). Attaching this file when you report an issue dramatically speeds up debugging. Sensitive data is already redacted.
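Size-capped rotation of the kind described above is a standard pattern; here is a sketch using Python's stdlib rotating handler with the same 2 MB cap. The logger name and backup count are illustrative, and this shows the general mechanism rather than Linkly AI's actual logging code.

```python
import logging
from logging.handlers import RotatingFileHandler

# Each log file is capped at 2 MB; when full, it is renamed (app.log.1,
# app.log.2, ...) and a fresh app.log is started.
handler = RotatingFileHandler("app.log", maxBytes=2 * 1024 * 1024, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("example")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("indexing started")
```

The cap is what keeps the file small enough to attach to a bug report without trimming it by hand.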
There are two ways to grab the log:
Option 1: Open from inside the app (recommended)
This is the simplest path:

Open the data directory
Find the Data Directory row and click the Open button on the right.
Your file manager will open at the folder that holds the app’s data.
Option 2: Open the data directory manually
If the app has crashed or won’t launch, open the folder directly from disk. On macOS:

- Open Finder.
- From the menu bar, choose Go → Go to Folder… (or press ⌘ + ⇧ + G).
- Paste this path and press Return:
- Find app.log in that folder and send it to us.
If even those paths do not exist, the app must have crashed before writing any
logs. In that case, take a screenshot of the crash dialog or window and send
it along with your OS version.

