What You Can Do

  • Private local Q&A: pair a local LLM with the Linkly AI knowledge base so your data never leaves your machine. Linkly AI's read tool reads confidential contracts, medical records, and other sensitive files directly from your computer, with nothing uploaded to the cloud, which suits scenarios with strict privacy requirements.
  • Offline use: the Linkly AI service runs locally, so even without network access the local model can use the search tool to query your knowledge base and answer, with no dependency on any cloud service.
  • Model comparison: using the same Linkly AI knowledge base as a benchmark, run retrieval and Q&A with models such as Llama, Mistral, and Qwen to compare which model understands your document types best.

Prerequisites

  • LM Studio installed
  • Linkly AI Desktop running with the MCP service active

Configuration Steps

1. Open MCP settings

In LM Studio, find the MCP server configuration (the exact location may vary by version).
2. Add Linkly AI MCP

Add a new MCP server with the following settings:

  • Name: linkly-ai
  • URL: http://127.0.0.1:60606/mcp
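If your LM Studio version configures MCP servers through an mcp.json file rather than a settings form, the equivalent entry would look roughly like the sketch below. The mcpServers layout follows the commonly used MCP client schema; the server name and URL come from the settings above, but verify the exact file location and schema against your LM Studio version's documentation.

```json
{
  "mcpServers": {
    "linkly-ai": {
      "url": "http://127.0.0.1:60606/mcp"
    }
  }
}
```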
3. Verify the connection

In a conversation, try searching your documents to confirm the connection works.

MCP support in LM Studio may vary by version. If your version does not yet support MCP, update to the latest release.
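If the connection check fails, it helps to first confirm that anything is listening on the MCP endpoint at all, independently of LM Studio. A minimal sketch, assuming the default Linkly AI address 127.0.0.1:60606 from the steps above:

```python
import socket

def mcp_port_open(host: str = "127.0.0.1", port: int = 60606,
                  timeout: float = 2.0) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        # create_connection raises OSError if nothing is listening
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if mcp_port_open():
        print("Linkly AI MCP endpoint is reachable")
    else:
        print("Nothing is listening on 127.0.0.1:60606 - "
              "is Linkly AI Desktop running with the MCP service active?")
```

If the port is closed, the problem is on the Linkly AI side (see the FAQ below); if it is open but LM Studio still fails, the issue is in the LM Studio MCP configuration.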

FAQ

Make sure you are using a version of LM Studio that supports MCP; this feature is only available in newer versions.
Check that Linkly AI Desktop is running and that its MCP service is active.