docs(user): mention changes to LLM

This commit is contained in:
Elian Doran
2026-02-22 16:03:27 +02:00
parent 22341bf0b1
commit 72c34eb491
46 changed files with 921 additions and 1614 deletions

(Binary image files — the screenshots referenced by this page — were deleted in this commit; previews not shown.)

@@ -1,89 +1,18 @@
 # AI
-<figure class="image image_resized" style="width:63.68%;"><img style="aspect-ratio:1363/1364;" src="AI_image.png" width="1363" height="1364"><figcaption>An example chat with an LLM</figcaption></figure>
-The AI / LLM features within Trilium Notes are designed to allow you to interact with your Notes in a variety of ways, using as many of the major providers as we can support.
-In addition to being able to send chats to LLM providers such as OpenAI, Anthropic, and Ollama, we also support agentic tool calling and embeddings.
-The quickest way to get started is to navigate to the “AI/LLM” settings:
-<figure class="image image_resized" style="width:74.04%;"><img style="aspect-ratio:1916/1906;" src="5_AI_image.png" width="1916" height="1906"></figure>
-Enable the feature:
-<figure class="image image_resized" style="width:82.82%;"><img style="aspect-ratio:1911/997;" src="1_AI_image.png" width="1911" height="997"></figure>
-## Embeddings
-**Embeddings** are important because they give us a compact AI “summary” (not human-readable text) of each of your Notes, on which we can then perform mathematical operations (such as cosine similarity) to smartly figure out which Notes to send as context to the LLM when you're chatting, among other useful functions.
-You will then need to set up the AI “provider” that you wish to use to create the embeddings for your Notes. Currently OpenAI, Voyage AI, and Ollama are supported providers for embedding generation.
-In the following example, we're going to use our self-hosted Ollama instance to create the embeddings for our Notes. You can see additional documentation about installing your own Ollama locally in <a class="reference-link" href="AI/Providers/Ollama/Installing%20Ollama.md">Installing Ollama</a>.
-To see what embedding models Ollama has available, you can check out [this search](https://ollama.com/search?c=embedding) on their website, and then `pull` whichever one you want to try out. A popular choice is `mxbai-embed-large`.
-First, we'll need to select the Ollama provider from the tabs of providers, then enter the Base URL for our Ollama. Since our Ollama is running on our local machine, our Base URL is `http://localhost:11434`. We will then hit the “refresh” button to have it fetch our models:
-<figure class="image image_resized" style="width:82.28%;"><img style="aspect-ratio:1912/1075;" src="4_AI_image.png" width="1912" height="1075"></figure>
-When selecting the dropdown for the “Embedding Model”, embedding models should be at the top of the list, separated from regular chat models by a horizontal line, as seen below:
-<figure class="image image_resized" style="width:61.73%;"><img style="aspect-ratio:1232/959;" src="8_AI_image.png" width="1232" height="959"></figure>
-After selecting an embedding model, embeddings should automatically begin generating; you can verify this by checking the embedding statistics at the top of the “AI/LLM” settings panel:
-<figure class="image image_resized" style="width:67.06%;"><img style="aspect-ratio:1333/499;" src="7_AI_image.png" width="1333" height="499"></figure>
-If you don't see any embeddings being created, scroll to the bottom of the settings and hit “Recreate All Embeddings”:
-<figure class="image image_resized" style="width:65.69%;"><img style="aspect-ratio:1337/1490;" src="3_AI_image.png" width="1337" height="1490"></figure>
-Creating the embeddings will take some time, and they will be regenerated whenever a Note is created, updated, or deleted.
-If for some reason you choose to change your embedding provider, or the model used, you'll need to recreate all embeddings.
-## Tools
-Tools are essentially functions that we provide to the various LLM providers; an LLM can respond in a specific format that tells us which tool function, and with which parameters, it would like to invoke. We then execute these tools and provide the results as additional context in the Chat conversation.
-These are the tools that currently exist (they will certainly be made more effective, and more will be added!):
-* `search_notes`
-  * Semantic search
-* `keyword_search`
-  * Keyword-based search
-* `attribute_search`
-  * Attribute-specific search
-* `search_suggestion`
-  * Search syntax helper
-* `read_note`
-  * Read note content (helps the LLM read Notes)
-* `create_note`
-  * Create a Note
-* `update_note`
-  * Update a Note
-* `manage_attributes`
-  * Manage attributes on a Note
-* `manage_relationships`
-  * Manage the various relationships between Notes
-* `extract_content`
-  * Used to smartly extract content from a Note
-* `calendar_integration`
-  * Used to find date notes, create date notes, get the daily note, etc.
-When Tools are executed within your Chat, you'll see output like the following:
-<figure class="image image_resized" style="width:66.88%;"><img style="aspect-ratio:1372/1591;" src="6_AI_image.png" width="1372" height="1591"></figure>
-You don't need to tell the LLM to execute a certain tool; it should “smartly” call tools and automatically execute them as needed.
-## Overview
-To start, simply press the _Chat with Notes_ button in the <a class="reference-link" href="Basic%20Concepts%20and%20Features/UI%20Elements/Launch%20Bar.md">Launch Bar</a>.
-<figure class="image image_resized" style="width:60.77%;"><img style="aspect-ratio:1378/539;" src="2_AI_image.png" width="1378" height="539"></figure>
-If you don't see the button in the <a class="reference-link" href="Basic%20Concepts%20and%20Features/UI%20Elements/Launch%20Bar.md">Launch Bar</a>, you might need to move it from the _Available Launchers_ section to the _Visible Launchers_ section:
-<figure class="image image_resized" style="width:69.81%;"><img style="aspect-ratio:1765/1287;" src="9_AI_image.png" width="1765" height="1287"></figure>
+Starting with version v0.102.0, AI/LLM integration has been removed from the Trilium Notes core.
+While a significant amount of effort went into developing this feature, maintaining and supporting it long-term proved to be unsustainable.
+When upgrading to v0.102.0, your Chat notes will be preserved, but instead of the dedicated chat window they will be turned into a normal <a class="reference-link" href="Note%20Types/Code.md">Code</a> note, revealing the underlying JSON of the conversation.
+## Alternative solutions (MCP)
+Given the recent advancements of the AI scene, MCP has grown more powerful and facilitates easier integrations with various applications.
+As such, there are third-party solutions that integrate an MCP server that can be used with Trilium:
+* [tan-yong-sheng/triliumnext-mcp](https://github.com/tan-yong-sheng/triliumnext-mcp)
+* [perfectra1n/triliumnext-mcp](https://github.com/perfectra1n/triliumnext-mcp)
+> [!IMPORTANT]
+> These solutions are third-party and thus not endorsed or supported directly by the Trilium Notes team. Please address questions and issues on their corresponding repositories instead.
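The removed documentation mentions running cosine similarity over note embeddings to pick context for the LLM. As a minimal illustration of that idea (not Trilium's actual implementation), cosine similarity between two embedding vectors can be computed like this:

```typescript
// Cosine similarity: how closely two embedding vectors point in the
// same direction, independent of their magnitudes. A value near 1
// means the two embedded texts are semantically similar.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Ranking all note embeddings by this score against the embedding of the user's message is the "smart" context selection the old feature described.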
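The tool-calling mechanism the removed docs describe (the LLM names a tool plus JSON arguments, the client executes it and feeds the result back) can be sketched as follows. This is a hypothetical simplification; the tool names and argument shapes here are stand-ins, not Trilium's real API:

```typescript
// A tool call as an LLM provider might return it: a tool name plus
// structured arguments.
type ToolCall = { name: string; arguments: Record<string, unknown> };

// Registry of available tools. `read_note` is a toy stand-in for the
// real tool of the same name listed above.
const tools: Record<string, (args: Record<string, unknown>) => string> = {
  read_note: (args) => `contents of note ${args.noteId}`,
};

// Execute the requested tool; its return value would be appended to
// the conversation as extra context for the LLM's next turn.
function dispatch(call: ToolCall): string {
  const tool = tools[call.name];
  if (!tool) throw new Error(`unknown tool: ${call.name}`);
  return tool(call.arguments);
}
```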
