
# Overview

## Introduction

The AI Knowledge Base plugin provides RAG retrieval capabilities for AI agents.

RAG retrieval allows AI agents to give more accurate, professional, and enterprise-relevant answers to user questions.

Because answers are grounded in the domain-specific and internal enterprise documents that administrators maintain in the knowledge base, they are both more reliable and traceable to their sources.

## What is RAG

RAG (Retrieval-Augmented Generation) answers questions in three stages, illustrated by the sketch after this list:

  • Retrieval: The user's question is converted into a vector by an embedding model (e.g., BERT), and the Top-K most relevant text chunks are recalled from the vector store via dense retrieval (semantic similarity) or sparse retrieval (keyword matching).
  • Augmentation: The retrieved chunks are concatenated with the original question to form an augmented prompt, which is injected into the LLM's context window.
  • Generation: The LLM generates the final answer from the augmented prompt, improving factuality and traceability.
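
A minimal, self-contained Python sketch of these three stages is shown below. The hash-based `embed()` function, the in-memory index, and the `call_llm()` stub are illustrative placeholders only; they are not part of the AI Knowledge Base plugin or any particular library. In a real deployment the embeddings come from an embedding model and the chunks live in a vector database.

```python
# Sketch of the three RAG stages: retrieve, augment, generate.
# embed(), the in-memory index, and call_llm() are toy placeholders.
import math
from collections import Counter


def embed(text: str, dim: int = 64) -> list[float]:
    """Toy embedding: hashed bag-of-words over whitespace tokens.
    Real systems use an embedding model (e.g., BERT) or an embedding API."""
    vec = [0.0] * dim
    for token, count in Counter(text.lower().split()).items():
        vec[hash(token) % dim] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already normalized, so the dot product is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))


# Hypothetical knowledge-base chunks, indexed by their embeddings.
knowledge_base = [
    "Refund requests must be filed within 30 days of purchase.",
    "The on-call rotation is documented in the internal wiki.",
    "Enterprise customers get a dedicated support channel.",
]
index = [(chunk, embed(chunk)) for chunk in knowledge_base]


def retrieve(question: str, k: int = 2) -> list[str]:
    """Stage 1 - Retrieval: recall the Top-K chunks most similar to the question."""
    q_vec = embed(question)
    ranked = sorted(index, key=lambda item: cosine(q_vec, item[1]), reverse=True)
    return [chunk for chunk, _ in ranked[:k]]


def build_prompt(question: str, chunks: list[str]) -> str:
    """Stage 2 - Augmentation: concatenate retrieved chunks with the question."""
    context = "\n".join(f"- {c}" for c in chunks)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )


def call_llm(prompt: str) -> str:
    """Stage 3 - Generation: stand-in for the configured LLM service."""
    return f"[LLM answer grounded in the prompt]\n{prompt}"


question = "How many days do I have to file a refund request?"
prompt = build_prompt(question, retrieve(question))
print(call_llm(prompt))
```

The printed output is the augmented prompt handed to the (stubbed) LLM, grounded in the two knowledge-base chunks most similar to the question; this mirrors the prompt a RAG pipeline assembles before calling the LLM.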

## Installation

  1. Go to the Plugin Manager page
  2. Find the AI: Knowledge base plugin and enable it
