
# Quick Start

## Introduction

Before using the AI Employee, you need to connect it to an online LLM service. NocoBase currently supports mainstream online LLM services such as OpenAI, Gemini, Claude, DeepSeek, and Qwen. In addition to online LLM services, NocoBase also supports connecting to local models via Ollama.

## Configure LLM Service

Go to the AI Employee plugin configuration page and click the LLM service tab to open the LLM service management page.


Hover over the Add New button in the upper right corner of the LLM service list and select the LLM service you want to use.


Taking OpenAI as an example, enter an easy-to-remember title in the pop-up window, then enter the API key obtained from OpenAI, and click Submit to save. This completes the LLM service configuration.

The Base URL can usually be left blank. If you are using a third-party LLM service that is compatible with the OpenAI API, please fill in the corresponding Base URL.

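If you want to verify a third-party Base URL before entering it, you can query its OpenAI-compatible `/models` endpoint directly. The sketch below is an illustration only; the Base URL and API key are placeholders you supply, and the service must actually implement the OpenAI API:

```python
import json
import urllib.request


def models_url(base_url: str) -> str:
    """Join a Base URL such as https://api.openai.com/v1 with the /models path."""
    return base_url.rstrip("/") + "/models"


def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs exposed by an OpenAI-compatible service."""
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return [m["id"] for m in json.load(resp)["data"]]
```

If the call returns a list of model IDs, the Base URL is OpenAI-compatible and can be used in the configuration form.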

## Availability Test

On the LLM service configuration page, click the Test flight button, enter the name of the model you want to use, and click the Run button to test whether the LLM service and model are available.

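The same availability check can be reproduced outside NocoBase with a minimal chat completion call against the OpenAI-compatible API. This is a sketch under the assumption of an OpenAI-compatible endpoint; `base_url`, `api_key`, and `model` are placeholders you supply:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str,
                       prompt: str = "ping") -> urllib.request.Request:
    """Build a minimal OpenAI-compatible chat completion request."""
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 5,
        }).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


def check_model(base_url: str, api_key: str, model: str) -> bool:
    """Return True if the service answers the chat request without error."""
    try:
        req = build_chat_request(base_url, api_key, model)
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.status == 200
    except Exception:
        return False
```

A successful response confirms both that the API key is valid and that the model name is accepted, which is what the Test flight button checks from inside NocoBase.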