feat: The international version adds MaxKB
parent 1b802ed1ed
commit c8874a016f
In the app parameter definition, the HTTP port field keeps its legacy `labelEn`/`labelZh` keys and gains a multilingual `label:` map (English, Japanese, Malay, Brazilian Portuguese, Russian, Korean, Traditional Chinese, Simplified Chinese):

@@ -5,6 +5,15 @@ additionalProperties:
 envKey: PANEL_APP_PORT_HTTP
 labelEn: Port
 labelZh: 端口
+label:
+  en: Port
+  ja: ポート
+  ms: Port
+  pt-br: Porta
+  ru: Порт
+  ko: 포트
+  zh-Hant: 埠
+  zh: 端口
 required: true
 rule: paramPort
 type: number
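The `label:` map above pairs locale codes with display strings while keeping the older `labelEn`/`labelZh` keys for compatibility. A minimal sketch of how a frontend might resolve such a field label — the resolver and its fallback order are illustrative assumptions, not 1Panel's actual code:

```python
# Sketch: resolve a localized form-field label with fallback.
# The field dict mirrors the YAML above; resolve_label is a
# hypothetical helper, not part of 1Panel.

field = {
    "envKey": "PANEL_APP_PORT_HTTP",
    "labelEn": "Port",
    "labelZh": "端口",
    "label": {
        "en": "Port",
        "ja": "ポート",
        "ms": "Port",
        "pt-br": "Porta",
        "ru": "Порт",
        "ko": "포트",
        "zh-Hant": "埠",
        "zh": "端口",
    },
}

def resolve_label(field: dict, locale: str) -> str:
    """Prefer the per-locale map; fall back to the legacy labelZh/labelEn keys."""
    labels = field.get("label", {})
    if locale in labels:
        return labels[locale]
    if locale.startswith("zh"):
        return field.get("labelZh", field.get("labelEn", field["envKey"]))
    return field.get("labelEn", field["envKey"])

print(resolve_label(field, "ru"))     # Порт
print(resolve_label(field, "fr"))     # no "fr" entry: falls back to labelEn
```

With this fallback chain, adding a locale is a pure data change in the YAML — no code changes are needed for new languages.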
In the Chinese README, the one-line description is expanded (MaxKB = Max Knowledge Base, an open-source knowledge-base Q&A system built on large language models and RAG, widely used for intelligent customer service, internal enterprise knowledge bases, academic research, and education), and the advantages list grows from three points (multi-model support, ready to use, seamless embedding) to four (ready to use, model-neutral, flexible orchestration, seamless embedding):

@@ -7,10 +7,11 @@ password:MaxKB@123..
 # MaxKB
 
-**MaxKB** 是一款基于 LLM 大语言模型的知识库问答系统
+**MaxKB** = Max Knowledge Base,是一款基于大语言模型和 RAG 的开源知识库问答系统,广泛应用于智能客服、企业内部知识库、学术研究与教育等场景。
 
 ## 优势:
 
-- **多模型支持**:支持对接主流的大模型,包括本地私有大模型(如 Llama 2)、Azure OpenAI 和百度千帆大模型等;
-- **开箱即用**:支持直接上传文档、自动爬取在线文档,支持文本自动拆分、向量化,智能问答交互体验好;
-- **无缝嵌入**:支持零编码快速嵌入到第三方业务系统。
+- **开箱即用**:支持直接上传文档 / 自动爬取在线文档,支持文本自动拆分、向量化和 RAG(检索增强生成),有效减少大模型幻觉,智能问答交互体验好;
+- **模型中立**:支持对接各种大模型,包括本地私有大模型(DeepSeek R1 / Llama 3 / Qwen 2 等)、国内公共大模型(通义千问 / 腾讯混元 / 字节豆包 / 百度千帆 / 智谱 AI / Kimi 等)和国外公共大模型(OpenAI / Claude / Gemini 等);
+- **灵活编排**:内置强大的工作流引擎和函数库,支持编排 AI 工作过程,满足复杂业务场景下的需求;
+- **无缝嵌入**:支持零编码快速嵌入到第三方业务系统,让已有系统快速拥有智能问答能力,提高用户满意度。
A new English README is added with the default admin credentials and an English product description:

@@ -0,0 +1,17 @@
+# Default admin credentials
+
+```
+username:admin
+password:MaxKB@123..
+```
+
+# MaxKB
+
+**MaxKB** = Max Knowledge Base, a chatbot based on Large Language Models (LLM) and Retrieval-Augmented Generation (RAG). MaxKB is widely applied in scenarios such as intelligent customer service, corporate internal knowledge bases, academic research, and education.
+
+## Advantages
+
+- **Ready-to-Use**: Supports direct uploading of documents / automatic crawling of online documents, with automatic text splitting, vectorization, and RAG (Retrieval-Augmented Generation). This effectively reduces hallucinations in large models, providing a superior smart Q&A interaction experience.
+- **Flexible Orchestration**: Equipped with a powerful workflow engine and function library, enabling the orchestration of AI processes to meet the needs of complex business scenarios.
+- **Seamless Integration**: Facilitates zero-coding rapid integration into third-party business systems, quickly equipping existing systems with intelligent Q&A capabilities to enhance user satisfaction.
+- **Model-Agnostic**: Supports various large models, including private models (such as DeepSeek, Llama, Qwen, etc.) and public models (like OpenAI, Claude, Gemini, etc.).
In the app-store metadata, the title, description, and short descriptions are updated to the new RAG wording, and a per-locale `description` map is added alongside the legacy `shortDescZh`/`shortDescEn` keys:

@@ -1,15 +1,24 @@
 name: MaxKB
 tags:
   - AI / 大模型
-title: 基于 LLM 大语言模型的知识库问答系统
-description: 基于 LLM 大语言模型的知识库问答系统
+title: 基于大语言模型和 RAG 的开源知识库问答系统
+description: 基于大语言模型和 RAG 的开源知识库问答系统
 additionalProperties:
   key: maxkb
   name: MaxKB
   tags:
     - AI
-  shortDescZh: 基于 LLM 大语言模型的知识库问答系统
-  shortDescEn: Knowledge Base Question Answering System based on LLM large language model
+  shortDescZh: 基于大语言模型和 RAG 的开源知识库问答系统
+  shortDescEn: Ready-to-use, flexible RAG Chatbot
+  description:
+    en: Ready-to-use, flexible RAG Chatbot
+    ja: 使用準備ができて柔軟な RAG チャットボット
+    ms: Chatbot RAG yang fleksibel dan mudah digunakan
+    pt-br: Chatbot RAG flexível e pronto para uso
+    ru: Готовый к использованию, гибкий чат-бот RAG
+    ko: 사용 준비 완료, 유연한 RAG 챗봇
+    zh-Hant: 基於大語言模型和 RAG 的開源知識庫問答系統
+    zh: 基于大语言模型和 RAG 的开源知识库问答系统
   type: tool
   crossVersionUpdate: true
   limit: 0