Merge branch 'master' into develop_mcp
@@ -45,7 +45,7 @@ The [configuration file](../../../../examples/sample_apps/traslation_agent_app/i

[Translation Of Short Text](../../../../examples/sample_apps/traslation_agent_app/intelligence/test/translation_data/long_text_result.txt)

### Demonstration Results

We can see that agentUniverse maintains consistency with the results of the original translation_agent project, which has been successfully replicated.

agentUniverse Results:

![](../../../_picture/tranlsation_agent_result.png)

@@ -0,0 +1,119 @@

# Guide to Connecting agentUniverse via Chatbox/CherryStudio

This document shows developers how to quickly connect to an agentUniverse intelligent service that is compliant with the OpenAI protocol, using the ChatBox or CherryStudio tools.

## I. Environment Preparation

### 1. Install Client Tools
Install one of the following tools:
* ChatBox: [Download Link](https://chatboxai.app/zh#download)
* CherryStudio: [Download Link](https://cherry-ai.com/download)

### 2. Prepare agentUniverse Project
Refer to the aU [Quick Start](https://github.com/antgroup/agentUniverse/blob/master/README_zh.md) documentation to create an agentUniverse project and successfully launch the `demo_agent` sample application.

## II. Create an OpenAI Protocol-Compliant Agent

### 1. Define Agent Logic
Create `openai_protocol_agent.py` under the project path `intelligence/agentic/agent/agent_template/`:
```python
from agentuniverse.agent.input_object import InputObject
from agentuniverse.agent.template.openai_protocol_template import OpenAIProtocolTemplate


class DemoOpenAIProtocolAgent(OpenAIProtocolTemplate):
    """Demo agent whose output conforms to the OpenAI chat-completions protocol."""

    def input_keys(self) -> list[str]:
        return ['input']

    def output_keys(self) -> list[str]:
        return ['output']

    def parse_input(self, input_object: InputObject, agent_input: dict) -> dict:
        # Copy the user query from the request payload into the agent input.
        agent_input['input'] = input_object.get_data('input')
        return agent_input

    def parse_result(self, agent_result: dict) -> dict:
        # Expose the agent's answer under the 'output' key expected by the service layer.
        return {**agent_result, 'output': agent_result['output']}
```

**Note**: To ensure the agent's output complies with the OpenAI protocol, the agent class must inherit from the `OpenAIProtocolTemplate` class.

### 2. Configure Agent Instance
Create `openai_protocol_agent.yaml` in the `intelligence/agentic/agent/agent_instance/` directory:
```yaml
info:
  # Basic agent info (sample below)
  name: 'openai_protocol_agent'
  description: 'demo agent'
profile:
  # Agent profile (sample below)
  prompt_version: demo_agent.cn
  # LLM configuration
  llm_model:
    # Replace with your customized LLM if needed,
    # e.g., 'demo_llm' defined in /intelligence/agentic/llm/demo_llm.yaml
    name: 'qwen_25_72b_llm'
action:
  # Tools and knowledge base
  tool:
    # Using a mock_search_tool for demonstration.
    # Replace with a real search tool (e.g., demo_search_tool) and configure API keys in /config/custom_key.toml
    - 'mock_search_tool'
  knowledge:
# Advanced features (refer to documentation)
memory:
  name: 'demo_memory'
metadata:
  type: AGENT
  class: DemoOpenAIProtocolAgent
```

### 3. Create Service Interface
Create `openai_agent_service.yaml` in the `intelligence/service/agent_service/` directory:
```yaml
name: 'openai_service'
description: 'demo service of demo agent'
agent: 'openai_protocol_agent'
metadata:
  type: 'SERVICE'
```

### 4. Start agentUniverse Service
Launch the agentUniverse service using `bootstrap/intelligence/server_application.py`. After successful startup, test the service with the following cURL command:
```shell
curl -X POST http://127.0.0.1:8888/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "openai_service",
        "messages": [{
            "role": "user",
            "content": "巴菲特抛售比亚迪的原因"
        }],
        "stream": true
      }'
```
Response example:

<img src="../../../_picture/openai_curl_result.png" width="600" />
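
Because the service speaks the OpenAI chat-completions protocol, it can also be called from any OpenAI-compatible client library. Below is a minimal sketch using the official `openai` Python SDK (assumed to be installed via `pip install openai`; it is not part of the original guide). The base URL, model name, and placeholder key mirror the service configured above.

```python
# Minimal sketch: calling the agentUniverse OpenAI-compatible service with the openai SDK.
# Assumes `pip install openai`; the agentUniverse service does not validate the API key.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:8888",   # the SDK appends /chat/completions to this base URL
    api_key="any-placeholder-value",
)

stream = client.chat.completions.create(
    model="openai_service",             # the agent service name defined above
    messages=[{"role": "user", "content": "巴菲特抛售比亚迪的原因"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```

If streaming is not needed, pass `stream=False` and read `response.choices[0].message.content` instead; whether that path behaves identically depends on your agentUniverse service configuration.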

## III. Configure Client Tools

1. Launch Chatbox and open a chat window. The main interface:
   <img src="../../../_picture/chatbox_main_page.png" width="600" />
2. Click the settings button to configure the connection:
   <img src="../../../_picture/chatbox_setting_page.png" width="600" />
   * Key configurations:
     * **API Domain**: `http://127.0.0.1:8888`
     * **API Path**: `/chat/completions`
     * **Model**: `openai_service` (your agent service name)
     * **API Key**: any placeholder value
     * **Name**: customizable
3. Save the settings.

## IV. Testing

- Enter any message in Chatbox and click send to view the agent's response:
  <img src="../../../_picture/chatbox_test_result.png" width="600" />

@@ -62,7 +62,7 @@ The operational process is illustrated in the figure below:

## Custom Knowledge Insertion Feature
### Create Custom Knowledge
On the product homepage, navigate to the Knowledge tab and click the 'Add Knowledge' button on the right to create a custom knowledge base.

agentUniverse will automatically generate the corresponding knowledge and storage YAML files locally, helping users complete their development tasks.

@@ -8,7 +8,7 @@ In this section, we will show you how to:

## Environment and Application Engineering Preparation
### Application Engineering Preparation
We have placed the product module samples in the sample_apps/difizen_app project of agentUniverse. You can access and view them [here](../../../../../examples/sample_apps/difizen_app). These modules can be configured in the background via YAML files, and they can also be created and managed automatically through the product page.

### Installing Dependencies

@@ -34,7 +34,7 @@ Of course, when utilizing the agent, you need to preconfigure the various LLM mo

## Using the agentUniverse Product Platform
### Starting the Product Service
To start the product service with a single click, run the [product_application](../../../../../examples/sample_apps/difizen_app/bootstrap/platform/product_application.py) file located in `difizen_app/bootstrap/platform`.

![product_application](../../../_picture/product_application.png)

Upon successful startup, it automatically redirects you to the product homepage, which features the system presets as well as your customized Agent, Tool, and Knowledge product modules.

@@ -183,7 +183,7 @@ metadata:

```

Parameter Description:
- `method`: the HTTP method of the request, such as GET, POST, PUT, etc.
- `headers`: the HTTP headers necessary for sending the request.
- `json_parse`: indicates whether the input parameters should be serialized as JSON and sent in the request body (`True`, typical for POST requests) or not (`False`, typical for GET requests, where parameters are sent as a query string).
- `response_content_type`: the output format of the HTTP request result. If set to `json`, the result is returned in JSON format; if set to `text`, it is returned as plain text.

This tool can be used directly without requiring any keys.
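
For orientation, here is a hypothetical sketch of how these fields might appear in the tool's YAML configuration. The four parameters follow the description above, while the surrounding keys (`name`, `description`, `metadata`) are assumptions modeled on typical aU tool configurations rather than taken from this document, so treat the full example earlier in the file as authoritative.

```yaml
# Hypothetical sketch only: parameter names follow the description above;
# the surrounding keys are assumptions, not the documented example.
name: 'demo_http_request_tool'
description: 'calls an HTTP endpoint and returns the response'
method: 'POST'                      # GET, POST, PUT, ...
headers:
  Content-Type: 'application/json'  # headers required by the target endpoint
json_parse: true                    # serialize inputs into a JSON request body
response_content_type: 'json'       # 'json' or 'text'
metadata:
  type: 'TOOL'
```
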
@@ -1,6 +1,6 @@

# Logging Utils

The log component of agentUniverse is implemented on top of loguru, offering a ready-to-use global log component as well as customizable log components. This tutorial first introduces the log configuration file used by agentUniverse, then explains how to use the global log component and customizable log components, and finally describes how to integrate with external log service components.
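
As a quick preview of the global log component covered later in this tutorial, a minimal usage sketch is shown below. The import path is an assumption based on the aU codebase and is not taken from this excerpt, so verify it against your agentUniverse version.

```python
# Minimal sketch of the global log component; the import path is assumed,
# adjust it if your agentUniverse version exposes the global logger elsewhere.
from agentuniverse.base.util.logging.logging_util import LOGGER

LOGGER.info("agent run started")
LOGGER.error("agent run failed")
```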

## Log Configuration

@@ -10,7 +10,7 @@ max_workers = 10

server_port = 50051
```

- **activate**: The gRPC server will only start when this value is set to `true`.
- **max_workers**: The maximum number of threads in the gRPC server thread pool, with a default of 10.
- **server_port**: The service port of the gRPC server, with a default of 50051.

And then, proceed to start the gRPC server:

@@ -11,7 +11,7 @@ agentUniverse automatically registers SQLDBWrapper configuration files by scanni

default = ['default_scan_path']
sqldb_wrapper = ['sqldb_wrapper_scan_path']
```

By default, agentUniverse scans all paths under either the `default` or the `sqldb_wrapper` section of the configuration, with paths under `sqldb_wrapper` having a higher priority than those under `default`.

### Step Two: Configuration File

@@ -13,7 +13,7 @@ We will provide detailed descriptions of each component within the configuration

### Setting the basic information of the agent.
**`info` - basic information of the agent**
* `name`: the name of the agent
* `description`: a description of the agent's purpose or function
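
For instance, a minimal `info` block could look like the following sketch (the values are illustrative, not taken from a shipped sample):

```yaml
info:
  name: 'demo_agent'                 # the agent component instance name
  description: 'a demo agent that answers user questions'
```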

### Setting the global configurations for the agent.
**`profile` - Agent global settings.**

@@ -74,7 +74,7 @@ def as_langchain(self) -> BaseLanguageModel:

In agentUniverse, the Model Channel (LLMChannel) class inherits from ComponentBase and includes the following configurable parameters:

1. `channel_name`: (Required) Corresponds to the aU channel component instance name
2. `channel_api_key`: (Required) Corresponds to the specific channel's key, e.g., DASHSCOPE_API_KEY for the Aliyun DashScope platform; QIANFAN_API_KEY for the Baidu Qianfan platform
3. `channel_api_base`: (Required) Corresponds to the specific channel's endpoint
4. `channel_organization`: (Optional) Corresponds to the specific channel's organization
5. `channel_proxy`: (Optional) Corresponds to the specific channel's proxy

@@ -111,7 +111,7 @@ meta_class: 'agentuniverse.llm.default.deep_seek_openai_style_llm.DefaultDeepSee

```yaml
channel_name: 'deepseek-r1-official'
channel_api_key: '${DEEPSEEK_API_KEY}'            # deepseek-r1 official website key
channel_api_base: 'https://api.deepseek.com/v1'   # deepseek-r1 official website url
channel_model_name: 'deepseek-reasoner'           # deepseek-r1 official website model name
model_support_stream: True                        # whether the deepseek-r1 official channel supports streaming
```

@@ -142,7 +142,7 @@ meta_class: 'agentuniverse.llm.default.deep_seek_openai_style_llm.DefaultDeepSee

```yaml
channel_name: 'deepseek-r1-dashscope'
channel_api_key: '${DASHSCOPE_API_KEY}'                                 # Alibaba Cloud Bailian (DashScope) platform key
channel_api_base: 'https://dashscope.aliyuncs.com/compatible-mode/v1'   # Alibaba Cloud Bailian (DashScope) platform url
channel_model_name: deepseek-r1                                         # Alibaba Cloud Bailian (DashScope) platform model name
model_support_stream: True                                              # whether the Bailian (DashScope) platform model supports streaming
```

@@ -7,7 +7,7 @@

## Environment and Application Engineering Preparation
### Application Engineering Preparation
We have placed the **productized module samples** in the sample_apps/difizen_app project of agentUniverse; you can view them [here](../../../../../examples/sample_apps/difizen_app). These modules can be configured in the background via YAML, and they can also be created and managed automatically through the product page.

### Installing Dependencies
**Install via pip**

@@ -32,7 +32,7 @@ product = ['sample_standard_app.platform.difizen.product']

## Using the agentUniverse Product Platform
### Starting the Product Service
Run the [product_application](../../../../../examples/sample_apps/difizen_app/bootstrap/platform/product_application.py) file under difizen_app/bootstrap/platform to start the service with one click.

![Product service startup](../../../_picture/difizen_product_start.png)

@@ -74,7 +74,7 @@ def as_langchain(self) -> BaseLanguageModel:

In agentUniverse, the Model Channel (LLMChannel) class inherits from ComponentBase and includes the following configurable parameters:

1. `channel_name`: (Required) The aU channel component instance name
2. `channel_api_key`: (Required) The key of the specific channel, e.g., DASHSCOPE_API_KEY for the Bailian platform; QIANFAN_API_KEY for the Qianfan platform
3. `channel_api_base`: (Required) The endpoint of the specific channel
4. `channel_organization`: (Optional) The organization of the specific channel
5. `channel_proxy`: (Optional) The proxy of the specific channel

@@ -111,7 +111,7 @@ meta_class: 'agentuniverse.llm.default.deep_seek_openai_style_llm.DefaultDeepSee

```yaml
channel_name: 'deepseek-r1-official'
channel_api_key: '${DEEPSEEK_API_KEY}'            # deepseek-r1 official website key
channel_api_base: 'https://api.deepseek.com/v1'   # deepseek-r1 official website url
channel_model_name: 'deepseek-reasoner'           # deepseek-r1 official website model name
model_support_stream: True                        # whether the deepseek-r1 official channel supports streaming
```

@@ -142,7 +142,7 @@ meta_class: 'agentuniverse.llm.default.deep_seek_openai_style_llm.DefaultDeepSee

```yaml
channel_name: 'deepseek-r1-dashscope'
channel_api_key: '${DASHSCOPE_API_KEY}'                                 # Alibaba Cloud Bailian platform key
channel_api_base: 'https://dashscope.aliyuncs.com/compatible-mode/v1'   # Alibaba Cloud Bailian platform url
channel_model_name: deepseek-r1                                         # Alibaba Cloud Bailian platform model name
model_support_stream: True                                              # whether the Bailian platform model supports streaming
```

@@ -54,7 +54,7 @@ class Store(ComponentBase):

```

- `_new_client` and `_new_async_client` are used to create database connections; during component registration they are added to the [post_fork](../../技术组件/服务化/Web_Server.md) execution list, which ensures that the database connections created in Gunicorn child processes are independent of one another.
- The `query` function is called by the knowledge component at query time; based on the incoming Query instance, it looks up relevant content in the store and returns it in the form of documents.
- `Store` also provides create, read, update, and delete operations on `Document` data, serving as the management interface of the knowledge store.

After writing the corresponding code, you can refer to the YAML below to register your Store as an aU component:
```yaml