zhenxun_bot/zhenxun/services/llm/__init__.py
Rumio 7c153721f0
♻️ refactor!: restructure the LLM service architecture and unify Pydantic compatibility handling (#2002)
* ♻️ refactor(pydantic): extract the Pydantic compatibility functions into a standalone module

* ♻️ refactor!(llm): rework the LLM service, introducing a modernized tool and executor architecture

🏗️ **Architecture changes**
- Introduce the ToolProvider/ToolExecutable protocols, replacing ToolRegistry
- Add LLMToolExecutor, separating tool-calling logic into its own component
- Add a BaseMemory abstraction, decoupling session-state management

🔄 **API refactoring**
- Removed: analyze, analyze_multimodal, pipeline_chat
- Added: generate_structured, run_with_tools
- Reworked: chat, search, and code are now stateless calls

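A structured-generation call like the new `generate_structured` presumably parses the model's reply into a typed object and rejects replies that do not match the schema. The diff does not show the real signature or validation backend, so the following is a self-contained stdlib-only sketch of that idea; `parse_structured` and `WeatherReport` are hypothetical names.

```python
import json
from dataclasses import dataclass, fields


@dataclass
class WeatherReport:
    city: str
    temperature_c: float


def parse_structured(raw: str, schema: type) -> object:
    """Parse a model's JSON reply into `schema`, rejecting missing keys."""
    data = json.loads(raw)
    names = {f.name for f in fields(schema)}
    missing = names - data.keys()
    if missing:
        raise ValueError(f"model output missing fields: {missing}")
    # Silently drop any extra keys the model invented.
    return schema(**{k: v for k, v in data.items() if k in names})


# A model asked for structured output might reply with:
reply = '{"city": "Tokyo", "temperature_c": 18.5, "note": "extra"}'
report = parse_structured(reply, WeatherReport)
```

The value of this pattern is that callers downstream work with `report.city` and `report.temperature_c` as typed attributes instead of re-validating raw JSON at every use site.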
🛠️ **Tool system**
- Add the @function_tool decorator
- Unify tool definitions under the ToolExecutable protocol
- Remove the MCP tool system and mcp_tools.json

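A decorator like `@function_tool` plausibly derives tool metadata from the wrapped function's signature and docstring so that plain Python functions can be exposed to the model. Below is a hedged, self-contained sketch of that mechanism, not the project's implementation; the `__tool_metadata__` attribute and `_TYPE_NAMES` mapping are illustrative inventions.

```python
import inspect
from typing import Any, Callable

# Maps Python annotations to JSON-schema-style type names (illustrative).
_TYPE_NAMES = {int: "integer", float: "number", str: "string", bool: "boolean"}


def function_tool(func: Callable[..., Any]) -> Callable[..., Any]:
    """Attach tool metadata derived from the signature to the function."""
    sig = inspect.signature(func)
    params = {
        name: {"type": _TYPE_NAMES.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    func.__tool_metadata__ = {
        "name": func.__name__,
        "description": inspect.getdoc(func) or "",
        "parameters": params,
    }
    return func


@function_tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b
```

The decorated function stays directly callable; a tool executor would read `__tool_metadata__` to advertise the tool to the model and then invoke the function with the model-supplied arguments.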
---------

Co-authored-by: webjoin111 <455457521@qq.com>
2025-08-04 23:36:12 +08:00


"""
LLM 服务模块 - 公共 API 入口
提供统一的 AI 服务调用接口、核心类型定义和模型管理功能。
"""
from .api import (
    ModelName,
    chat,
    code,
    embed,
    generate,
    generate_structured,
    run_with_tools,
    search,
)
from .config import (
    CommonOverrides,
    LLMGenerationConfig,
    register_llm_configs,
)

# Register LLM configuration entries before the remaining submodules are imported.
register_llm_configs()

from .manager import (
clear_model_cache,
get_cache_stats,
get_global_default_model_name,
get_model_instance,
list_available_models,
list_embedding_models,
list_model_identifiers,
set_global_default_model_name,
)
from .session import AI, AIConfig
from .tools import function_tool, tool_provider_manager
from .types import (
EmbeddingTaskType,
LLMContentPart,
LLMErrorCode,
LLMException,
LLMMessage,
LLMResponse,
ModelDetail,
ModelInfo,
ModelProvider,
ResponseFormat,
TaskType,
ToolCategory,
ToolMetadata,
UsageInfo,
)
from .utils import create_multimodal_message, message_to_unimessage, unimsg_to_llm_parts
__all__ = [
"AI",
"AIConfig",
"CommonOverrides",
"EmbeddingTaskType",
"LLMContentPart",
"LLMErrorCode",
"LLMException",
"LLMGenerationConfig",
"LLMMessage",
"LLMResponse",
"ModelDetail",
"ModelInfo",
"ModelName",
"ModelProvider",
"ResponseFormat",
"TaskType",
"ToolCategory",
"ToolMetadata",
"UsageInfo",
"chat",
"clear_model_cache",
"code",
"create_multimodal_message",
"embed",
"function_tool",
"generate",
"generate_structured",
"get_cache_stats",
"get_global_default_model_name",
"get_model_instance",
"list_available_models",
"list_embedding_models",
"list_model_identifiers",
"message_to_unimessage",
"register_llm_configs",
"run_with_tools",
"search",
"set_global_default_model_name",
"tool_provider_manager",
"unimsg_to_llm_parts",
]