Project description
ReasoningChain
Add a local cache to an API function
import os
from reasoningchain.cache.disk_cache import disk_cache

@disk_cache(cache_path=os.path.join(os.environ["HOME"], "some_path/some_name"), expire_time=864000)
def foobar(key: str):
    """do something"""
    pass
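A minimal usage sketch of the decorator above. The function, cache path, and expiry are made up for illustration; the decorator is assumed to cache return values on disk, keyed by the call arguments, until expire_time seconds have passed:
import os
from reasoningchain.cache.disk_cache import disk_cache

# Hypothetical expensive function; the cache path is just an example
@disk_cache(cache_path=os.path.join(os.environ["HOME"], ".cache/demo_square"), expire_time=3600)
def expensive_square(key: str) -> int:
    print("computing...")  # assumed to appear only on a cache miss
    return int(key) ** 2

print(expensive_square("12"))  # first call: computes and writes to the disk cache
print(expensive_square("12"))  # second call: expected to be served from the cache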
Get text embeddings
from reasoningchain.api.closeai import batch_get_embeddings
embeddings = batch_get_embeddings(["hello", "world"], batch_size=16)
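The returned vectors can be compared directly, e.g. by cosine similarity. The sketch below assumes batch_get_embeddings returns one list of floats per input string, in input order, and that an OPENAI_API_KEY is configured as described in the configuration section:
import math
from reasoningchain.api.closeai import batch_get_embeddings

texts = ["hello", "world"]
embeddings = batch_get_embeddings(texts, batch_size=16)

def cosine(a, b):
    # plain cosine similarity between two equal-length float vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine(embeddings[0], embeddings[1]))  # higher values mean more similar texts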
Build a text embedding index
from reasoningchain.index.doc_index import DocIndex
doc_index = DocIndex()
doc_index.build(doc_full_text)     # build the index from the full document text
doc_index.save(index_file_path)    # save the index to a file
doc_index.load(index_file_path)    # load the index from a file
results = doc_index.search(query)  # query the index
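A small end-to-end sketch based on the calls above. The document text, index path, the use of a fresh instance for loading, and the structure of the returned results are illustrative assumptions:
import os
from reasoningchain.index.doc_index import DocIndex

# Illustrative document and path
doc_full_text = (
    "ReasoningChain wraps langchain tools behind a simple run() API. "
    "It also provides a disk cache decorator and text-embedding helpers."
)
index_file_path = os.path.join(os.environ["HOME"], "some_path/demo.index")

doc_index = DocIndex()
doc_index.build(doc_full_text)     # embed and index the document
doc_index.save(index_file_path)    # persist the index

fresh_index = DocIndex()
fresh_index.load(index_file_path)  # reload it later, e.g. in another process
results = fresh_index.search("What does ReasoningChain provide?")
print(results)                     # result format is not documented here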
Custom langchain Tools
from reasoningchain.custom_tools import custom_tool
from reasoningchain.custom_tools import get_all_tool_names
from reasoningchain.custom_tools import get_all_custom_tool_names
from reasoningchain.custom_tools import load_tools
# Register a custom tool
@custom_tool(
    name="{{Tool Name}}",
    description="{{Tool Description}}"
)
def tool_func(input_text: str, callback: callable = None) -> str:
    """do something"""
    pass
# Get the names of all custom tools
all_custom_tool_names = get_all_custom_tool_names()
# Get all tool names (both custom tools and the tools predefined in langchain)
all_tool_names = get_all_tool_names()
# Load tools by name
tools = load_tools(["BaiduSearchText", "GoogleSearchImage", "wikipedia"])
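As a concrete illustration of the decorator, here is a toy tool. The tool name and body are made up, and the assumption that a custom tool becomes loadable by its registered name through load_tools is inferred from the snippet above rather than documented behaviour:
from reasoningchain.custom_tools import custom_tool, load_tools

@custom_tool(
    name="SimpleCalculator",  # hypothetical tool name
    description="Evaluates a basic arithmetic expression such as '2 + 3 * 4' and returns the result as text."
)
def simple_calculator(input_text: str, callback: callable = None) -> str:
    try:
        # eval() is acceptable only because this is a toy example
        return str(eval(input_text, {"__builtins__": {}}, {}))
    except Exception as e:
        return f"calculation failed: {e}"

# Assumption: the registered name can now be passed to load_tools
tools = load_tools(["SimpleCalculator"])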
Run a chain
- Call it from code:
import reasoningchain
final_answer = reasoningchain.run("Tell me about Xiaodu", tool_names=["BaiduSearchText"])
print(f"Final Answer: {final_answer}")
- Command line:
# Single query
reasoningchain --tools "BaiduSearchText" --query "Who is Elon Musk?"
# Batch processing (a Python equivalent is sketched after this list)
cat queries.txt | reasoningchain --tools "BaiduSearchText"
- Start the WebUI service:
reasoningchainui --port 8502
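The batch mode can also be reproduced in code by looping over reasoningchain.run; the query file name and output format below are illustrative:
import reasoningchain

# One query per line, mirroring: cat queries.txt | reasoningchain --tools "BaiduSearchText"
with open("queries.txt", encoding="utf-8") as f:
    queries = [line.strip() for line in f if line.strip()]

for query in queries:
    answer = reasoningchain.run(query, tool_names=["BaiduSearchText"])
    print(f"{query}\t{answer}")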
Configuration
# Parameters are configured through environment variables
import os

# Required when using BaiduSearchText
os.environ['BAIDU_SEARCH_API'] = 'https://m.baidu.com/...'
# Required when using OpenAI-related APIs
os.environ['OPENAI_API_KEY'] = '123'
# Optional: set when OpenAI requests need to go through a proxy
os.environ['OPENAI_API_BASE'] = 'proxy address'
# Required when using serpapi (including Google's search APIs)
os.environ['SERPAPI_API_KEY'] = 'SERP API-KEY'
# Required when using the Wolfram Alpha tool
os.environ['WOLFRAM_ALPHA_APPID'] = 'wolfram-alpha appid'
# Required when using DuLLM
os.environ['DU_LLM_API'] = 'http://...'  # API of an internal custom LLM
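Putting configuration and execution together, a minimal sketch. The key values are placeholders, and it is assumed the variables only need to be set before the first call into reasoningchain:
import os

# Placeholder credentials; replace with real values
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["BAIDU_SEARCH_API"] = "https://m.baidu.com/..."

import reasoningchain

answer = reasoningchain.run("Tell me about Xiaodu", tool_names=["BaiduSearchText"])
print(answer)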
Source Distribution
- File: reasoningchain-0.0.24.tar.gz
- Size: 23.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.16
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4514b9adde5624b2bc585bc2e394e6074ab61a3c1a045996666e727cc1e42ea1
MD5 | 5d7a9c33c1503514a73e95a79b27d02d
BLAKE2b-256 | b8e9ac60a6a9e04e04294a2f0f6beee23fa400d470a74cbda26a6ac0d49e8458