Kubeagi Core
Kubeagi Core provides a set of common functions. These functions are designed to be as modular and simple as possible.
Quick Start
There are several ways to use the kubeagi_core library:
- Install the library
Installing the library
Use the following instructions to get up and running with kubeagi_core and test your installation.
- Install the Python SDK
pip install kubeagi_core
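To verify that the package landed in the active environment, you can try importing it before moving on (a quick sanity check, not an official step from the docs):

import kubeagi_core  # should succeed without ImportError if the install worked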
At this point, you should be able to run the following code:
from kubeagi_core.qa_provider.openai import QAProviderOpenAI

qa_provider = QAProviderOpenAI(
    api_key="fake",
    base_url="http://fastchat-api.172.22.95.167.nip.io/v1",
    model="f8e35823-3841-4253-ae79-0fff47917ae3",
)
data = qa_provider.generate_qa_list(
    text="A large language model (LLM) is a deep learning model trained on large amounts of text data; it can generate natural-language text or understand the meaning of text. Large language models can handle a variety of natural-language tasks such as text classification, question answering, and dialogue, and are an important path toward artificial intelligence."
)
print(data)
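The exact structure of the returned data is not documented here; as a rough sketch, assuming generate_qa_list returns an iterable of question/answer pairs, you could post-process the result like this (the pair layout is an assumption, adjust it to what the provider actually returns):

for question, answer in data:
    # Assumes each item unpacks into a question string and an answer string.
    print(f"Q: {question}")
    print(f"A: {answer}")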