A package for using LLM models via an API
Project description
llmplus
Usage
```python
from llmplus.ChatGot.api import API, Models

authorization_key = 'your authorization key from website headers'
api = API(authorization_key)
api.model = Models.GPT4()

answer = api.send('who is the manager of the real madrid?')
answer2 = api.send('who is the CEO?')
```
Models
- Models.GPT4
- Models.GPT3
- Models.Claude2
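The active model can be swapped by reassigning `api.model`. A minimal sketch, reusing the setup from the Usage section and assuming the model can be reassigned on an existing `API` instance between calls:

```python
from llmplus.ChatGot.api import API, Models

api = API('your authorization key from website headers')

# Ask the same question with two different backends
# (assumption: api.model can be reassigned between calls).
api.model = Models.GPT4()
gpt4_answer = api.send('Summarize the plot of Hamlet in one sentence.')

api.model = Models.Claude2()
claude2_answer = api.send('Summarize the plot of Hamlet in one sentence.')
```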
Access to the chat history
```python
API.json_data['messages']
```
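A minimal sketch of inspecting the history, assuming it lives on the instance as `api.json_data['messages']` and that each entry follows the common `{'role': ..., 'content': ...}` shape (the exact message format is not documented here):

```python
# Print the conversation so far.
# Assumption: each message is a dict with 'role' and 'content' keys.
for message in api.json_data['messages']:
    print(f"{message.get('role', '?')}: {message.get('content', '')}")
```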
Extracting a code block: set `extract_code` to the type of code block to return just that code block.
```python
answer2 = api.send('who is the CEO?', extract_code="json")
```
'''
LLM output:
your output is here
```json
{"a": "ssdgs"}
```

Output with extract_code: {"a": "ssdgs"}
'''
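Since `extract_code="json"` returns only the fenced JSON text, the result can be fed straight into the standard library parser. A sketch, assuming `send()` returns the extracted block as a plain string (the prompt below is only illustrative):

```python
import json

# Assumption: send() returns the extracted code block as a plain string.
raw = api.send('Answer as a JSON object with a single key "name".', extract_code="json")
data = json.loads(raw)      # parse the extracted JSON block
print(data.get("name"))
```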
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
llmplus-0.0.8.tar.gz (3.7 kB)
Built Distribution
llmplus-0.0.8-py3-none-any.whl (4.3 kB)
File details
Details for the file llmplus-0.0.8.tar.gz
File metadata
- Download URL: llmplus-0.0.8.tar.gz
- Upload date:
- Size: 3.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | a777eedc8f7b42879c128fe9acfb9dcf53c0532c421bb78489f3ecdf4d212953
MD5 | 94bcd90aea54d69822f7a47a44a8a70e
BLAKE2b-256 | 063b8aa6d0a09e25aaafa717e8492a35abad50b77e16cdd08f46f19d0d8d6385
File details
Details for the file llmplus-0.0.8-py3-none-any.whl
File metadata
- Download URL: llmplus-0.0.8-py3-none-any.whl
- Upload date:
- Size: 4.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | d97799a7010e7873b57752ec4cda41a0c5062b81d2bf71cee9b186b5fc91a806
MD5 | 7749b2628e8954075c7c80f54370d920
BLAKE2b-256 | abcb8157c9cdffd62a06318ec7a2a6f448cecb4fa123d3143617555c8c329537