LLM JSON Adapter
What is it?
When calling LLMs from an application, you often want the output in JSON. OpenAI's GPT API has a mechanism called Function Calling that can return JSON, but Google's Gemini does not seem to offer equivalent functionality.
Therefore, I created a wrapper library that lets you switch between LLMs and get results in JSON. This library can do the following:
- Allows you to define the results you want to get in JSON Schema
- Switch between LLMs (currently supports OpenAI's GPT, Google's Gemini, Ollama, and Bedrock for Llama and Anthropic Claude)
- Retry a specified number of times if the JSON retrieval fails
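The retry behaviour can be sketched generically as follows. This is a minimal illustration of the idea only, not the library's actual implementation; `generate_with_retry` and the stand-in LLM are hypothetical names for this sketch:

```python
import json

def generate_with_retry(call_llm, max_retry_count):
    """Call an LLM and retry until the reply parses as JSON."""
    last_error = None
    for _ in range(max_retry_count):
        reply = call_llm()
        try:
            return json.loads(reply)  # success: return the parsed JSON
        except json.JSONDecodeError as error:
            last_error = error  # malformed output: try again
    raise last_error  # all retries exhausted

# Stand-in for an LLM that fails once before returning valid JSON.
replies = iter(['not json', '{"data": [{"title": "t", "description": "d"}]}'])
result = generate_with_retry(lambda: next(replies), max_retry_count=3)
print(result["data"][0]["title"])  # -> t
```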
How to use
Use the following code to get results in JSON.

Parameter | Description |
---|---|
provider_name | The name of the LLM provider to use. Currently "openai", "google", "ollama", and "bedrock" are supported. |
max_retry_count | The number of times to retry if JSON retrieval fails. |
attributes | The attributes to pass to the LLM provider. |
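Conceptually, switching providers amounts to dispatching on provider_name. The sketch below illustrates that idea with dummy provider factories; it is not the library's actual code, and the names in it are made up for illustration:

```python
# Hypothetical sketch: map provider names to factories that consume attributes.
PROVIDERS = {
    "openai": lambda attributes: f"OpenAI(model={attributes.get('model', 'gpt-3.5-turbo')})",
    "google": lambda attributes: f"Google(model={attributes.get('model', 'gemini-1.5-pro-latest')})",
}

def make_provider(provider_name, attributes):
    """Look up the provider by name and build it from the given attributes."""
    try:
        factory = PROVIDERS[provider_name]
    except KeyError:
        raise ValueError(f"Unsupported provider: {provider_name}") from None
    return factory(attributes)

print(make_provider("openai", {"model": "gpt-4"}))  # -> OpenAI(model=gpt-4)
```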
OpenAI
Libraries
You need to install openai.
Attributes
Parameter | Description |
---|---|
api_key | The API key to use. |
model | Model name. Default: gpt-3.5-turbo |
temperature | Default: 0.67 |
presence_penalty | Default: 0 |
frequency_penalty | Default: 0 |
Example
```python
from llm_json_adapter import LLMJsonAdapter, Response

adapter = LLMJsonAdapter(provider_name="openai", max_retry_count=3, attributes={
    "api_key": "Your API Key",
    "model": "gpt-3.5-turbo",
})
result = adapter.generate(
    prompt="prompt",
    language="en",
    act_as="Professional Software Service Business Analyst",
    function=Response(
        name="response name",
        description="response description",
        parameters={
            "type": "object",
            "properties": {
                "data": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "title": {"type": "string"},
                            "description": {"type": "string"},
                        },
                        "required": ["title", "description"],
                    },
                },
            },
            "required": ["data"],
        },
    ),
)
```
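For reference, a result conforming to the schema above would have the following shape. The check below is a hand-rolled sketch covering only the schema's required keys, not the library's own validation:

```python
# A payload that matches the JSON Schema used in the example above.
result = {
    "data": [
        {"title": "Example title", "description": "Example description"},
    ],
}

# Minimal conformance check against the schema's "required" lists.
schema_required = ["data"]
item_required = ["title", "description"]
assert all(key in result for key in schema_required)
assert all(key in item for item in result["data"] for key in item_required)
print("conforms")  # -> conforms
```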
Gemini
Libraries
You need to install google-generativeai.
Attributes
Parameter | Description |
---|---|
api_key | The API key to use. |
model | Model name. Default: gemini-1.5-pro-latest |
Example
```python
from llm_json_adapter import LLMJsonAdapter, Response

adapter = LLMJsonAdapter(provider_name="google", max_retry_count=3, attributes={
    "api_key": "Your API Key",
    "model": "gemini-1.5-pro-latest",
})
result = adapter.generate(
    prompt="prompt",
    language="en",
    act_as="Professional Software Service Business Analyst",
    function=Response(
        name="response name",
        description="response description",
        parameters={
            "type": "object",
            "properties": {
                "data": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "title": {"type": "string"},
                            "description": {"type": "string"},
                        },
                        "required": ["title", "description"],
                    },
                },
            },
            "required": ["data"],
        },
    ),
)
```
Ollama
You need to prepare an Ollama server ( https://ollama.com/ ).
Libraries
You need to install ollama.
Attributes
Parameter | Description |
---|---|
url | The URL of the Ollama server, e.g. http://localhost:11434. |
model | Model name. Default: llama3 |
Example
```python
from llm_json_adapter import LLMJsonAdapter, Response

adapter = LLMJsonAdapter(provider_name="ollama", max_retry_count=3, attributes={
    "url": "http://localhost:11434",
    "model": "llama3",
})
result = adapter.generate(
    prompt="prompt",
    language="en",
    act_as="Professional Software Service Business Analyst",
    function=Response(
        name="response name",
        description="response description",
        parameters={
            "type": "object",
            "properties": {
                "data": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "title": {"type": "string"},
                            "description": {"type": "string"},
                        },
                        "required": ["title", "description"],
                    },
                },
            },
            "required": ["data"],
        },
    ),
)
```
Bedrock
You need to set up AWS Bedrock ( https://aws.amazon.com/bedrock/ ).
Libraries
You need to install boto3.
Attributes
Parameter | Description |
---|---|
access_key_id | The access key id to use. |
secret_access_key | The secret access key to use. |
region | Region. Default: us-east-1 |
model | Default: anthropic.claude-3-haiku-20240307-v1:0 |
max_tokens | Default: 1024 |
Example
```python
from llm_json_adapter import LLMJsonAdapter, Response

adapter = LLMJsonAdapter(provider_name="bedrock", max_retry_count=3, attributes={
    "access_key_id": "<YOUR AWS ACCESS KEY>",
    "secret_access_key": "<YOUR AWS SECRET ACCESS KEY>",
    "region": "us-east-1",
    "model": "anthropic.claude-3-haiku-20240307-v1:0",
    "max_tokens": 1024,
})
result = adapter.generate(
    prompt="prompt",
    language="en",
    act_as="Professional Software Service Business Analyst",
    function=Response(
        name="response name",
        description="response description",
        parameters={
            "type": "object",
            "properties": {
                "data": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "title": {"type": "string"},
                            "description": {"type": "string"},
                        },
                        "required": ["title", "description"],
                    },
                },
            },
            "required": ["data"],
        },
    ),
)
```
Download files
Source Distribution
Built Distribution
File details
Details for the file llm_json_adapter-0.2.0.tar.gz.
File metadata
- Download URL: llm_json_adapter-0.2.0.tar.gz
- Upload date:
- Size: 10.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.8
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | ac0f6ad704757ee43f332b20b0be57982a4b0df5aaba7adb8411d485c6b4d32d |
MD5 | d9ce6aa066fac3e9341296ff0874f9f2 |
BLAKE2b-256 | ff0236fa27c92ba14ef98b13bd60464f837fec9219bb2647185b48553494134b |
File details
Details for the file llm_json_adapter-0.2.0-py3-none-any.whl.
File metadata
- Download URL: llm_json_adapter-0.2.0-py3-none-any.whl
- Upload date:
- Size: 14.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.10.8
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | bb796e20ad125d7fa25c29bbdf9eb5ee08d9f570407f1e7df8e1149bbcb24253 |
MD5 | 0240454ad358aa3f2358934841f806c4 |
BLAKE2b-256 | 7dc525a3fc047c6d2c77713161d507b5221859e30730f69d406c835e1082f184 |