cmem-plugin-llm
Interact with Large Language Models.
This is a plugin for eccenca Corporate Memory. You can install it with the cmemc command line client like this:
cmemc admin workspace python install cmem-plugin-llm
Create Embedding
This plugin contains the following capability:

- create embeddings: creates embeddings from an arbitrary data point or from specified data point paths (properties/columns). After being processed, each data point receives two additional paths, _embedding and _embedding_source:
  - The _embedding path contains the generated embedding.
  - The _embedding_source path contains the paths used to generate the embedding, in text form.
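The transformation above can be sketched as follows. This is an illustrative sketch, not the plugin's actual code: the data point layout (a flat dict) and the embed() stand-in for the endpoint call are assumptions.

```python
def process_data_point(data_point, embedding_paths, embed):
    """Return a copy of the data point with the two additional paths.

    embed() stands in for the call to the embedding endpoint.
    """
    # Concatenate the selected paths into one source text (assumed layout).
    source_text = " ".join(str(data_point[p]) for p in embedding_paths)
    return {
        **data_point,
        "_embedding_source": source_text,
        "_embedding": embed(source_text),
    }

point = {"label": "Berlin", "description": "Capital of Germany"}
result = process_data_point(point, ["label", "description"], lambda t: [0.1, 0.2])
```

The original paths survive unchanged; only the two output paths are added.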
Parameters
- url: OpenAI-compatible endpoint, default https://api.openai.com/v1
- model: embedding model, default text-embedding-3-small
- api_key: API key of the endpoint, default blank
- timout_single_request: the request timeout in milliseconds, default 10000
- entries_processing_buffer: number of entries processed per request, default 1000
- embedding_paths: the paths to use for embedding generation, default all
- embedding_output_text: output path that will contain the embedding source text, default _embedding_source
- embedding_output_path: output path that will contain the generated embedding, default _embedding
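Given these defaults, the plugin presumably batches entries and posts them to the endpoint's /embeddings route. The following sketch only constructs the batches and the request body in the standard OpenAI embeddings API shape; the batching-by-buffer behavior is an assumption based on the entries_processing_buffer parameter.

```python
# Defaults taken from the parameter list above.
URL = "https://api.openai.com/v1"
MODEL = "text-embedding-3-small"
BUFFER = 1000

def build_batches(texts, buffer_size=BUFFER):
    """Split the entries' source texts into request-sized batches."""
    return [texts[i:i + buffer_size] for i in range(0, len(texts), buffer_size)]

def build_request(batch, model=MODEL):
    """JSON body for one POST to f"{URL}/embeddings" (OpenAI embeddings API shape)."""
    return {"model": model, "input": batch}

batches = build_batches([f"entry {i}" for i in range(2500)])
request_body = build_request(batches[-1])
```

With 2,500 entries and the default buffer of 1,000, this yields three requests: two full batches and one of 500.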
Execute Instruction
This plugin contains the following capability:

- execute instruction: executes an LLM instruction over a given list of entities. After being processed, each entity receives one additional path, _instruction_output:
  - The _instruction_output path contains the output of the executed instruction over the entity.
Parameters
- url: OpenAI-compatible endpoint, default https://api.openai.com/v1
- model: LLM model, default gpt-o
- api_key: API key of the endpoint, default blank
- timout_single_request: the request timeout in milliseconds, default 10000
- instruction_template: the instruction template, default: Write a paragraph about this entity: ${entity}
- prompt_template: the prompt template, default: [{ "role": "developer", "content": "You are a helpful assistant." }, { "role": "user", "content": "${instruct}" }]
- instruct_output_path: output path that will contain the output of the executed instruction, default _instruction_output
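The two templates compose: the entity is substituted into the instruction template, and the result is substituted into the prompt template to form the chat messages. A minimal sketch of that composition, assuming plain ${...} substitution (string.Template here; the plugin's exact substitution mechanism is not specified):

```python
import json
from string import Template

# Default templates from the parameter list above.
INSTRUCTION_TEMPLATE = "Write a paragraph about this entity: ${entity}"
PROMPT_TEMPLATE = (
    '[{ "role": "developer", "content": "You are a helpful assistant." },'
    ' { "role": "user", "content": "${instruct}" }]'
)

def build_messages(entity_text):
    """Fill both templates and parse the result into a chat messages list."""
    instruct = Template(INSTRUCTION_TEMPLATE).substitute(entity=entity_text)
    return json.loads(Template(PROMPT_TEMPLATE).substitute(instruct=instruct))

messages = build_messages("Berlin (city in Germany)")
```

Note that with this naive substitution, entity text containing double quotes would break the JSON parse; a real implementation would need to escape the substituted values.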
File details
Details for the file cmem_plugin_llm-0.6.0.tar.gz.
File metadata
- Download URL: cmem_plugin_llm-0.6.0.tar.gz
- Upload date:
- Size: 15.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.0.1 CPython/3.12.9 Darwin/24.3.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9040e63d4974260f5422777b2564fe36f4c3aff3ff309f0154c2d98315ea0af9 |
| MD5 | 64e95b8db64f261d6f136a330dfbfb77 |
| BLAKE2b-256 | 759b044fed67b9dcf87480e2ea3a3cbf7e548a5f394df67f77825f793ae72e31 |
File details
Details for the file cmem_plugin_llm-0.6.0-py3-none-any.whl.
File metadata
- Download URL: cmem_plugin_llm-0.6.0-py3-none-any.whl
- Upload date:
- Size: 17.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.0.1 CPython/3.12.9 Darwin/24.3.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b79b42b8ef9cc01692f4f252a71784e5903efe886801802244b4ec6c69f03cbf |
| MD5 | c0929968adf15f3a878fa48ab462b85f |
| BLAKE2b-256 | 57496acaa35dc08722f2154a02b36da3255fd8352c47960cdd1c833581ed165e |