
LLM API and prompt

Project description

Introduction

Install package

pip install llm_cat

Example Usage:

# deepseek
from llm_cat import deepseek_chat  # assuming a top-level export

message = "What is the capital of France?"
token = "sk-xxxxxxxxxxxxx"  # your DeepSeek API key
result = deepseek_chat(message, token)
print(result)


# ollama
from llm_cat import ollama_chat  # assuming a top-level export

message = "What is the capital of France?"
response = ollama_chat(message, model='llama3.1', url='http://localhost:11434/api/chat')
print(response)

Template

from jinja2 import Template

prompt_template = "Hello {{ name }}!"
template = Template(prompt_template)
print(template.render(name="John Doe"))
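Beyond simple substitution, Jinja2 templates can assemble multi-part prompts with loops. A minimal sketch of a few-shot prompt template (the `examples` and `question` variables are illustrative, not part of llm_cat):

```python
from jinja2 import Template

# Few-shot prompt: iterate over example Q/A pairs, then append the real question
few_shot = Template(
    "{% for ex in examples %}Q: {{ ex.q }}\nA: {{ ex.a }}\n{% endfor %}"
    "Q: {{ question }}\nA:"
)

prompt = few_shot.render(
    examples=[{"q": "2 + 2?", "a": "4"}],
    question="What is the capital of France?",
)
print(prompt)
```

The rendered string can then be passed as the `message` argument to `deepseek_chat` or `ollama_chat`.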

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_cat-0.0.1.tar.gz (3.9 kB)

Uploaded Source

File details

Details for the file llm_cat-0.0.1.tar.gz.

File metadata

  • Download URL: llm_cat-0.0.1.tar.gz
  • Upload date:
  • Size: 3.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.11

File hashes

Hashes for llm_cat-0.0.1.tar.gz
Algorithm Hash digest
SHA256 bb6c7d47873192be54fea7bef33bd209bfab726313dc0119755eea817042f1da
MD5 f223c0739aef0a2184fe2a94fb141557
BLAKE2b-256 6009e63f51ced7b9fe6a18e3435eb1641166902347ba43246e696067fe9674ab

See the Python Packaging documentation for more details on using hashes.
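The hashes above can be checked against a downloaded file before installing. A sketch using only the standard library (the file path is whatever you downloaded the sdist as):

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 listed above:
# sha256_of_file("llm_cat-0.0.1.tar.gz") == "bb6c7d47..."
```

pip can also enforce this automatically via its hash-checking mode (`pip install --require-hashes -r requirements.txt`).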
