
Serapeum Core

Serapeum Core is the provider-agnostic foundation for the Serapeum ecosystem. It defines the core LLM abstractions, prompt templates, structured-output parsers, and tool-execution utilities used by integration packages (OpenAI, Ollama, etc.).

If you are implementing a new LLM backend or want a consistent set of models and helpers for prompts, output parsing, and tool calling, this is the package to use.

Highlights

  • Unified LLM interfaces for chat/completions, streaming, and async variants.
  • Prompt templates for string and chat workflows with variable mapping utilities.
  • Structured output parsing with Pydantic models and JSON schema hints.
  • Tool metadata, schema generation, and safe tool execution utilities.
  • Orchestrators for tool calling and structured outputs.

Package layout

  • serapeum.core.base.llms: Base interfaces and data models (messages, chunks, responses, metadata).
  • serapeum.core.llms: High-level LLM class with predict/stream helpers and structured output utilities.
  • serapeum.core.prompts: Prompt templates for text and chat models.
  • serapeum.core.output_parsers: Parsers for structured outputs (e.g., Pydantic).
  • serapeum.core.tools: Tool metadata, tool interfaces, and tool execution.
  • serapeum.core.llms.orchestrators: Higher-level programs for structured outputs and function-calling orchestration.
  • serapeum.core.utils: JSON/schema helpers and async utilities.
  • serapeum.core.configs: Global configuration singleton (Configs).
  • serapeum.core.chat: Agent response container with streaming helpers.

Installation

From the repo:

cd serapeum-core
python -m pip install -e .

Python 3.11+ is required.

Quick start

1) Build a minimal LLM implementation

from serapeum.core.llms import LLM, CompletionResponse, Metadata
from serapeum.core.prompts import PromptTemplate


class EchoLLM(LLM):
    metadata = Metadata.model_construct(is_chat_model=False)

    def chat(self, messages, stream=False, **kwargs):
        raise NotImplementedError()

    async def achat(self, messages, stream=False, **kwargs):
        raise NotImplementedError()

    def complete(self, prompt, formatted=False, stream=False, **kwargs):
        if stream:
            raise NotImplementedError()
        return CompletionResponse(text=prompt, delta=prompt)

    async def acomplete(self, prompt, formatted=False, stream=False, **kwargs):
        if stream:
            raise NotImplementedError()
        return CompletionResponse(text=prompt, delta=prompt)


llm = EchoLLM()
prompt = PromptTemplate("Hello, {name}!")
result = llm.predict(prompt, name="Serapeum")
print(result)

2) Parse structured outputs with Pydantic

from pydantic import BaseModel
from serapeum.core.output_parsers import PydanticParser
from serapeum.core.prompts import PromptTemplate


class Greeting(BaseModel):
    message: str


parser = PydanticParser(output_cls=Greeting)
prompt = PromptTemplate(
    'Return JSON like {"message": "<text>"}. Text: {text}',
    output_parser=parser,
)

# With a real LLM backend, this returns a validated Greeting model.
# result = llm.predict(prompt, text="Hello")
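Conceptually, the parser validates the model's raw JSON reply against the declared fields. A stdlib-only sketch of that round trip (the parse_greeting helper is illustrative, not a Serapeum API):

```python
import json
from dataclasses import dataclass


@dataclass
class Greeting:
    message: str


def parse_greeting(raw: str) -> Greeting:
    # Validate the LLM's JSON reply against the expected fields.
    data = json.loads(raw)
    return Greeting(message=data["message"])


print(parse_greeting('{"message": "Hello"}'))  # Greeting(message='Hello')
```

PydanticParser does the same job with full Pydantic validation, plus schema hints injected into the prompt.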

3) Define tools and execute them safely

from serapeum.core.tools import BaseTool, ToolMetadata, ToolOutput, ToolExecutor


class EchoTool(BaseTool):
    @property
    def metadata(self) -> ToolMetadata:
        return ToolMetadata(description="Echo input", name="echo")

    def __call__(self, input_values: dict) -> ToolOutput:
        return ToolOutput(tool_name="echo", content=input_values.get("input", ""))


tool = EchoTool()
executor = ToolExecutor()
output = executor.execute(tool, {"input": "hi"})
print(output.content)

Core concepts

LLM interface and orchestration

  • BaseLLM defines the provider contract for chat/completion endpoints, streaming, and async variants.
  • LLM builds on BaseLLM with helpers for prompt formatting, prediction, and structured output modes.
  • FunctionCallingLLM extends LLM with tool-calling helper methods.
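The shape of this contract can be pictured with a stdlib-only analogue: one interface, with blocking, streaming, and async variants of the same call (class and method names here are illustrative, not Serapeum APIs):

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Iterator


class MiniLLM(ABC):
    """Simplified analogue of a sync/streaming/async provider contract."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...

    @abstractmethod
    def stream_complete(self, prompt: str) -> Iterator[str]: ...

    @abstractmethod
    async def acomplete(self, prompt: str) -> str: ...


class Upper(MiniLLM):
    def complete(self, prompt: str) -> str:
        return prompt.upper()

    def stream_complete(self, prompt: str) -> Iterator[str]:
        # Yield word-sized deltas, as a streaming endpoint would.
        yield from prompt.upper().split()

    async def acomplete(self, prompt: str) -> str:
        return prompt.upper()


print(Upper().complete("hi there"))                  # HI THERE
print(list(Upper().stream_complete("hi there")))     # ['HI', 'THERE']
print(asyncio.run(Upper().acomplete("hi")))          # HI
```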

Prompts

Use PromptTemplate for string prompts and ChatPromptTemplate for message templates. Both support template variable mappings and optional output parsers.
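String templates follow str.format semantics (as in the quick start's "Hello, {name}!" example). A stdlib sketch of how a template's required variables can be discovered (the template_vars helper is illustrative):

```python
from string import Formatter


def template_vars(template: str) -> set[str]:
    # Collect the named fields a str.format-style template expects.
    return {field for _, field, _, _ in Formatter().parse(template) if field}


tmpl = "Context: {context}\nQuestion: {question}"
print(template_vars(tmpl))                             # {'context', 'question'}
print(tmpl.format(context="docs", question="why?"))
```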

Output parsers

PydanticParser injects a compact JSON schema into prompts and parses LLM output into validated Pydantic models. The output_parsers.utils module also provides helpers for extracting JSON and code blocks from markdown.

Tools and schemas

  • ToolMetadata produces OpenAI-style tool specs and JSON schema for tool inputs.
  • CallableTool (in serapeum.core.tools.callable_tool) can wrap functions or Pydantic models into tool definitions.
  • ToolExecutor runs tools safely with optional auto-unpacking and error normalization.
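An OpenAI-style tool spec is essentially a JSON schema derived from a function's signature and docstring. A minimal stdlib sketch of that derivation, assuming only simple scalar parameter types (tool_spec and PY_TO_JSON are illustrative, not Serapeum APIs):

```python
import inspect

PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}


def tool_spec(fn) -> dict:
    # Build an OpenAI-style function spec from a Python signature.
    params = {
        name: {"type": PY_TO_JSON.get(p.annotation, "string")}
        for name, p in inspect.signature(fn).parameters.items()
    }
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": (fn.__doc__ or "").strip(),
            "parameters": {
                "type": "object",
                "properties": params,
                "required": list(params),
            },
        },
    }


def echo(text: str) -> str:
    """Echo the input back."""
    return text


print(tool_spec(echo)["function"]["name"])  # echo
```

ToolMetadata and CallableTool cover the same ground with richer typing via Pydantic.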

Structured programs

  • TextCompletionLLM runs a prompt + parser + LLM pipeline to return Pydantic outputs for completion-style models.
  • ToolOrchestratingLLM uses function-calling models to produce structured outputs via tools.
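The pipeline such a program wraps is small: format the prompt, call the model, parse the reply into the target type. A self-contained sketch with a stubbed model (run_structured and fake_llm are illustrative):

```python
import json


def run_structured(llm_complete, template: str, parse, **variables) -> dict:
    # Format the prompt, call the model, parse the reply: the pipeline
    # a TextCompletion-style program wraps up.
    prompt = template.format(**variables)
    raw = llm_complete(prompt)
    return parse(raw)


# Stub standing in for a real completion backend.
fake_llm = lambda prompt: '{"message": "Hello, Serapeum"}'

result = run_structured(fake_llm, "Greet {name} in JSON.", json.loads, name="Serapeum")
print(result["message"])  # Hello, Serapeum
```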

Configuration

serapeum.core.configs.Configs is a small global configuration holder. You can set a default LLM instance and control structured output mode:

from serapeum.core.configs import Configs
from serapeum.core.types import StructuredOutputMode


Configs.llm = llm
Configs.structured_output_mode = StructuredOutputMode.OPENAI
