
SilkLoom Core: minimal stateful batch engine for LLM and VLM workloads

Project description

SilkLoom Core

SilkLoom Core v1.0.0 is a minimal, stateful batch engine for LLM and VLM workloads.

The public surface is intentionally small:

  • LLMTask
  • ResultSet
  • TaskResult

This README is split into two parts:

  1. User Guide: installation, input formats, prompt rules, and examples.
  2. API Reference: constructor arguments, method signatures, and returned objects.

Prompt templates use strict Jinja2 syntax. user_prompt and system_prompt are rendered against each input item, so template variables must match the keys in that item's dictionary. A missing variable raises an error instead of rendering as empty text. For a plain string list, SilkLoom wraps each item as {"text": "..."}.

Install

pip install silkloom-core

From source:

git clone https://github.com/LeLiu-GeoAI/silkloom-core.git
cd silkloom-core
pip install -e .

User Guide

Quick Start

from openai import OpenAI
from silkloom_core import LLMTask

client = OpenAI(api_key="your_key")

task = LLMTask(
    model="gpt-4o-mini",
    user_prompt="Translate into English: {{ text }}",
    client=client,
)

results = task.map(["你好", "今天天气不错"])
print(results[0])
print(results.success_count, results.failed_count)

Input Formats

LLMTask.map() accepts three common input shapes:

  • list[str]: each string is wrapped as {"text": ...}
  • list[dict]: each dict becomes one prompt context
  • pandas.DataFrame: each row becomes one prompt context and the column names become template variables
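All three shapes can be thought of as normalizing to a list of dictionaries before rendering; a hedged sketch of that normalization (normalize_inputs is a hypothetical helper, not part of the SilkLoom API):

```python
import pandas as pd

def normalize_inputs(seq):
    """Coerce list[str], list[dict], or a DataFrame into list[dict] contexts."""
    if isinstance(seq, pd.DataFrame):
        return seq.to_dict(orient="records")  # one dict per row
    return [{"text": item} if isinstance(item, str) else dict(item)
            for item in seq]

print(normalize_inputs(["你好"]))  # [{'text': '你好'}]
print(normalize_inputs(pd.DataFrame([{"text": "hi", "lang": "en"}])))
```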

Dictionary list example:

from silkloom_core import LLMTask

task = LLMTask(
    model="gpt-4o-mini",
    user_prompt="Extract name and intent from text: {{ text }}",
)

results = task.map([
    {"text": "My name is Alice. I want a refund."},
    {"text": "Bob asks about delivery."},
])

Pandas DataFrame

Each DataFrame row is treated as one input item, and the column names are available as template variables.

import pandas as pd
from silkloom_core import LLMTask

df = pd.DataFrame(
    [
        {"text": "Urban heat island is intensifying.", "lang": "en"},
        {"text": "城市更新需要兼顾公平。", "lang": "zh"},
    ]
)

task = LLMTask(
    model="gpt-4o-mini",
    user_prompt="Rewrite the following {{ lang }} text: {{ text }}",
)

results = task.map(df)

Prompt Template Rules

Template variables must match the keys in the input context.

task = LLMTask(
    model="gpt-4o-mini",
    user_prompt="Rewrite the following {{ lang }} text: {{ text }}",
)

For a DataFrame, this row exposes text and lang to the template:

{"text": "Urban heat is rising.", "lang": "en"}
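This strict behavior matches Jinja2's StrictUndefined mode; a minimal sketch of the rendering rule (illustrative, not SilkLoom's internals):

```python
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError

# StrictUndefined makes a missing variable raise instead of rendering as "".
env = Environment(undefined=StrictUndefined)
template = env.from_string("Rewrite the following {{ lang }} text: {{ text }}")

row = {"text": "Urban heat is rising.", "lang": "en"}
print(template.render(row))  # keys match the template variables

try:
    template.render({"text": "Urban heat is rising."})  # 'lang' is missing
except UndefinedError as e:
    print("missing variable:", e)
```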

Structured Output

from pydantic import BaseModel
from silkloom_core import LLMTask


class ExtractInfo(BaseModel):
    name: str
    intent: str


task = LLMTask(
    model="gpt-4o-mini",
    user_prompt="Extract name and intent from text: {{ text }}",
    response_model=ExtractInfo,
)

results = task.map([
    {"text": "My name is Alice. I want a refund."},
    {"text": "Bob asks about delivery."},
])

print(results[0].name)
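With response_model set, each successful result is an instance of the Pydantic model, so fields are attribute access rather than raw JSON. Parsing presumably amounts to validating the model's JSON reply; a sketch under that assumption (Pydantic v2 API, not SilkLoom's code):

```python
from pydantic import BaseModel

class ExtractInfo(BaseModel):
    name: str
    intent: str

# A JSON string like one the model might return for the first item:
raw = '{"name": "Alice", "intent": "refund"}'
info = ExtractInfo.model_validate_json(raw)  # raises if fields are invalid
print(info.name, info.intent)  # Alice refund
```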

GLM and Ollama

GLM-4-Flash

import os
from openai import OpenAI
from silkloom_core import LLMTask

glm_client = OpenAI(
    api_key=os.environ["ZHIPUAI_API_KEY"],
    base_url="https://open.bigmodel.cn/api/paas/v4/",
)

task = LLMTask(
    model="glm-4-flash",
    user_prompt="Summarize this text: {{ text }}",
    client=glm_client,
)

results = task.map(["Urban renewal should balance efficiency and equity."])

Ollama

from openai import OpenAI
from silkloom_core import LLMTask

ollama_client = OpenAI(
    api_key="ollama",
    base_url="http://localhost:11434/v1",
)

task = LLMTask(
    model="qwen2.5:7b",
    user_prompt="Rewrite in academic tone: {{ text }}",
    client=ollama_client,
)

results = task.map(["Traffic is usually worst in evening peak."])

Multimodal Input

Pass image sources in images (supports local path, URL, or base64/data URI):

from silkloom_core import LLMTask

task = LLMTask(
    model="gpt-4o",
    user_prompt="Describe these images and answer: {{ text }}",
)

results = task.map([
    {
        "text": "What is shown?",
        "images": ["./pic1.jpg", "https://example.com/pic2.png"],
    }
])
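A local path is presumably read and embedded as a base64 data URI before the request is sent; a sketch of that conversion (to_data_uri is a hypothetical helper, not SilkLoom's API):

```python
import base64
import mimetypes

def to_data_uri(path: str) -> str:
    """Encode a local image file as a base64 data URI."""
    mime = mimetypes.guess_type(path)[0] or "application/octet-stream"
    with open(path, "rb") as f:
        payload = base64.b64encode(f.read()).decode("ascii")
    return f"data:{mime};base64,{payload}"
```

URLs and strings that are already data URIs would be passed through unchanged.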

Resumability

map() supports resumable runs backed by a SQLite cache, enabled with db_path plus run_id:

results = task.map(
    [{"text": "a"}, {"text": "b"}],
    db_path="my_run.db",
    run_id="demo_001",
    workers=5,
)

Running again with the same run_id reuses successful records.
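This kind of resumability can be built on a table keyed by (run_id, item index): completed items are looked up before work starts and skipped. A minimal sketch of the pattern (hypothetical schema, not SilkLoom's actual cache layout):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # SilkLoom would use db_path on disk
conn.execute(
    "CREATE TABLE IF NOT EXISTS results ("
    "run_id TEXT, idx INTEGER, output TEXT, PRIMARY KEY (run_id, idx))"
)

def process(run_id, items):
    # Indices already completed for this run_id are reused, not re-sent.
    done = {idx for (idx,) in conn.execute(
        "SELECT idx FROM results WHERE run_id = ?", (run_id,))}
    calls = 0
    for idx, item in enumerate(items):
        if idx in done:
            continue
        output = item.upper()  # stand-in for the real LLM call
        conn.execute("INSERT INTO results VALUES (?, ?, ?)",
                     (run_id, idx, output))
        calls += 1
    conn.commit()
    return calls

print(process("demo_001", ["a", "b"]))  # 2: both items computed
print(process("demo_001", ["a", "b"]))  # 0: both reused from the cache
```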

Exporting Results

ResultSet supports in-memory access and file export:

results.run_id
results.success_count
results.failed_count
results.total_tokens
results.errors
results[0]
results.export_jsonl("out.jsonl")
results.export_csv("out.csv", flatten=True)
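flatten=True presumably expands nested structured-output fields into one column per leaf; a sketch of such flattening with dotted column names (an assumption about the option's meaning, not the library's code):

```python
def flatten_record(record: dict, prefix: str = "") -> dict:
    """Flatten nested dicts into dotted column names, e.g. 'data.name'."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten_record(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

row = {"data": {"name": "Alice", "intent": "refund"},
       "usage": {"total_tokens": 42}}
print(flatten_record(row))
```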

API Reference

LLMTask

Constructor:

LLMTask(
    model: str,
    user_prompt: str,
    system_prompt: str | None = None,
    response_model: type[BaseModel] | None = None,
    max_retries: int = 3,
    client: Any | None = None,
)

Arguments:

  • model: target model name, such as gpt-4o-mini
  • user_prompt: required Jinja2 template for the user message
  • system_prompt: optional Jinja2 template for the system message
  • response_model: optional Pydantic model for structured output parsing
  • max_retries: maximum number of attempts for a single input item
  • client: optional OpenAI-compatible client; defaults to the official client

Method:

map(sequence, db_path=".silkloom_cache.db", run_id=None, workers=5) -> ResultSet

Accepted inputs:

  • list[str]
  • list[dict]
  • pandas.DataFrame

ResultSet

ResultSet behaves like a sequence aligned with the input order.

Properties:

  • run_id
  • success_count
  • failed_count
  • total_tokens
  • errors

Methods:

  • results[i]: index access; the result at position i corresponds to input item i
  • export_jsonl(path): write successful results to JSONL
  • export_csv(path, flatten=False, include_usage=True): write a CSV export

TaskResult

Each raw task result contains:

  • is_success
  • data
  • error
  • usage
  • input_data

License

MIT
