The official library for the .txt platform

dottxt Python Library

The .txt Python library provides access to the .txt REST API from Python 3.10+ applications.

Request API access here.

It provides two client surfaces:

  • DotTxt for sync access with DotTxt helpers and OpenAI-compatible namespaces
  • AsyncDotTxt for async access with the same helper semantics

The API uses:

  • base URL: https://api.dottxt.ai/v1
  • auth: Authorization: Bearer $DOTTXT_API_KEY
  • primary endpoints: GET /models and POST /chat/completions
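The endpoint and header shapes above can be sketched with only the standard library. This is an illustrative helper, not part of the dottxt package:

```python
import os
import urllib.request

BASE_URL = "https://api.dottxt.ai/v1"


def list_models_request() -> urllib.request.Request:
    """Build an authenticated GET /models request."""
    api_key = os.environ["DOTTXT_API_KEY"]
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )


# Sending it requires network access and a valid key:
# with urllib.request.urlopen(list_models_request()) as resp:
#     print(resp.read().decode())
```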

Install

pip install dottxt

Configure

export DOTTXT_API_KEY="your-api-key"

All clients in this package read DOTTXT_API_KEY by default.

Optional overrides:

export DOTTXT_BASE_URL="https://api.dottxt.ai/v1"
export DOTTXT_MODEL="<model-id>"
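The clients read these variables automatically, so explicit resolution is only needed for custom wiring. A sketch of how an application might resolve them itself (the helper name is illustrative):

```python
import os


def resolve_config() -> dict:
    """Resolve dottxt settings from the environment, mirroring the defaults above."""
    return {
        "api_key": os.environ["DOTTXT_API_KEY"],
        "base_url": os.environ.get("DOTTXT_BASE_URL", "https://api.dottxt.ai/v1"),
        # None means no default model was set; pick one explicitly per call.
        "model": os.environ.get("DOTTXT_MODEL"),
    }
```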

CLI

Use the dottxt CLI for login, model discovery, and one-off generation.

Client Surfaces

Choose the client that matches the shape you want to work with:

  • DotTxt.generate(...) and AsyncDotTxt.generate(...) accept JSON Schema as a string or object, any typed schema supported by Pydantic TypeAdapter (for example Pydantic models, Enums, Literals, Unions, Optionals, and typed containers), and objects exposing to_json() -> str (for example Genson). Root {"type":"structural-tag", ...} schema objects are accepted without JSON Schema metaschema validation. Pydantic input returns a validated Pydantic model instance; all other schema input types return parsed JSON.
  • DotTxt and AsyncDotTxt also expose OpenAI SDK chat and models namespaces for direct SDK access alongside DotTxt helpers.

For constructor kwargs passed through to the OpenAI SDK client (DotTxt(..., **client_kwargs) / AsyncDotTxt(..., **client_kwargs)), see OpenAI Python base client parameters.

Native DotTxt Client

from typing import Literal

from pydantic import BaseModel, Field

from dottxt import DotTxt


class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)


client = DotTxt()

result = client.generate(
    model="openai/gpt-oss-20b",
    input="Summarize this incident: checkout errors are blocking purchases.",
    response_format=IncidentSummary,
)
print(result)
# Example model output:
# severity='high' team='checkout'
print(result.model_dump())
# Example output:
# {'severity': 'high', 'team': 'checkout'}

models = client.models.list()
print([model.id for model in models.data])
# Example output:
# ['openai/gpt-oss-20b', 'openai/gpt-4.1-mini']

Async Native Client

import asyncio
from typing import Literal

from pydantic import BaseModel, Field

from dottxt import AsyncDotTxt


class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)


async def main() -> None:
    client = AsyncDotTxt()
    result = await client.generate(
        model="openai/gpt-oss-20b",
        input="Summarize this incident: checkout errors are blocking purchases.",
        response_format=IncidentSummary,
    )
    print(result)
    # Example model output:
    # severity='high' team='checkout'
    print(result.model_dump())
    # Example output:
    # {'severity': 'high', 'team': 'checkout'}

    models = await client.models.list()
    print([model.id for model in models.data])
    # Example output:
    # ['openai/gpt-oss-20b', 'openai/gpt-4.1-mini']


asyncio.run(main())

For DotTxt and AsyncDotTxt, generate(...) accepts response_format as:

  • a Pydantic model class
  • a TypedDict type
  • a dataclass type
  • an Enum class
  • a typing.Literal[...] type
  • a typing.Union[...] type
  • a typing.Optional[...] type
  • typed containers such as list[...], dict[...], tuple[...]
  • a JSON string containing JSON Schema
  • a JSON object (dict)
  • an object exposing to_json() -> str that returns JSON Schema

Notes:

  • Raw list instances as response_format are not supported.
  • Root {"type":"structural-tag", ...} schema objects bypass metaschema checks.
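Typed inputs like those above map to JSON Schema via Pydantic's TypeAdapter. A small illustration of that mapping, independent of the dottxt client:

```python
from typing import Literal

from pydantic import TypeAdapter

# A typed container of Literals, as accepted by response_format:
adapter = TypeAdapter(list[Literal["low", "medium", "high"]])
schema = adapter.json_schema()
print(schema)
# e.g. {'items': {'enum': ['low', 'medium', 'high'], 'type': 'string'}, 'type': 'array'}
```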

For direct chat.completions.create(...), pass the wrapped OpenAI-style response_format payload yourself.

Use DotTxt.models.list() and AsyncDotTxt.models.list() for model listing.

OpenAI-Compatible Usage

Use DotTxt when you want an OpenAI-style client surface with chat.completions.create(...) and models.list().

from typing import Literal

from pydantic import BaseModel, Field

from dottxt import DotTxt as OpenAI


class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)


client = OpenAI()

completion = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[
        {
            "role": "user",
            "content": "Summarize this incident: checkout errors are blocking purchases.",
        }
    ],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "incident_summary",
            "schema": IncidentSummary.model_json_schema(),
        },
    },
)
print(completion.choices[0].message.content)
# Example output:
# {"severity":"high","team":"checkout"}

models = client.models.list()
print([model.id for model in models.data])
# Example output:
# ['openai/gpt-oss-20b', 'openai/gpt-4.1-mini']

The compatibility surface expects the wrapped OpenAI-style response_format payload:

  • {"type": "json_schema", "json_schema": {...}}
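For a hand-written schema (no Pydantic involved), the wrapped payload takes the same shape; the schema body below is an illustrative example, not a required structure:

```python
# OpenAI-style wrapped response_format with an inline JSON Schema:
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "incident_summary",
        "schema": {
            "type": "object",
            "properties": {
                "severity": {"type": "string", "enum": ["low", "medium", "high"]},
                "team": {"type": "string", "maxLength": 32},
            },
            "required": ["severity", "team"],
            "additionalProperties": False,
        },
    },
}
```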

Examples

Download files

Download the file for your platform.

Source Distribution

dottxt-0.2.0.tar.gz (106.7 kB)

Uploaded Source

Built Distribution

dottxt-0.2.0-py3-none-any.whl (17.1 kB)

Uploaded Python 3

File details

Details for the file dottxt-0.2.0.tar.gz.

File metadata

  • Download URL: dottxt-0.2.0.tar.gz
  • Size: 106.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for dottxt-0.2.0.tar.gz:

  • SHA256: 455dc4be87c14522dd0aad5f1cdbee8a8e73058cf3303af5470d2fdf27d0c5d1
  • MD5: 7524bd4f4c95b52e82e90cdcf44db861
  • BLAKE2b-256: 7a7ad9bd2e9f45f6b2add5ad80f5fd1769db26c9e226482472f0787726f9ce1c
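A downloaded artifact can be checked against these digests with the standard library; the helper name is illustrative:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Stream a file and return its hex SHA256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()


# sha256_of("dottxt-0.2.0.tar.gz") should match the SHA256 value listed above.
```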

Provenance

The following attestation bundles were made for dottxt-0.2.0.tar.gz:

Publisher: publish.yml on dottxt-ai/dottxt-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file dottxt-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: dottxt-0.2.0-py3-none-any.whl
  • Size: 17.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.13

File hashes

Hashes for dottxt-0.2.0-py3-none-any.whl:

  • SHA256: 7fcad6b536a6c1b92bc6139cd57dca6b3081619a43b9f04547833c01abe0704c
  • MD5: dda26aca2a1855ebba8724515f9f1960
  • BLAKE2b-256: c4256724419766f311cd2f5dae2f9909adc26393cd51e7412ce2655a0650a9e4

Provenance

The following attestation bundles were made for dottxt-0.2.0-py3-none-any.whl:

Publisher: publish.yml on dottxt-ai/dottxt-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
