dottxt Python Library

The official library for the .txt platform.
The .txt Python library provides access to the .txt REST API from Python 3.10+ applications.
It provides two client surfaces:
- DotTxt for sync access with DotTxt helpers and OpenAI-compatible namespaces
- AsyncDotTxt for async access with the same helper semantics
The API uses:
- base URL: https://api.dottxt.ai/v1
- auth: Authorization: Bearer $DOTTXT_API_KEY
- primary endpoints: GET /models and POST /chat/completions
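As a quick sanity check outside any client, a request matching the conventions above can be assembled with only the standard library. The base URL and header layout come from this document; build_models_request is an illustrative helper, not part of this package.

```python
import os
import urllib.request

# Default from the docs; DOTTXT_BASE_URL overrides it.
BASE_URL = os.environ.get("DOTTXT_BASE_URL", "https://api.dottxt.ai/v1")


def build_models_request(api_key: str) -> urllib.request.Request:
    # GET /models, authenticated with the Bearer token the API expects.
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
        method="GET",
    )


req = build_models_request("your-api-key")
print(req.full_url)                     # .../v1/models
print(req.get_header("Authorization"))  # Bearer your-api-key
```

Sending the request (for example with urllib.request.urlopen) would then return the model list as JSON.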
Install
pip install dottxt
Configure
export DOTTXT_API_KEY="your-api-key"
All clients in this package read DOTTXT_API_KEY by default.
Optional overrides:
export DOTTXT_BASE_URL="https://api.dottxt.ai/v1"
export DOTTXT_MODEL="<model-id>"
CLI
Use the dottxt CLI for login, model discovery, and one-off generation.
- CLI reference: docs/cli.md
- Client reference: docs/client.md
Client Surfaces
Choose the client that matches the shape you want to work with:
DotTxt.generate(...) and AsyncDotTxt.generate(...) accept JSON Schema as a string or object, plus any typed schema supported by Pydantic TypeAdapter (for example: Pydantic models, Enums, Literals, Unions, Optionals, and typed containers), and objects exposing to_json() -> str (for example, Genson builders). Root {"type": "structural-tag", ...} schema objects are accepted without JSON Schema metaschema validation. Calls return a validated Pydantic model instance for Pydantic input, or parsed JSON for the other schema input types.

DotTxt and AsyncDotTxt also expose the OpenAI SDK chat and models namespaces for direct SDK access alongside the DotTxt helpers.
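The to_json() -> str duck typing mentioned above means any object that can render a JSON Schema string qualifies. The SchemaHolder class below is a hypothetical stand-in for a Genson-style builder, using only the standard library.

```python
import json


class SchemaHolder:
    """Hypothetical object satisfying the to_json() -> str protocol."""

    def __init__(self, schema: dict):
        self._schema = schema

    def to_json(self) -> str:
        # generate(...) only needs the JSON Schema rendered as a string.
        return json.dumps(self._schema)


holder = SchemaHolder(
    {"type": "object", "properties": {"team": {"type": "string"}}}
)
print(holder.to_json())
# {"type": "object", "properties": {"team": {"type": "string"}}}
```

Under the semantics described above, an instance like this could then be passed as response_format=holder.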
For constructor kwargs passed through to the OpenAI SDK client
(DotTxt(..., **client_kwargs) / AsyncDotTxt(..., **client_kwargs)), see
OpenAI Python base client parameters.
Native DotTxt Client
from typing import Literal
from pydantic import BaseModel, Field
from dottxt import DotTxt
class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)

client = DotTxt()

result = client.generate(
    model="openai/gpt-oss-20b",
    input="Summarize this incident: checkout errors are blocking purchases.",
    response_format=IncidentSummary,
)
print(result)
# Example model output:
# severity='high' team='checkout'
print(result.model_dump())
# Example output:
# {'severity': 'high', 'team': 'checkout'}
models = client.models.list()
print([model.id for model in models.data])
# Example output:
# ['openai/gpt-oss-20b', 'openai/gpt-4.1-mini']
Async Native Client
import asyncio
from typing import Literal
from pydantic import BaseModel, Field
from dottxt import AsyncDotTxt
class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)

async def main() -> None:
    client = AsyncDotTxt()
    result = await client.generate(
        model="openai/gpt-oss-20b",
        input="Summarize this incident: checkout errors are blocking purchases.",
        response_format=IncidentSummary,
    )

    print(result)
    # Example model output:
    # severity='high' team='checkout'

    print(result.model_dump())
    # Example output:
    # {'severity': 'high', 'team': 'checkout'}

    models = await client.models.list()
    print([model.id for model in models.data])
    # Example output:
    # ['openai/gpt-oss-20b', 'openai/gpt-4.1-mini']

asyncio.run(main())
For DotTxt and AsyncDotTxt, generate(...) accepts response_format as:
- a Pydantic model class
- a TypedDict type
- a dataclass type
- an Enum class
- a typing.Literal[...] type
- a typing.Union[...] type
- a typing.Optional[...] type
- typed containers such as list[...], dict[...], tuple[...]
- a JSON string containing JSON Schema
- a JSON object (dict)
- an object exposing to_json() -> str that returns JSON Schema

Notes:
- Raw list instances as response_format are not supported.
- Root {"type": "structural-tag", ...} schema objects bypass metaschema checks.
For direct chat.completions.create(...), pass the wrapped OpenAI-style
response_format payload yourself.
Use DotTxt.models.list() and AsyncDotTxt.models.list() for model listing.
OpenAI-Compatible Usage
Use DotTxt when you want an OpenAI-style client surface with
chat.completions.create(...) and models.list().
from typing import Literal
from pydantic import BaseModel, Field
from dottxt import DotTxt as OpenAI
class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)

client = OpenAI()

completion = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[
        {
            "role": "user",
            "content": "Summarize this incident: checkout errors are blocking purchases.",
        }
    ],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "incident_summary",
            "schema": IncidentSummary.model_json_schema(),
        },
    },
)
print(completion.choices[0].message.content)
# Example output:
# {"severity":"high","team":"checkout"}
models = client.models.list()
print([model.id for model in models.data])
# Example output:
# ['openai/gpt-oss-20b', 'openai/gpt-4.1-mini']
The compatibility surface expects the wrapped OpenAI-style
response_format payload:
{"type": "json_schema", "json_schema": {...}}
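Concretely, that wrapper is just a plain dict and can be built by hand. In the sketch below, wrap_response_format is an illustrative helper (not part of this package), and the schema is written out manually rather than derived from a model class.

```python
import json


def wrap_response_format(name: str, schema: dict) -> dict:
    # Build the OpenAI-style wrapper expected by chat.completions.create(...).
    return {
        "type": "json_schema",
        "json_schema": {"name": name, "schema": schema},
    }


payload = wrap_response_format(
    "incident_summary",
    {
        "type": "object",
        "properties": {
            "severity": {"enum": ["low", "medium", "high"]},
            "team": {"type": "string", "maxLength": 32},
        },
        "required": ["severity", "team"],
    },
)
print(json.dumps(payload, indent=2))
```

Any JSON Schema dict (hand-written, or produced by a tool such as Pydantic's model_json_schema()) can be wrapped the same way.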