SciTeX GenAI — modality-organised generative-AI provider abstraction (LLM, agents, image/audio/video stubs)
Project description
scitex-genai
Modality-organised generative-AI provider abstraction for scientific research.
Full Documentation · pip install scitex-genai
Problem and Solution
| # | Problem | Solution |
|---|---|---|
| 1 | **Per-provider boilerplate** — every project re-writes thin wrappers around `openai`, `anthropic`, `google.genai`, `groq`, etc., each with subtly different cost / streaming / history semantics. | **Unified `GenAI` factory** — the same call shape across OpenAI, Anthropic, Google, Groq, DeepSeek, Perplexity, and Llama. Cost tracking, conversation history, and message formatting are provider-agnostic. |
| 2 | **Modality fragmentation** — generative AI is splintering by modality (text, agents, image, audio, video, embeddings, multimodal); ad-hoc namespaces age badly. | **Modality-organised layout** — `scitex_genai.{llm,agent,image,audio,video,embed,multimodal}` is the public top-level shape from day one. Reserved namespaces import successfully but raise `NotImplementedError` until features land, so import paths never need to migrate. |
| 3 | **Heavy LLM SDKs leak into ML workflows** — pulling in scikit-learn shouldn't pull `openai` and friends, and vice versa. | **Split package** — classical / deep ML lives in `scitex-ml`; `scitex-genai` carries only generative-AI deps. |
| 4 | **Future-proofing for litellm + Ollama** — locking the public API to one provider SDK closes off cheap routing improvements. | **Litellm-ready façade** — the planned `llm` rewrite routes through litellm, giving 100+ providers with one OpenAI-compatible interface (Ollama is just `model="ollama/llama3"`) without changing the `GenAI(...)` call surface. |
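The litellm convention referenced in row 4 encodes the provider inside the model string itself. The sketch below illustrates only that naming convention, not litellm's actual implementation, and the default provider is an assumption for the example:

```python
# Illustrative sketch of litellm-style "provider/model" routing strings.
# This shows the convention only; real litellm handles many more cases.
def split_route(model: str, default_provider: str = "openai") -> tuple[str, str]:
    """Split "ollama/llama3" into ("ollama", "llama3"); bare names get a default."""
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    return default_provider, model

print(split_route("ollama/llama3"))  # ('ollama', 'llama3')
print(split_route("gpt-4o-mini"))    # ('openai', 'gpt-4o-mini')
```

Because the provider rides along in the string, adding a new backend needs no change to the calling code's shape.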
Installation
```bash
pip install scitex-genai          # core (LLM providers)
pip install scitex-genai[agent]   # + claude-agent-sdk (forthcoming `agent` submodule)
pip install scitex-genai[litellm] # + litellm router (preview)
pip install scitex-genai[ollama]  # + local ollama
pip install scitex-genai[all]     # everything available today
```

Through the umbrella: `pip install scitex[genai]`. Requires Python ≥ 3.10.
Quick Start
```python
import scitex_genai

ai = scitex_genai.GenAI(model="gpt-4o-mini")
print(ai("Explain neural networks in one sentence."))
print("cost USD:", ai.cost)

# Switch backends without changing the call shape:
ai = scitex_genai.GenAI(model="claude-sonnet-4-6")
ai("Same call, different provider.")
```
For a runnable walk-through see examples/01_genai.ipynb.
Demo
A runnable provider walk-through (init GenAI, single completion, cost
summary, provider switch) lives in
examples/01_genai.ipynb. Each cell skips
gracefully when the relevant API key is unset.
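The skip-when-unset behaviour described above boils down to a small environment-variable guard. This is a sketch of the pattern; the notebook's actual helper may differ:

```python
import os

def have_key(var: str) -> bool:
    """True when the given API-key environment variable is set and non-empty."""
    return bool(os.environ.get(var))

# Example guard at the top of a demo cell:
if not have_key("OPENAI_API_KEY"):
    print("OPENAI_API_KEY unset; skipping this cell")
```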
```mermaid
flowchart LR
    User[your code] -->|"GenAI(model)"| Factory[scitex_genai.llm.GenAI]
    Factory -->|dispatch| OpenAI[OpenAI]
    Factory --> Anthropic[Anthropic]
    Factory --> Google[Google]
    Factory --> Groq[Groq]
    Factory --> DeepSeek[DeepSeek]
    Factory --> Perplexity[Perplexity]
    Factory --> Llama[Llama]
    OpenAI -.->|tokens / cost| Tracker[BaseGenAI<br/>cost + history]
    Anthropic -.-> Tracker
    Google -.-> Tracker
    Groq -.-> Tracker
    Tracker --> Out["ai(...) · ai.cost · ai.history"]
```
A second example, examples/example_genai.py, runs the same flow as a
script and is wired into tests/examples/test_example_genai.py for CI
smoke coverage.
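The dispatch step in the diagram keys off the model name. The sketch below is a hypothetical illustration of that idea; the prefixes, provider names, and `resolve_provider` helper are examples, not the package's actual routing table:

```python
# Hypothetical model-name -> provider dispatch, mirroring the flowchart.
# Prefixes below are illustrative examples only.
PROVIDER_PREFIXES = {
    "gpt-": "OpenAI",
    "claude-": "Anthropic",
    "gemini-": "Google",
    "deepseek-": "DeepSeek",
    "sonar": "Perplexity",
    "llama": "Llama",
}

def resolve_provider(model: str) -> str:
    """Pick a provider backend from the model string's prefix."""
    for prefix, provider in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return provider
    raise ValueError(f"no provider registered for model {model!r}")

print(resolve_provider("gpt-4o-mini"))        # OpenAI
print(resolve_provider("claude-sonnet-4-6"))  # Anthropic
```

A prefix table like this is what lets `GenAI(model=...)` stay a single entry point while backends multiply.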
Architecture
scitex-genai is organised top-down by modality, not by provider:
```
scitex-python (umbrella)
└── scitex.genai        ── thin sys.modules-aliasing shim
    └── scitex_genai    (this package)
        ├── llm/                      provider factory ``GenAI``
        │   ├── _BaseGenAI                common interface
        │   ├── _OpenAI / _Anthropic / _Google /
        │   │   _Groq / _DeepSeek / _Perplexity / _Llama
        │   ├── _PARAMS                   model catalogue
        │   ├── _calc_cost                token-cost accounting
        │   └── _format_output_func       text/markdown formatting
        ├── agent/       reserved (claude-agent-sdk wrapper planned)
        ├── image/       reserved
        ├── audio/       reserved
        ├── video/       reserved
        ├── embed/       reserved
        └── multimodal/  reserved
```
Reserved modality namespaces import successfully but raise
NotImplementedError on attribute access, so the public import paths
are stable as features land. Provider SDKs (openai, anthropic,
google-genai, groq) are eager core dependencies today; a follow-up
will route llm/ through litellm
to demote them to optional and add Ollama out of the box.
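The import-fine-but-raise-on-use behaviour can be built with a module-level `__getattr__` (PEP 562). This is a self-contained sketch of the pattern, not the package's actual source:

```python
import types

def make_reserved(name: str) -> types.ModuleType:
    """Build a module object that imports fine but rejects attribute access."""
    mod = types.ModuleType(name)

    def __getattr__(attr: str):
        raise NotImplementedError(
            f"{name}.{attr}: this modality is reserved and not implemented yet"
        )

    # PEP 562: a module-level __getattr__ is called for missing attributes.
    mod.__getattr__ = __getattr__
    return mod

image = make_reserved("scitex_genai.image")
# Holding the module is fine; touching any attribute raises.
try:
    image.generate  # hypothetical attribute name, for illustration
except NotImplementedError as exc:
    print(exc)
```

This keeps `from scitex_genai import image` working today while leaving the eventual API surface entirely open.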
Modality layout
| Submodule | Status | Notes |
|---|---|---|
| `scitex_genai.llm` | ✅ implemented | Provider factory `GenAI`. Litellm-backed in a follow-up. |
| `scitex_genai.agent` | 🔒 reserved | Wrapper over `claude-agent-sdk` and friends planned. |
| `scitex_genai.image` | 🔒 reserved | Image generation / editing. |
| `scitex_genai.audio` | 🔒 reserved | TTS / STT / music. |
| `scitex_genai.video` | 🔒 reserved | Video generation. |
| `scitex_genai.embed` | 🔒 reserved | Embeddings. |
| `scitex_genai.multimodal` | 🔒 reserved | Any-to-any unified models. |
4 Interfaces
Python API ⭐⭐⭐ (primary)
```python
from scitex_genai import GenAI

ai = GenAI(model="gpt-4o-mini")
print(ai("..."))
print("cost USD:", ai.cost)
```
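The `ai.cost` figure comes from accumulating token usage against a per-model price table. Here is a hedged sketch of that accounting; the prices and the `calc_cost` name are placeholders, not the package's real rates or internals:

```python
# Placeholder price table: (input, output) USD per 1M tokens. Illustrative only.
PRICES_PER_1M = {
    "gpt-4o-mini": (0.15, 0.60),
}

def calc_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one completion call for a known model."""
    price_in, price_out = PRICES_PER_1M[model]
    return (input_tokens * price_in + output_tokens * price_out) / 1_000_000

# 1,000 prompt tokens + 500 completion tokens at the placeholder rates:
print(calc_cost("gpt-4o-mini", 1000, 500))  # 0.00045
```

Because every provider backend reports tokens into the same accumulator, `ai.cost` stays meaningful across backend switches.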
CLI ⭐ — none
scitex-genai ships no dedicated CLI. Drive completions from Python or use the umbrella scitex CLI.
MCP ⭐ — none
No MCP server in this package today. The umbrella surfaces LLM-related MCP tools separately.
Skills ⭐⭐
Skill index for AI agents lives at src/scitex_genai/_skills/scitex-genai/SKILL.md. Sub-skill llm.md documents the provider factory.
Part of SciTeX
scitex-genai is part of SciTeX. Install via the umbrella with pip install scitex[genai] to use as scitex.genai (Python).
```python
import scitex

scitex.genai.GenAI  # same object as scitex_genai.GenAI
scitex.genai.llm    # same object as scitex_genai.llm
```
scitex.genai delegates to scitex_genai — they share the same API.
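The "same object" guarantee follows from registering one module object under both names in `sys.modules`. A minimal sketch of the mechanism, using a stand-in module since the real shim lives in the umbrella package:

```python
import sys
import types

# Stand-in for the installed scitex_genai package (illustration only).
genai_pkg = types.ModuleType("scitex_genai")
genai_pkg.GenAI = type("GenAI", (), {})  # placeholder class

# The shim registers the identical module object under the aliased name,
# so both import paths resolve to the same objects.
sys.modules["scitex_genai"] = genai_pkg
sys.modules["scitex.genai"] = genai_pkg

assert sys.modules["scitex.genai"] is sys.modules["scitex_genai"]
```

Since `sys.modules` is the import system's cache, later `import scitex.genai` statements hand back the already-registered object rather than re-importing anything.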
The SciTeX system follows the Four Freedoms for Research below, inspired by the Free Software Definition:
Four Freedoms for Research
- The freedom to run your research anywhere — your machine, your terms.
- The freedom to study how every step works — from raw data to final manuscript.
- The freedom to redistribute your workflows, not just your papers.
- The freedom to modify any module and share improvements with the community.
AGPL-3.0 — because we believe research infrastructure deserves the same freedoms as the software it runs on.
File details
Details for the file scitex_genai-0.1.0.tar.gz.
File metadata
- Download URL: scitex_genai-0.1.0.tar.gz
- Upload date:
- Size: 8.6 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `57a49e0b6fe71227637ed76e00d99c6f0c5bdbd022bb3bf8c2a4ae6cd6574cbf` |
| MD5 | `10a9127d0fa656fbc2b3c4074a18e89b` |
| BLAKE2b-256 | `2798d2b7ff0cdc5a17839e6d3de06e8938f641604205bce8be82dd4774dcd4a5` |
Provenance
The following attestation bundles were made for scitex_genai-0.1.0.tar.gz:

Publisher: publish-pypi.yml on ywatanabe1989/scitex-genai

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: scitex_genai-0.1.0.tar.gz
- Subject digest: 57a49e0b6fe71227637ed76e00d99c6f0c5bdbd022bb3bf8c2a4ae6cd6574cbf
- Sigstore transparency entry: 1462426285
- Sigstore integration time:
- Permalink: ywatanabe1989/scitex-genai@167d3cce42220964c3b61008035f0c46cf122141
- Branch / Tag: refs/heads/develop
- Owner: https://github.com/ywatanabe1989
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@167d3cce42220964c3b61008035f0c46cf122141
- Trigger Event: workflow_dispatch
File details
Details for the file scitex_genai-0.1.0-py3-none-any.whl.
File metadata
- Download URL: scitex_genai-0.1.0-py3-none-any.whl
- Upload date:
- Size: 8.3 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `b2c6b080fb6c2e3640b3e52246acbb5244feeb2c56bfb53af20e639b565a5a72` |
| MD5 | `b8e04738d8997977f4578a1934e2898e` |
| BLAKE2b-256 | `8135fe942b86837a4d8c807be6d427e5f63a6b23aac6e103d42916c03ed94986` |
Provenance
The following attestation bundles were made for scitex_genai-0.1.0-py3-none-any.whl:

Publisher: publish-pypi.yml on ywatanabe1989/scitex-genai

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: scitex_genai-0.1.0-py3-none-any.whl
- Subject digest: b2c6b080fb6c2e3640b3e52246acbb5244feeb2c56bfb53af20e639b565a5a72
- Sigstore transparency entry: 1462426322
- Sigstore integration time:
- Permalink: ywatanabe1989/scitex-genai@167d3cce42220964c3b61008035f0c46cf122141
- Branch / Tag: refs/heads/develop
- Owner: https://github.com/ywatanabe1989
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-pypi.yml@167d3cce42220964c3b61008035f0c46cf122141
- Trigger Event: workflow_dispatch