
CLI that turns raw transcripts into study-ready Markdown, DOCX, or LaTeX using OpenAI or Claude.


t2md


Turn raw transcripts into study-ready reading.

t2md is a command-line tool that takes a folder of transcripts (lecture captions, interview notes, Zoom dumps) and runs them through an LLM to produce a clean executive summary plus textbook-style prose. It auto-picks a cheap model for short inputs and a stronger one for long inputs, and it works with either OpenAI or Anthropic.


What it does

Input — raw transcript (examples/mit6_7960_lec01_intro_deep_learning/mit6_7960f24_lec01.txt):

MITOCW | mit6_7960f24_lec01.mp4
[SQUEAKING]
[RUSTLING]

SARA BEERY:
So why are we all here? Deep learning has clearly been exploding in society. Machine learning
generally is something that, when I started studying it about 13 years ago, didn't work, and
now it works. So how many of you here in this room used AI in the last week?
Yeah, almost everybody, probably everybody...

Output — generated Markdown (excerpt of the full file):

# Executive Summary
- Thesis: Deep learning has rapidly evolved to become a transformative technology...
- Key Concepts: neural networks, differential programming, activation functions...
- What to Remember: ReLU is the default activation; transfer learning leverages...

# Reading
## Introduction to Deep Learning
Deep learning has gained significant traction over recent years...

## Historical Perspective on Neural Networks
The journey of neural networks is marked by cycles of enthusiasm and skepticism...

Two pre-generated samples are committed at examples/sample_outputs/ so you can see the output quality before running anything.


Install

pipx install t2md

Or with plain pip:

pip install t2md

For PDF input support:

pipx install "t2md[pdf]"

To install the latest version straight from source (Homebrew shown here only to get pipx on macOS):

brew install pipx && pipx ensurepath
pipx install git+https://github.com/rraj7/t2md.git

Verify:

t2md --help
t2md doctor

Setup

Export at least one API key:

export OPENAI_API_KEY="sk-..."          # or
export ANTHROPIC_API_KEY="sk-ant-..."

Reload your shell (e.g. source ~/.zshrc) and run t2md doctor to confirm the key is picked up.


Usage

# Basic run — auto-selects model by input size, writes Markdown to ./outputs
t2md run examples/mit6_7960_lec01_intro_deep_learning

# Word document output (openable in Word, Google Docs, Pages)
t2md run /path/to/transcripts --format docx

# LaTeX output for PDF-ready workflows
t2md run /path/to/transcripts --format tex

# Use Claude instead of OpenAI
t2md run /path/to/transcripts --provider anthropic

# Override the auto-selected model
t2md run /path/to/transcripts --model gpt-4o

# Use a built-in prompt preset (lecture or interview)
t2md run /path/to/transcripts --preset lecture
t2md run /path/to/transcripts --preset interview

# Custom prompt file — full control over the output style
t2md run /path/to/transcripts --prompt /path/to/prompt_rules.md

# Custom output directory
t2md run /path/to/transcripts --out ~/Documents/t2md_outputs

# Raise the output cap if the generated file looks truncated
t2md run /path/to/transcripts --max-output-tokens 32000
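The --prompt flag takes a plain Markdown file containing the transformation rules. A minimal illustrative example follows; this is a hypothetical file, not a shipped preset, and its headings and rules are only one possible style:

```markdown
# Transformation rules

## Executive Summary
- Open with a one-sentence thesis.
- List 5–10 key concepts with one-line definitions.

## Reading
- Rewrite the transcript as textbook-style prose.
- Use one heading per topic and end with a short synthesis.
- Drop filler words, stage directions, and speaker tags.
```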

Automatic model selection

Without --model, t2md picks the cheapest model that can handle the input:

Input tokens     OpenAI             Anthropic
< 4,000          gpt-4o-mini        claude-haiku-4-5
4,000 – 32,000   gpt-4o             claude-sonnet-4-6
> 32,000         gpt-4o + warning   claude-sonnet-4-6 + warning

Short lectures cost fractions of a cent; longer content automatically gets the stronger model.
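The rule above can be sketched in Python. This is a minimal illustration, not t2md's actual code: the function name pick_model, the hard-coded thresholds, and the warning text are assumptions; only the model names and cutoffs come from the table.

```python
def pick_model(input_tokens: int, provider: str = "openai") -> str:
    """Pick the cheapest model that can handle the input size.

    Mirrors the selection table; names and thresholds are illustrative.
    """
    models = {
        "openai": ("gpt-4o-mini", "gpt-4o"),
        "anthropic": ("claude-haiku-4-5", "claude-sonnet-4-6"),
    }
    cheap, strong = models[provider]
    if input_tokens < 4_000:
        return cheap
    if input_tokens > 32_000:
        # Very long inputs still get the strong model, plus a heads-up.
        print("warning: input exceeds 32,000 tokens; consider splitting it")
    return strong
```

Passing --model presumably bypasses this choice entirely.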


Output

Each run writes one file per folder containing:

  1. Executive Summary — thesis, 5–10 key concepts, examples, what to remember
  2. Structured Reading — textbook-style prose with TOC, headings, and a synthesis

For example, depending on --format:

outputs/
  module_03_All.md
  module_03_All.docx
  module_03_All.tex

Design philosophy

  • Opinionated defaults, flexible overrides — sensible output for zero config, full control when you need it
  • Prompt-first — the transformation rules live in a Markdown file you can edit
  • Clean secret handling — API keys come from environment variables, never from command-line arguments or source code
  • Extensible — provider abstraction (see src/t2md/providers.py) makes it easy to add Ollama, Gemini, etc.
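The provider abstraction can be pictured as a small interface that each backend implements. A sketch under assumptions: the Provider base class, the complete() method, and the toy EchoProvider are hypothetical; the real interface in src/t2md/providers.py may differ.

```python
from abc import ABC, abstractmethod


class Provider(ABC):
    """Minimal provider interface: one method that turns a prompt into text."""

    @abstractmethod
    def complete(self, prompt: str, max_output_tokens: int) -> str:
        """Return the model's completion for the given prompt."""


class EchoProvider(Provider):
    """Toy provider used here only to show the shape of an implementation."""

    def complete(self, prompt: str, max_output_tokens: int) -> str:
        # A real provider would call the OpenAI/Anthropic/Ollama API here.
        return prompt[:max_output_tokens]
```

Adding Ollama or Gemini would then mean writing one more subclass, not touching the rest of the pipeline.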

Roadmap

Supported input formats today: .txt, .md, .srt, .vtt, .pdf, .docx. Planned next:

  • Local Ollama provider (scaffolding already in place)
  • Additional prompt presets (meeting notes, research papers, book chapters)
  • YouTube VTT / auto-caption ingestion
  • CSV / PPTX input

Contributing

Architecture is settled enough to use day-to-day but open enough that contributions can still shape direction. Issues, PRs, and ideas welcome — especially around new input formats and prompt presets.

License

MIT. Example transcripts in examples/ are MIT OCW content licensed under CC BY-NC-SA 4.0 — see examples/README.md for attribution.
