DJ AI Ideation Tools
Project description
djai
Tools for DJ ideation powered by AI/ML. The package ships with a command-line interface that helps you analyse your Spotify likes so you can feed curated metadata into downstream machine-learning workflows.
Command Line Interface
Ensure you have a Spotify access token with the user-library-read scope. You can either pass it as a flag or export it as an environment variable:
export SPOTIFY_API_TOKEN="your-spotify-token"
djai --max-items 100 > liked_tracks.json
Key flags:
- --token: Provide the Spotify token directly (defaults to SPOTIFY_API_TOKEN).
- --limit: Batch size per API call (maximum 50; defaults to 50).
- --max-items: Optional cap on the total number of tracks to fetch.
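For illustration, the flags above could be wired up with argparse roughly as follows. This is a hedged sketch: build_parser is a hypothetical name, and the package's actual CLI code may be organised differently.

```python
# Illustrative argparse setup for the three flags; not the package's real code.
import argparse
import os

def build_parser():
    parser = argparse.ArgumentParser(prog="djai")
    parser.add_argument(
        "--token",
        default=os.environ.get("SPOTIFY_API_TOKEN"),
        help="Spotify access token with the user-library-read scope",
    )
    parser.add_argument(
        "--limit", type=int, default=50,
        help="Batch size per API call (max 50; defaults to 50)",
    )
    parser.add_argument(
        "--max-items", type=int, default=None,
        help="Optional cap on total tracks to fetch",
    )
    return parser
```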
The CLI calls Spotify's /v1/me/tracks endpoint and caches the responses. It downloads MP3 previews via yt-dlp (which requires ffmpeg), falling back to a YouTube search when Spotify doesn't expose a preview URL, and runs Demucs to separate stems (cached under .djai_cache/stems). Demucs also needs the diffq package, so install the two together. The command prints the total number of liked tracks retrieved.
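The paging against /v1/me/tracks can be sketched as below. This is illustrative only, assuming the documented behaviour (batches of up to 50, optional cap on total items): fetch_liked_tracks and spotify_page are hypothetical helper names, not part of the djai package, and no caching is shown.

```python
# Sketch of paging through Spotify's saved-tracks endpoint.
import json
import os
import urllib.request

API_URL = "https://api.spotify.com/v1/me/tracks"

def fetch_liked_tracks(get_page, limit=50, max_items=None):
    """Collect items by paging. `get_page(offset, limit)` must return one
    decoded response dict with "items" and "total" keys."""
    items = []
    offset = 0
    while True:
        if max_items is not None:
            limit = min(limit, max_items - len(items))
            if limit <= 0:
                break
        page = get_page(offset, limit)
        items.extend(page["items"])
        offset += limit
        if not page["items"] or offset >= page["total"]:
            break
    return items

def spotify_page(offset, limit):
    # One real HTTP call; needs a token granted the user-library-read scope.
    token = os.environ["SPOTIFY_API_TOKEN"]
    request = urllib.request.Request(
        f"{API_URL}?offset={offset}&limit={limit}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

Usage would be `fetch_liked_tracks(spotify_page, max_items=100)`; passing the page-fetcher as a callable keeps the paging logic testable without network access.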
.env Support
Store secrets in a .env file to keep them out of your shell history:
SPOTIFY_CLIENT_ID="your-client-id"
SPOTIFY_CLIENT_SECRET="your-client-secret"
SPOTIFY_API_TOKEN="your-spotify-token"
The CLI automatically loads .env from the current working directory (or parent directories) using python-dotenv. If an explicit API token is missing but client credentials are configured, djai launches a one-time Authorization Code flow using a temporary localhost listener to obtain fresh access_token and refresh_token values before fetching tracks. The resulting tokens are cached in .djai_session (ignored by git) so subsequent runs in the same directory reuse them. Downloaded previews are stored under .djai_cache/audio, and API responses are cached under .djai_cache for up to 30 days. Note that analysing a user's liked tracks still requires a token granted with the user-library-read scope.
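The credential-resolution order described above can be sketched as follows. This is a hedged approximation: parse_dotenv is a deliberately minimal stand-in for python-dotenv (which the package actually uses), and resolve_auth and its return values are hypothetical names, not djai's API.

```python
# Sketch of the token-resolution order: explicit token first, then the
# client-credential pair that triggers the one-time Authorization Code flow.

def parse_dotenv(text):
    """Minimal .env parser for KEY=value and KEY="value" lines."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

def resolve_auth(env):
    """Decide how to authenticate from the available settings."""
    if env.get("SPOTIFY_API_TOKEN"):
        return ("token", env["SPOTIFY_API_TOKEN"])
    if env.get("SPOTIFY_CLIENT_ID") and env.get("SPOTIFY_CLIENT_SECRET"):
        # The real CLI would now launch the localhost listener, complete the
        # Authorization Code flow, and cache the tokens in .djai_session.
        return ("auth_code_flow", None)
    raise SystemExit("No Spotify credentials configured")
```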
Development
python -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -e ".[dev]"
ruff check .
pytest
Continuous Integration
GitHub Actions runs linting (ruff) and tests (pytest) on pushes and pull requests targeting main via .github/workflows/ci.yml.
Publishing to PyPI
The manual Publish workflow in .github/workflows/deploy.yml builds and uploads the package. Before triggering it, bump the version in pyproject.toml and ensure a PyPI token is stored as PYPI_API_TOKEN.
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file djai-0.4.0.tar.gz.
File metadata
- Download URL: djai-0.4.0.tar.gz
- Upload date:
- Size: 11.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9dd822e46cbe0057865fa84844f84e019057aa0425ab1646c0cb2bae57829546 |
| MD5 | 0c4f150dd072784b8f2a3765d92e29d7 |
| BLAKE2b-256 | 7c6e10cf196c6a18ba6e5a202cef73b704f14af91568f622abe7a6557b2b8859 |
File details
Details for the file djai-0.4.0-py3-none-any.whl.
File metadata
- Download URL: djai-0.4.0-py3-none-any.whl
- Upload date:
- Size: 11.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7e4afc48c15bea73bca43a4930b48128c48d5f00dd72cd7be3d321884d40036c |
| MD5 | 84dc888b01002c0c75eae2bbf3b4a425 |
| BLAKE2b-256 | cae65c575b551bc9358eb83d58d94cca2e80644fb5c1ac6550e8737058053cec |