
DSPy


DSPy: Programming—not prompting—Foundation Models

Documentation: DSPy Docs


ASYNC DSPY

This is a fork of DSPy that has been modified to be fully async. Underlying behavior is untouched, with the exception of the global per-thread settings overrides. The goal of this fork is to maintain parity and release cadence with DSPy, which is manageable given that the vast majority of changes consist of adding async/await to existing methods. It aims to be a near-drop-in replacement for dspy.

The high-level changes are as follows:

  • Calls to tools, metrics, and modules must be awaited
  • Implementations of tools, metrics, and modules must be declared as async
    • Including __call__ and forward
  • The dspy Settings object is now passed into every __call__ and forward method, as well as into callbacks, instead of being overridden globally on a per-thread basis
    • This allows multiple dspy instances to be used in the same thread without mutating the settings context of other running dspy instances
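The calling convention above can be sketched with plain asyncio. The names below (Settings, Module, Echo) are illustrative stand-ins, not the fork's actual classes; they only show the shape of the change: __call__ and forward are async, and settings travel as an explicit argument rather than per-thread global state.

```python
import asyncio
from dataclasses import dataclass


@dataclass
class Settings:
    lm: str  # e.g. which model this instance should use


class Module:
    async def __call__(self, settings: Settings, **kwargs):
        # __call__ forwards to an async `forward`, passing settings along
        # instead of reading them from per-thread global state.
        return await self.forward(settings, **kwargs)

    async def forward(self, settings: Settings, **kwargs):
        raise NotImplementedError


class Echo(Module):
    async def forward(self, settings: Settings, *, text: str):
        return f"[{settings.lm}] {text}"


async def main():
    # Two instances with different settings can run concurrently in the
    # same thread without clobbering each other's configuration.
    a = Echo()(Settings(lm="model-a"), text="hi")
    b = Echo()(Settings(lm="model-b"), text="hi")
    return await asyncio.gather(a, b)


results = asyncio.run(main())
print(results)  # ['[model-a] hi', '[model-b] hi']
```

Because nothing is mutated globally, this pattern composes safely with other async code running on the same event loop.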

For examples of how to use dspy-async, see:


DSPy is the framework for programming—rather than prompting—language models. It allows you to iterate fast on building modular AI systems and offers algorithms for optimizing their prompts and weights, whether you're building simple classifiers, sophisticated RAG pipelines, or Agent loops.

DSPy stands for Declarative Self-improving Python. Instead of brittle prompts, you write compositional Python code and use DSPy to teach your LM to deliver high-quality outputs. Learn more via our official documentation site or meet the community, seek help, or start contributing via this GitHub repo and our Discord server.
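In the async fork, the "compositional Python code" style reads like ordinary asyncio. The sketch below uses hand-written async functions as hypothetical stand-ins for dspy modules (a retriever and a predictor); it is not dspy's API, only an illustration of how a pipeline is just Python code that awaits its sub-modules.

```python
import asyncio


async def retrieve(question: str) -> list[str]:
    # Stand-in for a retrieval module: looks up canned context.
    corpus = {"capital": "Paris is the capital of France."}
    return [v for k, v in corpus.items() if k in question]


async def answer(question: str, context: list[str]) -> str:
    # Stand-in for an awaited Predict-style module.
    return context[0] if context else "unknown"


async def rag(question: str) -> str:
    # The pipeline composes its stages by awaiting them in sequence.
    ctx = await retrieve(question)
    return await answer(question, ctx)


result = asyncio.run(rag("What is the capital of France?"))
print(result)  # Paris is the capital of France.
```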

Documentation: dspy.ai

Please go to the DSPy Docs at dspy.ai

Installation

pip install dspy-async

To install the very latest from full_async:

pip install git+https://github.com/swiftdevil/dspy.git@full_async

📜 Citation & Reading More

If you're looking to understand the framework, please go to the DSPy Docs at dspy.ai.

If you're looking to understand the underlying research, this is a set of our papers:

[Jun'24] Optimizing Instructions and Demonstrations for Multi-Stage Language Model Programs
[Oct'23] DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines
[Jul'24] Fine-Tuning and Prompt Optimization: Two Great Steps that Work Better Together
[Jun'24] Prompts as Auto-Optimized Training Hyperparameters
[Feb'24] Assisting in Writing Wikipedia-like Articles From Scratch with Large Language Models
[Jan'24] In-Context Learning for Extreme Multi-Label Classification
[Dec'23] DSPy Assertions: Computational Constraints for Self-Refining Language Model Pipelines
[Dec'22] Demonstrate-Search-Predict: Composing Retrieval & Language Models for Knowledge-Intensive NLP

To stay up to date or learn more, follow @lateinteraction on Twitter.

The DSPy logo is designed by Chuyi Zhang.

If you use DSPy or DSP in a research paper, please cite our work as follows:

@inproceedings{khattab2024dspy,
  title={DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines},
  author={Khattab, Omar and Singhvi, Arnav and Maheshwari, Paridhi and Zhang, Zhiyuan and Santhanam, Keshav and Vardhamanan, Sri and Haq, Saiful and Sharma, Ashutosh and Joshi, Thomas T. and Moazam, Hanna and Miller, Heather and Zaharia, Matei and Potts, Christopher},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024}
}
@article{khattab2022demonstrate,
  title={Demonstrate-Search-Predict: Composing Retrieval and Language Models for Knowledge-Intensive {NLP}},
  author={Khattab, Omar and Santhanam, Keshav and Li, Xiang Lisa and Hall, David and Liang, Percy and Potts, Christopher and Zaharia, Matei},
  journal={arXiv preprint arXiv:2212.14024},
  year={2022}
}
