LLM test interactions recorder for langchain, based on vcrpy
Project description
VCR langchain
Adapts VCR.py for use with langchain so that you can cache all your expensive LLM interactions in tests.
Quickstart
pip install vcr_langchain
Use it with pytest:
```python
import vcr_langchain as vcr
from langchain.llms import OpenAI

@vcr.use_cassette()
def test_use_as_test_decorator():
    llm = OpenAI(model_name="text-ada-001")
    assert llm("Tell me a surreal joke") == "<put the output here>"
```
The next time you run it:
- the output is now deterministic
- it executes a lot faster by replaying from cache
- you no longer need to have the real OpenAI API key defined
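You can also record to an explicit cassette file. This is a minimal sketch, assuming that, as in vcrpy, `use_cassette` accepts a cassette path as its first argument; the filename below is illustrative, not prescribed by the library:

```python
import vcr_langchain as vcr
from langchain.llms import OpenAI

# Assumption: use_cassette takes an explicit cassette path, mirroring vcrpy.
# The path below is a hypothetical example.
@vcr.use_cassette("tests/cassettes/surreal_joke.yaml")
def test_with_named_cassette():
    llm = OpenAI(model_name="text-ada-001")
    # First run: hits the real OpenAI API and records the interaction.
    # Later runs: replay the recorded response from the cassette file.
    result = llm("Tell me a surreal joke")
    assert isinstance(result, str)
```

Naming cassettes explicitly makes it easier to inspect or delete a single recording when you want to re-record one test against the live API.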
For more examples, see the usage tests in the project repository.
Documentation
For more information on how VCR works and what other options there are, please see the VCR docs.
For more information on how to use langchain, please see the langchain docs.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
vcr_langchain-0.0.1.tar.gz (4.6 kB)
Built Distribution
vcr_langchain-0.0.1-py3-none-any.whl

Hashes for vcr_langchain-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | db58d999f02318cf2d0e7e4bc64d494ba60ba59f495fdb4cdefa5c2444c34012
MD5 | 03faef502379816b76ecc224d973f0e0
BLAKE2b-256 | bbe8d07d0ed1103224b7d49e9f563ef921512ebbf7c386dd733de88b730fea84