Mirascope

LLM toolkit for lightning-fast, high-quality development
| Design principle | | Outcome |
| --- | --- | --- |
| Simplicity through idiomatic syntax | → | Faster and more reliable releases |
| Semi-opinionated methods | → | Reduced complexity that speeds up development |
| Reliability through validation | → | More robust applications with fewer bugs |
Mirascope is an open-source Python toolkit built on top of Pydantic that makes working with Large Language Models (LLMs):
- Durable: Seamlessly customize and extend functionality.
- Intuitive: Editor support that you expect (e.g. autocompletion, inline errors).
- Clean: Pydantic together with our Prompt CLI eliminates prompt-related bugs.
- Integrable: Easily integrate with JSON Schema and other tools such as FastAPI.
- Convenient: Elegant, delightful tooling that you don't need to maintain.
- Open: Dedication to building open-source tools you can use with your choice of LLM.
We support any model that works with the OpenAI API, as well as other models such as Gemini.
Installation
Install Mirascope and start building with LLMs in minutes.
```
pip install mirascope
```
You can also install additional optional dependencies if you’re using those features:
```
pip install mirascope[wandb]   # WandbPrompt
pip install mirascope[gemini]  # GeminiPrompt, ...
```
Usage
With Mirascope, everything happens with prompts. The idea is to colocate anything that may impact the quality of your prompt, from the template variables to the temperature, so that you don't need to worry about code changes external to your prompt affecting quality. For simple use cases, we find that writing prompts as docstrings provides enhanced readability:
```python
from mirascope.openai import OpenAICallParams, OpenAIPrompt


class BookRecommendation(OpenAIPrompt):
    """Please recommend a {genre} book."""

    genre: str

    call_params = OpenAICallParams(
        model="gpt-4",
        temperature=0.3,
    )


recommendation = BookRecommendation(genre="fantasy").create()
print(recommendation)
#> I recommend "The Name of the Wind" by Patrick Rothfuss. It is...
```
If you add any of the OpenAI message roles (SYSTEM, USER, ASSISTANT, TOOL) as keywords to your prompt docstring, they will automatically get parsed into a list of messages:
```python
from mirascope.openai import OpenAIPrompt


class BookRecommendation(OpenAIPrompt):
    """
    SYSTEM:
    You are the world's greatest librarian.

    USER:
    Please recommend a {genre} book.
    """

    genre: str


prompt = BookRecommendation(genre="fantasy")
print(prompt.messages)
#> [{'role': 'system', 'content': "You are the world's greatest librarian."},
#   {'role': 'user', 'content': 'Please recommend a fantasy book.'}]
```
If you want to write the messages yourself instead of using the docstring message parsing, there’s nothing stopping you!
```python
from mirascope.openai import OpenAIPrompt
from openai.types.chat import ChatCompletionMessageParam


class BookRecommendation(OpenAIPrompt):
    """This is now just a normal docstring.

    Note that you'll lose any functionality dependent on it,
    such as `template`.
    """

    genre: str

    @property
    def messages(self) -> list[ChatCompletionMessageParam]:
        """Returns the list of OpenAI prompt messages."""
        return [
            {"role": "system", "content": "You are the world's greatest librarian."},
            {"role": "user", "content": f"Please recommend a {self.genre} book."},
        ]


recommendation = BookRecommendation(genre="fantasy").create()
print(recommendation)
#> I recommend "The Name of the Wind" by Patrick Rothfuss. It is...
```
Create, Stream, Extract

Prompt classes such as `OpenAIPrompt` have three methods for interacting with the LLM:

- `create`: Generate a response given a prompt. This will generate raw text unless tools are provided as part of the `call_params`.
- `stream`: Same as `create` except the generated response is returned as a stream of chunks. All chunks together become the full completion.
- `extract`: Convenience tooling built on top of tools to make it easy to extract structured information given a prompt and schema.
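For illustration, here is a minimal sketch of `stream` and `extract` using the `BookRecommendation` prompt from above. The exact chunk type returned by `stream` and the argument shape of `extract` are assumptions made for this sketch, not confirmed API details:

```python
from pydantic import BaseModel

from mirascope.openai import OpenAIPrompt


class BookRecommendation(OpenAIPrompt):
    """Please recommend a {genre} book."""

    genre: str


class Book(BaseModel):
    """Hypothetical schema describing the structured output we want."""

    title: str
    author: str


prompt = BookRecommendation(genre="fantasy")

# Stream the completion chunk by chunk
# (assumes each chunk stringifies to its text content).
for chunk in prompt.stream():
    print(str(chunk), end="")

# Extract structured information matching the schema
# (passing the schema class directly is an assumed signature).
book = prompt.extract(Book)
print(book)
#> title='The Name of the Wind' author='Patrick Rothfuss'
```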
Using Different LLM Providers

The `OpenAIPrompt` class works with any endpoint that supports the OpenAI API, including (but not limited to) Anyscale, Together, and Groq. Simply update the `base_url` and set the proper API key in your environment:
```python
import os

from mirascope.openai import OpenAICallParams, OpenAIPrompt

# Point the OpenAI client at Together by setting your Together API key
# as the OpenAI key ("TOGETHER_API_KEY" is a placeholder for your actual key).
os.environ["OPENAI_API_KEY"] = "TOGETHER_API_KEY"


class BookRecommendation(OpenAIPrompt):
    """Please recommend a {genre} book."""

    genre: str

    call_params = OpenAICallParams(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
        base_url="https://api.together.xyz/v1",
    )


recommendation = BookRecommendation(genre="fantasy").create()
```
We also support other providers such as Gemini.
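For instance, with the gemini extra installed, the same prompt can target Gemini instead. This is a minimal sketch that assumes `GeminiPrompt` mirrors the `OpenAIPrompt` interface shown above; only the `GeminiPrompt` class name itself comes from the installation notes:

```python
from mirascope.gemini import GeminiPrompt


class BookRecommendation(GeminiPrompt):
    """Please recommend a {genre} book."""

    genre: str


# Assumes GeminiPrompt exposes the same create() method as OpenAIPrompt.
recommendation = BookRecommendation(genre="fantasy").create()
print(recommendation)
```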
Dive Deeper
- Learn why colocation is so important and how combining it with the Mirascope CLI makes engineering better prompts easy.
- Check out how to write better prompts using Mirascope.
- Become a master of extracting structured information using LLMs.
- Take a look at how Mirascope makes using tools (function calling) simple and clean.
- The API Reference contains full details on all classes, methods, functions, etc.
Examples
You can find more usage examples in our examples directory, such as how to easily integrate with FastAPI.
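As a taste of what the FastAPI example covers, here is a minimal sketch of serving a prompt behind an endpoint. The route path and response handling are illustrative assumptions, not code from the examples directory:

```python
from fastapi import FastAPI

from mirascope.openai import OpenAIPrompt


class BookRecommendation(OpenAIPrompt):
    """Please recommend a {genre} book."""

    genre: str


app = FastAPI()


@app.get("/recommend")
def recommend(genre: str) -> str:
    # Hypothetical endpoint: run the prompt and return the completion text.
    return str(BookRecommendation(genre=genre).create())
```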
We also have more detailed walkthroughs in our Cookbook docs section. Each cookbook has corresponding full code examples in the cookbook directory.
What’s Next?
We have a lot in mind for what to build next, but here are a few things (in no particular order) at the top of the list:
- Extracting structured information using LLMs
- Agents
- Support for more LLM providers:
  - Claude
  - Mistral
  - HuggingFace
- Integrations:
  - Weights & Biases
  - LangSmith
  - … tell us what you'd like integrated!
- Evaluating prompts and their quality by version
- Additional docstring parsing for more complex messages
Versioning
Mirascope uses Semantic Versioning.
License
This project is licensed under the terms of the MIT License.