
Marvin 🤖🏖️

Meet Marvin: a batteries-included library for building AI-powered software. Marvin's job is to integrate AI directly into your codebase by making it look and feel like any other function.

Marvin introduces a new concept called AI Functions. These functions differ from conventional ones in that they don't rely on source code; instead, they generate their outputs on demand, using an LLM as a runtime. With AI functions, you don't have to write complex code for tasks like extracting entities from web pages, scoring sentiment, or categorizing items in your database. Just describe your needs, call the function, and you're done!

AI functions work with native data types, so you can seamlessly integrate them into any codebase and chain them into sophisticated pipelines. Technically speaking, Marvin transforms the typical interface to an LLM from (str) -> str into (**kwargs) -> Any. We call this "functional prompt engineering."
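To make the (**kwargs) -> Any idea concrete, here is a minimal sketch of the technique -- not Marvin's actual implementation -- in which a decorator turns a bodiless, typed function into a prompt and parses the model's JSON reply. The names ai_fn_sketch and fake_llm are hypothetical, and fake_llm returns a canned answer so the example runs offline:

```python
import inspect
import json
from functools import wraps

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; returns a canned JSON reply for the demo."""
    return '["dog", "cat"]'

def ai_fn_sketch(fn):
    """Wrap a bodiless function: turn its signature and docstring into a
    prompt, 'run' it on the LLM, and parse the JSON reply into a native value."""
    sig = inspect.signature(fn)

    @wraps(fn)
    def wrapper(**kwargs):
        prompt = (
            f"You are the function {fn.__name__}{sig}.\n"
            f"Docstring: {fn.__doc__}\n"
            f"Arguments: {json.dumps(kwargs)}\n"
            "Respond with only the JSON return value."
        )
        return json.loads(fake_llm(prompt))

    return wrapper

@ai_fn_sketch
def list_animals(text: str) -> list[str]:
    """Return every animal mentioned in `text`."""

print(list_animals(text="The dog chased the cat."))  # ['dog', 'cat']
```

With a real model behind fake_llm, the function's name, type hints, and docstring become the prompt, and the parsed reply becomes the return value -- no function body required.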

In addition to AI functions, Marvin introduces a more flexible abstraction: bots. Bots are highly capable AI assistants that can be given specific instructions, personalities, or roles. They can use custom plugins, leverage external knowledge, and automatically record a history of every thread. Under the hood, AI functions are actually a type of bot.
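Conceptually, a bot is an LLM call wrapped in a name, a personality, some instructions, and a running thread history. The sketch below illustrates that shape with hypothetical names (BotSketch, fake_llm) and a canned reply; it is not Marvin's actual Bot class:

```python
from dataclasses import dataclass, field

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; returns a canned reply for the demo."""
    return "Hello there!"

@dataclass
class BotSketch:
    name: str
    personality: str = ""
    instructions: str = ""
    # Every (role, message) pair is recorded, giving each thread a history.
    history: list[tuple[str, str]] = field(default_factory=list)

    def say(self, message: str) -> str:
        # Fold identity, instructions, and prior turns into the prompt.
        prompt = (
            f"You are {self.name}. Personality: {self.personality}. "
            f"Instructions: {self.instructions}.\n"
            + "\n".join(f"{role}: {text}" for role, text in self.history)
            + f"\nuser: {message}"
        )
        reply = fake_llm(prompt)
        self.history.append(("user", message))
        self.history.append(("bot", reply))
        return reply

bot = BotSketch(name="ObiWanKenoBot", personality="knows every Star Wars meme")
print(bot.say("Hello!"))  # Hello there!
print(len(bot.history))   # 2
```

Because the history accumulates on the object, every subsequent call sees the full conversation so far -- which is what lets a bot hold a coherent multi-turn thread.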

To make it easy to work with bots, Marvin includes a fully-functional TUI. The TUI tracks threads across multiple bots and even lets you manage your bots through a conversational interface.

Developers can use Marvin to add AI capabilities wherever they will be most impactful, without needing to start from scratch. Marvin's documentation is available at askmarvin.ai; come say hello on our Discord server!

Features

🪄 Write AI functions that process structured data without source code

🤖 Build bots that have personalities and follow instructions

🖥️ Chat with bots in a fully-featured TUI

🔌 Give your bots new abilities with plugins

📚 Store knowledge that bots can access and use

📡 Available as a Python API, interactive CLI, or FastAPI server

Quick start

  1. Install: pip install marvin
  2. Chat: marvin chat

Slightly less quick start

Create a bot:

marvin bots create ObiWanKenoBot -p "knows every Star Wars meme"

Chat with it:

marvin chat -b ObiWanKenoBot

See the getting started docs for more!

Open Source

Marvin is open source under the Apache 2.0 license and built on standards like Pydantic, FastAPI, LangChain, and Prefect.

🚧 Construction Zone

Marvin is under active development and is likely to change.

Coming soon:

♻️ Interactive AI functions

🖼️ Admin and chat UIs

🏗️ Advanced data loading and preprocessing

🔭 AI observability platform

🖥️ Deployment guides

🎁 Quickstarts for common use cases

When should you use Marvin?

Marvin is an opinionated, high-level library with the goal of integrating AI tools into software development. There are a few major reasons to use Marvin:

  1. You want an AI function that can process structured data. Marvin brings the power of AI to native data structures, letting you build functions that would otherwise be difficult or even impossible to write. For example, you can use AI functions to make a list of all the animals in a paragraph, generate JSON documents from HTML content, extract keywords that match some criteria, or categorize sentiment -- without any traditional source code.

  2. You want an AI assistant in your code. Marvin's bots can follow instructions and hold conversations to solve complex problems. They can use custom plugins and take advantage of external knowledge. They are designed to be integrated into your codebase, but of course you can expose them directly to your users as well!

  3. You want to deploy cutting-edge AI technology with confidence, but without having to make too many decisions. Using LLMs successfully requires very careful consideration of prompts, data preprocessing, and infrastructure. Our target user is more interested in using AI systems than building AI systems. Therefore, Marvin is designed to make adopting this technology as straightforward as possible by optimizing for useful outcomes. Marvin's prompts have been hardened by months of real-world use and will continue to improve over time.
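As an illustration of point 1, AI functions that return native types can be chained like ordinary code. The sketch below wires two such functions into a tiny pipeline; run_ai and its canned replies are offline stand-ins for real LLM calls, not Marvin's API:

```python
import json

# Canned replies keyed by function name -- a stand-in for real LLM calls,
# so the example runs offline and deterministically.
CANNED = {
    "extract_animals": '["moose", "beaver"]',
    "categorize_sentiment": '"positive"',
}

def run_ai(fn_name: str, doc: str, **kwargs):
    """Pretend-LLM: build a prompt (ignored by the stub) and parse a canned
    JSON reply into a native Python value."""
    prompt = f"Act as {fn_name}: {doc}. Args: {json.dumps(kwargs)}"
    return json.loads(CANNED[fn_name])

def extract_animals(text: str) -> list[str]:
    """List the animals mentioned in `text`."""
    return run_ai("extract_animals", extract_animals.__doc__, text=text)

def categorize_sentiment(text: str) -> str:
    """Classify `text` as 'positive', 'negative', or 'neutral'."""
    return run_ai("categorize_sentiment", categorize_sentiment.__doc__, text=text)

email = "I loved seeing a moose and a beaver on the trail!"
animals = extract_animals(text=email)    # a plain list[str]
mood = categorize_sentiment(text=email)  # a plain str
print(animals, mood)  # ['moose', 'beaver'] positive
```

Because each step consumes and produces native data structures rather than raw prompt strings, the outputs compose with ordinary Python -- filtering, looping, or feeding one AI function's result into the next.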

When should you NOT use Marvin?

There are a few reasons NOT to use Marvin:

  1. You want full control of the AI. Marvin is a high-level library and, with few exceptions, does not expose LLM configuration to users. We have chosen settings that give the best results under most circumstances, taking Marvin's built-in prompts into consideration.

  2. You want an AI copilot for writing code. Marvin's job isn't to help you write source code; it's to help you do things that are difficult or impossible to express in source code -- anything from mundane chores to extracting the names of animals commonly found in North America from an email (yes, it's a ridiculous example -- but it's possible). Modern LLMs excel at complex reasoning, and Marvin lets you bring that into your code in a way that feels native and natural.

  3. You want to use other LLM models. Marvin is designed to run against OpenAI's GPT-4 and GPT-3.5 models. While we may expand support to other models in the future, we've found that prompts designed for one model rarely translate well to others without modification. To maximize the usefulness of the library, we've decided to focus on these popular models for now.

  4. You want full control of your prompts. As a "functional prompt engineering" platform, Marvin takes user inputs and generates prompts that are likely to deliver the outcome the user wants, even if they are not verbatim what the user said. Marvin does not expect users to send completely raw prompts to the LLM.

  5. You're searching for the Ultimate Question. While Marvin is highly intelligent, even he couldn't come up with the Ultimate Question of Life, the Universe, and Everything. If you're seeking existential enlightenment, you might need to look beyond our beloved paranoid android.


Download files

Download the file for your platform.

Source Distribution

marvin-0.8.0.tar.gz (16.1 MB, source)

Built Distribution


marvin-0.8.0-py3-none-any.whl (114.5 kB, Python 3)

File details

Details for the file marvin-0.8.0.tar.gz.

File metadata

  • Download URL: marvin-0.8.0.tar.gz
  • Upload date:
  • Size: 16.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.3

File hashes

Hashes for marvin-0.8.0.tar.gz:

  • SHA256: 3a86875065d447cf2b1d6b7cb86b906db5f7ffabdbc39c7a3e1b9e7f14409b26
  • MD5: a8d741252afda4c205d099622d8f2a10
  • BLAKE2b-256: 4a08102f9809b5e150063d0550bad2f726ea363f0169002feefaea55bb6568ae


File details

Details for the file marvin-0.8.0-py3-none-any.whl.

File metadata

  • Download URL: marvin-0.8.0-py3-none-any.whl
  • Upload date:
  • Size: 114.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.3

File hashes

Hashes for marvin-0.8.0-py3-none-any.whl:

  • SHA256: 7be4281e4ea6fd50f16cb5afa3ac7dfe8c739e24ac5152f0fa1e36766e6b7dd3
  • MD5: cb3122f0e07d71c710c5efb9b130d2eb
  • BLAKE2b-256: 4b9c65e2545ee4ffc9eb63a909a004bb26d587844ff38190711ddaab810805f9

