

Fire up your models with the flame 🔥



Flama is a Python library that establishes a standard framework for developing and deploying APIs, with a special focus on machine learning (ML). The framework's main aim is to make deploying ML APIs ridiculously simple, reducing the entire process (where possible) to a single line of code.

The library builds on Starlette and provides an easy-to-learn design that speeds up the building of highly performant GraphQL, REST and ML APIs. It is also an ideal solution for developing asynchronous, production-ready services, offering automatic deployment of ML models.

Some remarkable characteristics:

  • Generic classes for API resources, with the convenience of standard CRUD methods over SQLAlchemy tables.
  • A schema system (based on Marshmallow or Typesystem) that makes declaring the inputs and outputs of endpoints very easy, with reliable and automatic data-type validation.
  • Dependency injection that eases the management of parameters needed in endpoints, via Components. Flama ASGI objects such as Request, Response and Session are defined as Components ready to be injected into your endpoints.
  • Components as the base of the plugin ecosystem, allowing you to create custom ones or use those already defined, injected as parameters into your endpoints.
  • Auto-generated API schema following the OpenAPI standard.
  • Auto-generated docs, served through Swagger UI and ReDoc endpoints.
  • Automatic handling of pagination, with several methods at your disposal, such as limit-offset and page numbering.
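The schema-driven validation mentioned above can be illustrated with a plain-Python sketch of the idea. This is only a conceptual illustration, not Flama's or Marshmallow's actual API; the `validate` helper and `puppy_schema` are hypothetical names:

```python
def validate(payload, schema):
    """Return a dict of validation errors; an empty dict means the payload is valid."""
    errors = {}
    for field, expected_type in schema.items():
        if field not in payload:
            errors[field] = "missing"
        elif not isinstance(payload[field], expected_type):
            errors[field] = f"expected {expected_type.__name__}"
    return errors

# A declared "schema": field names mapped to their expected types.
puppy_schema = {"name": str, "age": int}

print(validate({"name": "Max", "age": 2}, puppy_schema))  # {}
print(validate({"name": "Max"}, puppy_schema))            # {'age': 'missing'}
```

In Flama, this kind of declaration lives in the schema classes attached to each endpoint, so validation happens automatically before your handler runs.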

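Limit-offset pagination, one of the methods listed above, can be sketched in plain Python. This is a conceptual illustration, not Flama's internal implementation; `paginate_limit_offset` is a hypothetical name:

```python
def paginate_limit_offset(items, limit=10, offset=0):
    """Slice one page out of a sequence and attach limit-offset metadata."""
    return {
        "meta": {"limit": limit, "offset": offset, "count": len(items)},
        "data": items[offset:offset + limit],
    }

page = paginate_limit_offset(list(range(25)), limit=10, offset=20)
print(page["data"])  # [20, 21, 22, 23, 24]
```

With Flama, this bookkeeping is handled for you: a paginated endpoint receives the limit and offset as query parameters and returns the page plus metadata automatically.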

Flama is fully compatible with all supported versions of Python. We recommend using the latest version available.

For a detailed explanation of how to install Flama, see the installation documentation.

Getting Started

Visit the Getting Started guide to begin working with Flama.


Visit the documentation site to view the full documentation.


from flama import Flama

app = Flama(
    description="My first API",
)

@app.route("/")
def home():
    """
    tags:
        - Salute
    summary:
        Returns a warming message.
    description:
        This is a more detailed description of the method itself.
        Here we can give all the details required and they will appear
        automatically in the auto-generated docs.
    responses:
        200:
            description: Warming hello message!
    """
    return {"message": "Hello 🔥"}

This example builds and runs a Hello 🔥 API. To run it:

flama run examples.hello_flama:app


  • José Antonio Perdiguero López (@perdy)
  • Miguel Durán-Olivencia (@migduroli)


This project is absolutely open to contributions, so if you have a nice idea, please read our contributing docs before submitting a pull request.
