
Hal9: Create and Share Generative Apps


Create and deploy generative (LLMs and diffusers) applications (chatbots and APIs) in seconds.

  • Open: Use any model (OpenAI, Llama, Groq, MidJourney) and any library (LangChain, DSPy).
  • Intuitive: No need to learn app frameworks (Flask); simply use input() and print(), or write files to disk.
  • Scalable: Engineers can integrate your app with scalable technologies (Docker, Kubernetes, etc.).
  • Powerful: Using an OS process (stdin, stdout, files) as our app contract enables long-running agents, multiple programming languages, and complex system dependencies.

Focus on AI (RAG, fine-tuning, alignment, training) and skip engineering tasks (frontend development, backend integration, deployment, operations).

Getting started

Create and share a chatbot in seconds as follows:

pip install hal9

hal9 create chatbot
hal9 deploy chatbot

Notice that deploy needs a HAL9_TOKEN environment variable with an API token, which you can get from hal9.com/devs. You can use this token to deploy from your local computer, from a notebook, or to automate deployment from GitHub.

HAL9_TOKEN=H9YOURTOKEN hal9 deploy chatbot

The code inside /chatbot/app.py contains a "Hello World" chatbot that reads the user prompt and echoes the result back:

prompt = input()
print(f"Echo: {prompt}")

We designed this package with simplicity in mind: the job of the code is to read input and write output, and that's about it. That said, you can create chatbots that use LLMs, generate images, use tools that connect to databases, or even build websites and games!
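For instance, an app that writes a file to disk alongside its text reply could look like the following sketch. This is a hypothetical example, not from the Hal9 docs; the filename and record structure are illustrative only:

```python
import json

# Hypothetical example: besides printing a reply, an app can also
# write an artifact to disk as part of its output.
def handle(prompt):
    record = {"prompt": prompt, "reply": f"Echo: {prompt}"}
    with open("reply.json", "w") as f:
        json.dump(record, f)  # persist a structured artifact
    return record["reply"]

print(handle("a drawing of a cat"))  # -> Echo: a drawing of a cat
```

Because the app contract is just an OS process, anything the process writes (text, JSON, images) can become part of the app's output.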

Creation

hal9 create defaults to the --template echo template, but you can choose a different one as follows:

hal9 create chatbot-openai --template openai
hal9 create chatbot-groq --template groq

A template provides ready-to-use code for specific technologies and use cases. A popular choice is OpenAI's ChatGPT-like template with --template openai; the generated code looks as follows:

import hal9 as h9
from openai import OpenAI

# Restore the conversation from previous turns; the platform is stateless.
messages = h9.load("messages", [])
# Read the user prompt and append it to the message history.
prompt = h9.input(messages=messages)

# Stream a completion for the whole conversation.
completions = OpenAI().chat.completions.create(
    model="gpt-4", messages=messages, stream=True)

# Print the streamed reply and append it to the history.
h9.complete(completions, messages=messages)
# Persist the history for the next turn.
h9.save("messages", messages, hidden=True)

The Learn section explains in detail how this code works, but here is a quick overview. The hal9 package contains helper functions to simplify your generative AI code. You can choose not to use hal9 at all and write input() and print() statements yourself, or even use tools like LangChain. The h9.load() and h9.save() functions load and save data across chat sessions, since our platform is stateless by default. The h9.input() function is a slim wrapper over input() that also stores the user input in messages. Then h9.complete() is a helper function that parses the completion results and saves them in messages. That's about it!
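Conceptually, h9.load() and h9.save() behave like the following sketch. This is an assumption for illustration only; the actual implementation and storage location inside hal9 may differ:

```python
import json
import os

STORAGE = ".storage"  # illustrative location, not hal9's actual path

def load(name, default):
    # Return previously saved state, or the default on the first run.
    path = os.path.join(STORAGE, f"{name}.json")
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return default

def save(name, value):
    # Persist state so the next (stateless) run can pick it up.
    os.makedirs(STORAGE, exist_ok=True)
    with open(os.path.join(STORAGE, f"{name}.json"), "w") as f:
        json.dump(value, f)

messages = load("messages", [])
messages.append({"role": "user", "content": "hello"})
save("messages", messages)
```

Each run starts fresh, loads whatever the previous run saved, and saves again before exiting, which is how a stateless process can carry a conversation.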

Development

To make changes to your project, open chatbot/ in your IDE and modify chatbot/app.py.

If you customized your project with --template, make sure to set the correct API key, for example export OPENAI_KEY=YOUR_OPENAI_KEY.

You can then run your project locally as follows:

hal9 run chatbot

This command is just a convenience wrapper around running the code yourself with something like python app.py.
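To illustrate, running the app yourself amounts to starting one process and piping the prompt over stdin. The following is a sketch of that idea, not hal9's actual implementation:

```python
import os
import subprocess
import sys
import tempfile

# Recreate the "Hello World" app and run it the way hal9 run roughly does:
# one OS process, prompt on stdin, reply on stdout.
app_code = 'prompt = input()\nprint(f"Echo: {prompt}")\n'

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "app.py")
    with open(path, "w") as f:
        f.write(app_code)
    result = subprocess.run([sys.executable, path], input="Hi\n",
                            capture_output=True, text=True)

print(result.stdout, end="")  # -> Echo: Hi
```

This is the whole app contract from the bullet list above: stdin in, stdout out, files on disk if needed.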

Deployment

The deploy command prepares your generative app for deployment.

For example, you can deploy it as a generative app on Hal9. We plan to also provide deployment to Docker, and the open source community can expand this even further.

hal9 deploy chatbot --target hal9

Each target is tasked with preparing the deployment of your project folder. For example, --target docker should create a Dockerfile that gets the project ready to run in cloud containers.
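Such a generated Dockerfile might look roughly like the sketch below. This is purely an assumption for illustration; a Docker target does not exist yet, so its actual output is undefined:

```dockerfile
# Illustrative only: a minimal container for a stdin/stdout Hal9 app.
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install hal9
CMD ["python", "app.py"]
```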

For personal use, --target hal9 supports a free tier at hal9.com; enterprise support is also available to deploy with --target hal9 --url hal9.yourcompany.com.
