
Project description

Looking for 'openagent'? Because of a little name clash, it's now called 'dotagent'. 🤖

Question:

I stumbled upon this repository. Is it production ready?

Answer:

Kudos on discovering this hidden treasure box! 🧭 While it's fairly stable and we're battle-testing it in our own production, we'd advise a bit of caution for immediate production use. It's got its quirks, and some of them have taken a cozy spot on our 'we'll-look-at-this-later' list. Jump in, play with it, or use any part of our code. It's all good with the MIT license.


I'm diving in, quirks and all!

Ahoy, adventurer! 🏴‍☠️ We're thrilled to have another daring coder join the fray. Here's to creating some coding magic together! ✨

The Origin Tale of dotagent

Here's our dream: An open and democratic AGI, untouched by the sneaky controls and hush-hush censorship of corporate overlords masquerading under 'alignment'. Remember the good ol' web days? We lost that freedom to the mobile moguls and their cheeky 30% 'because-we-said-so' tax. 🙄

Our moonshot? 🚀 A harmonious ensemble of domain-specific AI agents, working in unison so well, you'd think it's AGI. Join us in opening up the LAST tech frontier for all!

-----------------------------------------------------

Meet the world's first AMS!

Ever heard of an Agent Management System (AMS)? No? Well, that's probably because we believe we came up with it! 🎩✨ dotagent proudly wears the badge of the world's first AMS (yep, we're patting ourselves on the back here). Drawing inspiration from microservices, it equips developers with a treasure trove of tools to craft sturdy, reliable AI applications and those cool experimental autonomous agents.

🧱 Modularity

  • Multiplatform: Agents do not have to run in a single location or on a single machine. Different components can run across various platforms, including the cloud, personal computers, or mobile devices.
  • Extensible: If you know how to do something in Python or plain English, you can integrate it with dotagent.
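The "plain Python" half of that claim is easiest to see with an ordinary function. Here's a concept sketch in plain Python (not the dotagent API — the `tool` decorator and registry below are hypothetical) of how any callable can become an agent capability:

```python
from typing import Callable, Dict

# Hypothetical tool registry; dotagent's real integration API may differ.
TOOLS: Dict[str, Callable[..., str]] = {}

def tool(fn: Callable[..., str]) -> Callable[..., str]:
    """Register a plain Python function as an agent tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def word_count(text: str) -> str:
    return str(len(text.split()))

# The agent can now dispatch to the function by name:
print(TOOLS["word_count"]("an open and democratic AGI"))  # -> 5
```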

🚧 Guardrails

  • Set clear boundaries: Users can precisely outline what their agent can and cannot do. This safeguard guarantees that the agent remains a dynamic, self-improving system without overstepping defined boundaries.
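To make the boundary idea concrete, here's a minimal sketch (plain Python, not dotagent's actual guardrail API) of an allow-list check that refuses an action before it ever runs:

```python
# Concept sketch of a guardrail: actions outside an explicit allow-list
# are rejected up front. The action names here are made up for illustration.
ALLOWED_ACTIONS = {"search", "summarize"}

def guarded(action: str, payload: str) -> str:
    if action not in ALLOWED_ACTIONS:
        raise PermissionError(f"action {action!r} is outside the agent's boundaries")
    return f"{action}({payload})"

print(guarded("search", "dotagent"))  # runs fine
# guarded("delete_files", "/")        # would raise PermissionError
```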

🏗️ Greater control with Structured outputs

  • More Effective Than Chaining or Prompting: The prompt compiler unlocks the next level of prompt engineering, providing far greater control over LLMs than few-shot prompting or traditional chaining methods.
  • Superpowers for Prompt Engineers: It gives you the full power of prompt engineering, aligned with how LLMs actually process text. This lets you precisely control the output, defining the exact response structure and instructing the LLM on how to generate it.
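The core idea of template-constrained output can be shown with a toy stand-in (this is not dotagent's compiler — `fill_template` and `fake_llm` are illustrative only): the template fixes the structure of the response, and the model only fills the generation slots.

```python
import re

# Toy illustration of structured output: the template pins the shape of the
# response; a (fake) model fills only the {{gen ...}} slots.
def fill_template(template: str, model) -> str:
    return re.sub(r"\{\{gen '(\w+)'\}\}", lambda m: model(m.group(1)), template)

def fake_llm(slot: str) -> str:
    # Stand-in for an LLM call; returns canned text per slot name.
    return {"name": "Ada", "language": "Python"}[slot]

out = fill_template("Agent {{gen 'name'}} writes {{gen 'language'}}.", fake_llm)
print(out)  # -> Agent Ada writes Python.
```

A real compiler goes much further — it can constrain token probabilities during decoding so the output cannot drift from the template — but the structural contract is the same.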

🏭 Powerful Prompt Compiler

The philosophy is to do more of the processing at compile time and to maintain session state with LLMs.

  • Pre-compiling prompts: By handling basic prompt processing at compile time, redundant LLM processing is eliminated.
  • Session state with LLM: Maintaining state with LLMs and reusing KV caches eliminates many redundant generations and significantly speeds up processing of longer and more complex prompts. (open-source models only)
  • Optimized tokens: The compiler can transform many output tokens into prompt token batches, which are cheaper and faster. The structure of the template can dynamically guide the probabilities of subsequent tokens, ensuring alignment with the template and optimized tokenization. (open-source models only)
  • Speculative sampling (WIP): You can enhance token generation speed in a large language model by using a smaller model as an assistant. The method relies on an algorithm that generates multiple tokens per transformer call using a faster draft model. This can lead to up to a 3x speedup in token generation.
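The speculative sampling loop is easy to sketch with deterministic stand-ins (this is a toy illustration, not dotagent's WIP implementation — both "models" below are canned lists): the cheap draft model proposes several tokens per step, the target model keeps the prefix it agrees with, then contributes one token of its own.

```python
# Toy sketch of speculative decoding with deterministic stand-in "models".
def draft_model(prefix, k):
    # Cheap model: proposes up to k next tokens (canned, and sometimes wrong).
    canned = ["the", "quick", "brown", "fox", "jumps"]
    return canned[len(prefix):len(prefix) + k]

def target_model(prefix):
    # Expensive model: one call returns the single correct next token.
    canned = ["the", "quick", "red", "fox", "jumps"]
    return canned[len(prefix)] if len(prefix) < len(canned) else None

def speculative_decode(k=3, max_len=5):
    out = []
    while len(out) < max_len:
        for tok in draft_model(out, k):
            if target_model(out) == tok:
                out.append(tok)   # draft token accepted for free
            else:
                break             # first disagreement: stop accepting
        nxt = target_model(out)   # target always contributes one token
        if nxt is None:
            break
        out.append(nxt)
    return out[:max_len]

print(speculative_decode())  # -> ['the', 'quick', 'red', 'fox', 'jumps']
```

The output is identical to decoding with the target model alone; the speedup comes from accepting several draft tokens per expensive target call.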

📦 Containerized & Scalable

  • .🤖 files: Agents can be effortlessly exported into a simple .agent or .🤖 file, allowing them to run in any environment.
  • Agentbox (optional): Agents should be able to optimize computing resources inside a sandbox. You can use Agentbox locally or on a cloud with a simple API, with cloud agentbox offering additional control and safety.
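The portability idea — export an agent definition, then load and run it anywhere — can be sketched like this (a concept illustration only: the real .agent / .🤖 format is defined by dotagent, and the fields below are made up):

```python
import json
import os
import tempfile

# Hypothetical agent definition; the actual .agent schema may differ.
agent = {"name": "helper", "llm": "gpt-3.5-turbo", "tools": ["search"]}

# Export to a .agent file...
path = os.path.join(tempfile.gettempdir(), "helper.agent")
with open(path, "w") as f:
    json.dump(agent, f)

# ...and restore it in a different environment.
with open(path) as f:
    restored = json.load(f)
print(restored["name"])  # -> helper
```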

-----------------------------------------------------

Installation

pip install dotagent

Common Errors

SQLite3 Version Error

If you encounter an error like:

Your system has an unsupported version of sqlite3. Chroma requires sqlite3 >= 3.35.0.

This is a very common issue with Chroma DB. You can find instructions to resolve this in the Chroma DB tutorial.
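You can check which SQLite version your Python is linked against before installing anything:

```python
import sqlite3

# Chroma requires the SQLite that Python was built against to be >= 3.35.0.
version = tuple(int(p) for p in sqlite3.sqlite_version.split("."))
print(sqlite3.sqlite_version,
      "OK" if version >= (3, 35, 0) else "too old for Chroma")
```

A commonly cited workaround (see the Chroma troubleshooting docs) is to `pip install pysqlite3-binary` and alias it over the standard-library `sqlite3` module in `sys.modules` before Chroma is imported.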

Here's the code for a full-stack chat app with UI, all in a single Python file! (37 lines)

import dotagent.compiler as compiler
from dotagent.compiler._program import Log
from dotagent import memory
import chainlit as ui
from dotenv import load_dotenv

load_dotenv()


@ui.on_chat_start
def start_chat():
    compiler.llm = compiler.llms.OpenAI(model="gpt-3.5-turbo")


class ChatLog(Log):
    def append(self, entry):
        super().append(entry)
        print(entry)
        # Forward each finished assistant turn to the chat UI.
        is_end = entry["type"] == "end"
        is_assistant = entry["name"] == "assistant"
        if is_end and is_assistant:
            ui.run_sync(ui.Message(content=entry["new_prefix"]).send())


chat_memory = memory.SimpleMemory()


@ui.on_message
async def main(message: str):
    program = compiler(
        """
        {{#system~}}
        You are a helpful assistant
        {{~/system}}

        {{~#geneach 'conversation' stop=False}}
        {{#user~}}
        {{set 'this.user_text' (await 'user_text') hidden=False}}
        {{~/user}}

        {{#assistant~}}
        {{gen 'this.ai_text' temperature=0 max_tokens=300}}
        {{~/assistant}}
        {{~/geneach}}""",
        memory=chat_memory,
    )

    program(user_text=message, log=ChatLog())

The UI will look something like this: (screenshot)

-----------------------------------------------------

