
byLLM provides easy-to-use APIs for different LLM providers for use with Jaseci's Jaclang programming language.


byLLM: Prompt Less, Smile More!


byLLM is an innovative AI integration framework built for the Jaseci ecosystem, implementing the cutting-edge Meaning Typed Programming (MTP) paradigm. MTP revolutionizes AI integration by embedding prompt engineering directly into code semantics, making AI interactions more natural and maintainable. While primarily designed to complement the Jac programming language, byLLM also provides a powerful Python library interface.

Installation is simple via PyPI:

pip install byllm

Basic Example

Consider building an application that translates English into other languages using an LLM. It can be built as simply as this:

import from byllm.lib { Model }

glob llm = Model(model_name="gpt-4o");

def translate_to(language: str, phrase: str) -> str by llm();

with entry {
    output = translate_to(language="Welsh", phrase="Hello world");
    print(output);
}

This simple piece of code replaces traditional prompt engineering without introducing additional complexity.

Power of Types with LLMs

Consider a program that detects the personality type of a historical figure from their name. It can be built so that the LLM picks from an enum, and the output strictly adheres to that type.

import from byllm.lib { Model }
glob llm = Model(model_name="gemini/gemini-2.0-flash");

enum Personality {
    INTROVERT, EXTROVERT, AMBIVERT
}

def get_personality(name: str) -> Personality by llm();

with entry {
    name = "Albert Einstein";
    result = get_personality(name);
    print(f"{result} personality detected for {name}");
}

Similarly, custom types can be used as output types, which forces the LLM to adhere to the specified type and produce a valid result.
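For instance, a sketch of a structured-output function following the same pattern as the examples above (the `Country` object, its fields, and the model name are illustrative, not part of the byLLM API):

```jac
import from byllm.lib { Model }

glob llm = Model(model_name="gpt-4o");

# A custom output type: the LLM must return a valid Country instance.
obj Country {
    has name: str;
    has capital: str;
    has population: int;
}

def get_country_info(name: str) -> Country by llm();

with entry {
    country = get_country_info("France");
    print(f"{country.name}: capital {country.capital}");
}
```

Because the return type is `Country`, the generated output is constrained to the declared fields and their types rather than free-form text.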

Control! Control! Control!

Even though we eliminate prompt engineering entirely, we provide specific ways to enrich code semantics through docstrings and semstrings.

"""Represents the personal record of a person"""
obj Person {
    has name: str;
    has dob: str;
    has ssn: str;
}

sem Person.name = "Full name of the person";
sem Person.dob = "Date of Birth";
sem Person.ssn = "Last four digits of the Social Security Number of a person";

"""Calculate eligibility for various services based on person's data."""
def check_eligibility(person: Person, service_type: str) -> bool by llm();

Docstrings naturally enhance the semantics of their associated code constructs, while the sem keyword provides an elegant way to enrich the meaning of class attributes and function arguments. Our research shows these concise semantic strings are more effective than traditional multi-line prompts.

Configuration

Project-wide Configuration (jac.toml)

Configure byLLM behavior globally using jac.toml:

[plugins.byllm]
system_prompt = "You are a helpful assistant..."

[plugins.byllm.model]
default_model = "gpt-4o-mini"

[plugins.byllm.call_params]
temperature = 0.7

This enables centralized control over:

  • System prompts across all LLM calls
  • Default model selection
  • Common parameters like temperature

Custom Model Endpoints

Connect to custom or self-hosted models:

import from byllm.lib { Model }

glob llm = Model(
    model_name="custom-model",
    config={
        "api_base": "https://your-endpoint.com/v1/chat/completions",
        "api_key": "your_key",
        "http_client": True
    }
);

How well does byLLM work?

byLLM is built on the underlying principles of Meaning Typed Programming, and we have evaluated it against two comparable AI integration frameworks for Python, DSPy and LMQL. byLLM shows significant performance gains over LMQL and on-par or better performance than DSPy, while reducing developer complexity by up to 10x.

Full Documentation: Jac byLLM Documentation


Research: The research journey of MTP is available on arXiv, and the work was accepted at OOPSLA 2025.


Contributing

We welcome contributions to byLLM! Whether you're fixing bugs, improving documentation, or adding new features, your help is appreciated.

Areas we actively seek contributions:

  • Bug fixes and improvements
  • Documentation enhancements
  • New examples and tutorials
  • Test cases and benchmarks

Please see our Contributing Guide for detailed instructions.

If you find a bug or have a feature request, please open an issue.

Community

Join our vibrant community:

License

This project is licensed under the MIT License.

Third-Party Dependencies

byLLM integrates with various LLM providers (OpenAI, Anthropic, Google, etc.) through LiteLLM.

Cite our research

Jayanaka L. Dantanarayana, Yiping Kang, Kugesan Sivasothynathan, Christopher Clarke, Baichuan Li, Savini Kashmira, Krisztian Flautner, Lingjia Tang, and Jason Mars. 2025. MTP: A Meaning-Typed Language Abstraction for AI-Integrated Programming. Proc. ACM Program. Lang. 9, OOPSLA2, Article 314 (October 2025), 29 pages. https://doi.org/10.1145/3763092

Jaseci Contributors
