
omh




Another way to program a machine

When programming a machine to achieve a specific goal, we need to communicate our intent to it. Typically, this involves designing a detailed procedure and writing precise instructions in a programming language like Python.

However, some tasks are challenging to describe with step-by-step procedures because we often perform them intuitively without fully understanding the process. For example:

  • Determining the sentiment of a given sentence.
  • Identifying dishes mentioned in a food review.

omh introduces an alternative way to express our intentions to machines. Instead of relying on imperative procedures, it uses human language instructions and examples. Under the hood, it leverages language models and few-shot learning techniques to interpret and execute these instructions.
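Conceptually, the instruction and examples are assembled into a few-shot prompt that is sent to the language model. The sketch below illustrates that general idea; it is only an illustration, not omh's actual prompt format:

```python
def build_prompt(instruction, examples, user_input):
    # Assemble a few-shot prompt: the instruction first, then each
    # example as an Input/Output pair, then the new input awaiting
    # the model's completion.
    lines = [instruction]
    for ex_in, ex_out in examples:
        if ex_in is not None:
            lines.append(f"Input: {ex_in}")
        lines.append(f"Output: {ex_out}")
    if user_input is not None:
        lines.append(f"Input: {user_input}")
    lines.append("Output:")
    return "\n".join(lines)

print(build_prompt("Given a username, say welcome to the user.",
                   [("John", "Welcome, John!")], "Bob"))
```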

Getting Started

pip install omh

Hello World

from openai import OpenAI

from omh import def_fn
from omh.impl.openai import GptApiOptions

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction('Print Hello World')
@def_fn.with_example(output="Hello World")
@def_fn.with_default_args('Print Hello World')
def hello_world():
    # any code you write here will be ignored.
    pass


out = hello_world()
print(out.value)
# Hello World

Quick Tutorial

Define a function

Use decorators from omh's def_fn module to define a few-shot-learning-based function. Currently, every function definition should include at least the following decorators in its decorator chain:

  • An implementation selector such as def_fn.openai() must always be at the top of the decorator chain. It takes the OpenAI client object and any custom inference parameters, for example: def_fn.openai(client=openai_client, opts=GptApiOptions(temperature=1.0)).
  • Use def_fn.with_instruction(instruction: str) to provide instructions.
  • Use def_fn.with_example(input, output) to provide examples.

def_fn.openai

from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction('Print Hello World')
@def_fn.with_example(output="Hello World")
@def_fn.with_default_args('Print Hello World')
def hello_world():
    pass


out = hello_world()
print(out.value)
# Hello World

def_fn.with_example

from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction("Given a username, say welcome to the user.")
@def_fn.with_example(input='John', output="Welcome, John!")
@def_fn.with_default_args('New User')
def greet_new_user():
    # any code you write here will be ignored.
    pass

from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction("Extract all companies' tickers mentioned in the news title.")
@def_fn.with_example(
    input="Why Kroger, Albertsons need to merge immediately to compete with Walmart",
    output="$KR, $ACI, $WMT",
)
def find_tickers():
    # any code you write here will be ignored.
    pass
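
In the example above, the tickers come back as a single comma-separated string. If you prefer a Python list, a small post-processing helper (not part of omh, just an illustration) can split the output:

```python
def parse_tickers(raw: str) -> list[str]:
    # Split the comma-separated ticker string produced by find_tickers
    # and strip surrounding whitespace from each symbol.
    return [t.strip() for t in raw.split(',') if t.strip()]

print(parse_tickers("$KR, $ACI, $WMT"))
# ['$KR', '$ACI', '$WMT']
```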
from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction(
    "Extract all Wikipedia entities mentioned in the text and format them as JSON as follows: [{name: '', url: ''}].")
@def_fn.with_example(
    input="An analog computer or analogue computer is a type of computer that uses the continuous variation "
          "aspect of physical phenomena such as electrical, mechanical, or hydraulic quantities (analog signals) "
          "to model the problem being solved.",
    output=[
        {
            "name": "computer",
            "url": "https://en.wikipedia.org/wiki/Computation",

        },
        {
            "name": "electrical",
            "url": "https://en.wikipedia.org/wiki/Electrical_network",
        },
        {
            "name": "mechanical",
            "url": "https://en.wikipedia.org/wiki/Mechanics",
        },
        {
            "name": "hydraulic",
            "url": "https://en.wikipedia.org/wiki/Hydraulics",
        },
        {
            "name": "analog signals",
            "url": "https://en.wikipedia.org/wiki/Analog_signal",
        }
    ]
)
def extract_wiki_links(text):
    # any code you write here will be ignored.
    pass


out = extract_wiki_links(
    "The term machine learning was coined in 1959 by Arthur Samuel, an IBM employee and pioneer in the field of computer gaming and artificial intelligence.")

# To access the raw text output from the LLM:
print(out.raw())

# To access the parsed output. Note this may fail if the language
# model backend failed to produce valid JSON output:
print(out.value)
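
Because the model may occasionally emit malformed JSON, a defensive pattern is to parse the raw text yourself and fall back gracefully. The helper below is a sketch (it assumes out.raw() returns the raw model text, as shown above):

```python
import json

def safe_parse(raw: str):
    # Try to parse the raw LLM output as JSON; return None on failure
    # instead of raising, so callers can retry or fall back.
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return None

print(safe_parse('[{"name": "computer", "url": "https://en.wikipedia.org/wiki/Computation"}]'))
print(safe_parse('not json'))
# None
```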

def_fn.with_default_args

from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction("Given a username, say welcome to the user.")
@def_fn.with_example(input='John', output="Welcome, John!")
@def_fn.with_default_args('New User')
def greet_new_user():
    # any code you write here will be ignored.
    pass


out = greet_new_user()
print(out.value)
# Welcome, New User!

out = greet_new_user('Bob')
print(out.value)
# Welcome, Bob!

def_fn.require_args

You can use require_args to declare the required keyword arguments for your function; the function will raise an exception when any of them is missing from kwargs.

from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction("Given a username, say welcome to the user.")
@def_fn.with_example(input={'username': 'John'}, output="Welcome, John!")
@def_fn.require_args('username')
def greet_new_user():
    # any code you write here will be ignored.
    pass


out = greet_new_user(username='Bob')
print(out.value)
# Welcome, Bob!

greet_new_user(user='Alice')
# This line will raise an error because the keyword argument `username` is required.
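
Conceptually, require_args performs a check similar to the sketch below. This is a simplified illustration of the idea, not omh's actual implementation:

```python
def require_args(*required):
    # Decorator factory: wrap a function so it raises if any of the
    # named keyword arguments is missing from the call.
    def decorator(fn):
        def wrapper(*args, **kwargs):
            missing = [name for name in required if name not in kwargs]
            if missing:
                raise TypeError(f"missing required keyword arguments: {missing}")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@require_args('username')
def greet(**kwargs):
    return f"Welcome, {kwargs['username']}!"

print(greet(username='Bob'))
# Welcome, Bob!
```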

Using the output

Calling an omh function returns a wrapper object rather than a plain value: use out.value to access the parsed output and out.raw() to access the raw text returned by the language model.

out = greet_new_user(username='Bob')
print(out.value)

Batch Mode

Most language model backends nowadays offer a batch mode with significant cost savings if you can afford a delayed response.

You can call an omh function in batch mode to take advantage of this: omh functions' batch mode offers an easy interface to submit a batch job, check its status, and fetch the remote response.

from omh import def_fn
from omh.impl.openai import GptApiOptions
from openai import OpenAI

openai_client = OpenAI(api_key='**********************')

my_gpt_opts = GptApiOptions(temperature=0)


@def_fn.openai(client=openai_client, opts=my_gpt_opts)
@def_fn.with_instruction(
    "Extract all Wikipedia entities mentioned in the text and format them as JSON as follows: [{name: '', url: ''}].")
@def_fn.with_example(
    input="An analog computer or analogue computer is a type of computer that uses the continuous variation "
          "aspect of physical phenomena such as electrical, mechanical, or hydraulic quantities (analog signals) "
          "to model the problem being solved.",
    output=[
        {
            "name": "computer",
            "url": "https://en.wikipedia.org/wiki/Computation",

        },
        {
            "name": "electrical",
            "url": "https://en.wikipedia.org/wiki/Electrical_network",
        },
        {
            "name": "mechanical",
            "url": "https://en.wikipedia.org/wiki/Mechanics",
        },
        {
            "name": "hydraulic",
            "url": "https://en.wikipedia.org/wiki/Hydraulics",
        },
        {
            "name": "analog signals",
            "url": "https://en.wikipedia.org/wiki/Analog_signal",
        }
    ]
)
def extract_wiki_links(text):
    # any code you write here will be ignored.
    pass

######################
# Create a batch job
######################

batch = extract_wiki_links.create_batch('my-batch')
batch.add(
    "The term machine learning was coined in 1959 by Arthur Samuel, an IBM employee and pioneer in the field of computer gaming and artificial intelligence.")
batch.add(
    "Fermat and Lagrange found calculus-based formulae for identifying optima, while Newton and Gauss proposed iterative methods for moving towards an optimum.")

######################
# Submit the batch job to language model backend
######################
batch.start_batch()


######################
# Sync remote status, 
# batch.sync_remote() will fetch results if available.
######################
status = batch.sync_remote()
print(status)
print(status['openai']['batch_job'])

######################
# Iterate over the results:
# once the batch is completed, you can iterate through them using `iter_outputs`.
######################

for input_args, output in batch.iter_outputs():
    print(output)
