
A series of wrappers to allow for multiple AI model sources to behave as huggingface transformers models

Project description

The Universal Model Adapter

This package acts as an adapter between Huggingface Transformers and several different APIs. As of now, these are the Huggingface Inference API and the OpenAI Inference API.

This works through mock transformers.PreTrainedModel classes that share the same generate() method but make API calls on the backend. Several dev models are also available for mocking generation or performing debugging tasks.
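The idea can be illustrated with a simplified sketch. This is not the package's actual implementation, and `MockAPIModel` and `echo_api` are hypothetical names: it only shows the pattern of a wrapper that exposes a transformers-style generate() method while delegating to an API backend.

```python
# Hypothetical sketch of the adapter pattern (not the package's real classes):
# a wrapper exposing a transformers-style generate() that calls an API backend.

class MockAPIModel:
    """Stand-in for a transformers.PreTrainedModel backed by a remote API."""

    def __init__(self, model_name, api_call):
        self.model_name = model_name
        self._api_call = api_call  # callable: list[int] -> list[int]

    def generate(self, input_ids, **kwargs):
        # Mirrors the shape of PreTrainedModel.generate(): takes a batch of
        # token-id sequences and returns a batch of output sequences.
        return [self._api_call(seq) for seq in input_ids]


# A dummy "API" that simply echoes its input tokens, for demonstration.
def echo_api(tokens):
    return list(tokens)


model = MockAPIModel("example/model", echo_api)
print(model.generate([[1, 2, 3]])[0])  # -> [1, 2, 3]
```

Because callers only ever touch generate(), swapping the backend (local weights, Huggingface Inference API, OpenAI API) leaves the calling code unchanged.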

Use Case

This package is best suited to projects that use multiple model sources interchangeably, such as a project that generates text with both Huggingface models and GPT models from OpenAI's API. In these projects, a unified generation interface greatly simplifies the code.

Fine-Grained Source Control

An advantage of this package is that it can either automatically resolve a model's source from its name or accept the source (OpenAI, Huggingface, etc.) explicitly, via an extra parameter to the pretrained_from_...() methods. For example:

from universalmodels.interface import pretrained_from_name, ModelSrc

model_name = "mistralai/Mistral-7B-v0.1"
# This will automatically resolve the model's source to 
# a local Huggingface transformers model (ModelSrc.HF_LOCAL)
local_model, tokenizer = pretrained_from_name(model_name, model_src=ModelSrc.AUTO)

# This will attempt to start the FastChat service and run 
# a local instance of the OpenAI API to run optimized generation
fschat_model, tokenizer = pretrained_from_name(model_name, model_src=ModelSrc.OPENAI_API)

# This will create a mock model without any generation logic attached.
# This is useful for when the shell of a model is needed as a reference.
# This option does not load any local models into memory or activate FastChat.
mock_model, tokenizer = pretrained_from_name(model_name, model_src=ModelSrc.NO_LOAD)
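One plausible way the AUTO resolution above could work is a simple name-prefix check. This is purely illustrative; the package's actual resolution logic may differ, and `resolve_source` is a hypothetical helper (only the ModelSrc member names shown above are taken from the examples).

```python
# Illustrative only: a name-prefix heuristic for resolving a model's source.
from enum import Enum


class ModelSrc(Enum):
    AUTO = "auto"
    HF_LOCAL = "hf_local"
    OPENAI_API = "openai_api"
    NO_LOAD = "no_load"


def resolve_source(model_name: str, model_src: ModelSrc) -> ModelSrc:
    if model_src is not ModelSrc.AUTO:
        return model_src  # an explicit choice always wins
    # Otherwise, guess from the model name
    if model_name.startswith("openai/"):
        return ModelSrc.OPENAI_API
    return ModelSrc.HF_LOCAL


print(resolve_source("openai/gpt-3.5", ModelSrc.AUTO))            # -> ModelSrc.OPENAI_API
print(resolve_source("mistralai/Mistral-7B-v0.1", ModelSrc.AUTO)) # -> ModelSrc.HF_LOCAL
```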

Quick Start

Installing from PyPI

pip3 install "universalmodels[fastchat]"

Installing from Source

git clone https://github.com/matthew-pisano/UniversalModels
cd UniversalModels
pip3 install -e ".[fastchat]"

Installing the fastchat extra enables support for running FastChat on compatible, locally installed Huggingface models. See the FastChat supported models list for more information on which models are supported.

Example Usage

In the following example, note that the interfaces for the Huggingface and OpenAI models are identical. This is the primary benefit of using this package.

import torch
from universalmodels import pretrained_from_name
from universalmodels.constants import set_seed

# Set the global seed to encourage deterministic generation 
# NOTE: DOES NOT affect OpenAI API models
set_seed(42)

# Huggingface model example
hf_model_name = "mistralai/Mixtral-8x7B-Instruct-v0.1"
hf_model, hf_tokenizer = pretrained_from_name(hf_model_name)

hf_tokens = hf_tokenizer.encode("Repeat the following: 'Hello there from a huggingface model'")
hf_resp_tokens = hf_model.generate(torch.tensor([hf_tokens]))[0]
hf_response = hf_tokenizer.decode(hf_resp_tokens)
print(hf_response)

# OpenAI model example
oai_model_name = "openai/gpt-3.5"
oai_model, oai_tokenizer = pretrained_from_name(oai_model_name)

oai_tokens = oai_tokenizer.encode("Repeat the following: 'Hello there from an openai model'")
oai_resp_tokens = oai_model.generate(torch.tensor([oai_tokens]))[0]
oai_response = oai_tokenizer.decode(oai_resp_tokens)
print(oai_response)

[!IMPORTANT] Make sure your API keys are set for OpenAI and Huggingface before using models that require them!
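One common way to provide the keys is through environment variables. The variable names below are assumptions: OPENAI_API_KEY is the OpenAI SDK's standard variable and HF_TOKEN is commonly read by Hugging Face tooling, but check this package's documentation for the exact names it expects. The key values shown are placeholders.

```python
import os

# Assumed variable names (verify against the package's docs):
#   OPENAI_API_KEY - read by the OpenAI client library
#   HF_TOKEN       - commonly read by Hugging Face tooling
# setdefault() keeps any value already exported in your shell.
os.environ.setdefault("OPENAI_API_KEY", "sk-placeholder")
os.environ.setdefault("HF_TOKEN", "hf_placeholder")
```

Exporting these in your shell profile instead keeps secrets out of source code.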

Project details


Download files

Download the file for your platform.

Source Distribution

universalmodels-0.0.6.tar.gz (16.7 kB)

Uploaded Source

Built Distribution

universalmodels-0.0.6-py3-none-any.whl (16.0 kB)

Uploaded Python 3

File details

Details for the file universalmodels-0.0.6.tar.gz.

File metadata

  • Download URL: universalmodels-0.0.6.tar.gz
  • Size: 16.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes

Hashes for universalmodels-0.0.6.tar.gz
  • SHA256: 23e852104373013494134549a652e0f54f193527b0ea330fa7964963ac39bae5
  • MD5: 5572f9c4760f5881236ccfcad4717f85
  • BLAKE2b-256: 33128e1e291365d38889735b2ed447b56f1f2d3bf15fa9cb2c2fa50708ba5dac

File details

Details for the file universalmodels-0.0.6-py3-none-any.whl.

File metadata

File hashes

Hashes for universalmodels-0.0.6-py3-none-any.whl
  • SHA256: 4677828da5931eb6a939a58cebe5ba22bd77f7aa10cd3733dac71e24c428c719
  • MD5: 2274768cbad415b39c10340c927f1e8d
  • BLAKE2b-256: 36c7f8cec04c0ecc49a620bf4c5766b1507a27b9f3c65d86b23c4424459cd418
