
Simple inference for large language models


Language Models


Python building blocks to explore large language models on any computer with 512MB of RAM


[Image: translation "hello world" example]

Target Audience

This package is designed to be as simple as possible for learners and educators exploring how large language models intersect with modern software development. Its interfaces are all simple functions that use standard types. The complexity of large language models is hidden from view while still providing free local inference using lightweight, open models. All included models are free for educational use, no API keys are required, and all inference is performed locally by default.

Installation and Getting Started

This package can be installed using the following command:

pip install languagemodels

Once installed, you should be able to interact with the package in Python as follows:

>>> import languagemodels as lm
>>> lm.do("What color is the sky?")
'The color of the sky is blue.'

This will require downloading a significant amount of data (~250MB) on the first run. Models will be cached for later use and subsequent calls should be quick.

Example Usage

Here are some usage examples shown as Python REPL sessions. These should work in the REPL, in notebooks, or in traditional scripts and applications.
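
For instance, the same call can be used in a standalone script. This is a minimal sketch using only the do function shown above (the file name is arbitrary):

# sky.py - a minimal script using languagemodels outside the REPL
import languagemodels as lm

# do() takes a plain string prompt and returns a plain string answer
answer = lm.do("What color is the sky?")
print(answer)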

Text Completions

>>> import languagemodels as lm

>>> lm.complete("She hid in her room until")
'she was sure she was safe'

Instruction Following

>>> import languagemodels as lm

>>> lm.do("Translate to English: Hola, mundo!")
'Hello, world!'

>>> lm.do("What is the capital of France?")
'Paris.'
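
Because do accepts and returns ordinary strings, it composes naturally with regular Python code. The loop below is a small sketch; the phrase list is made up for illustration:

import languagemodels as lm

# Hypothetical input phrases
phrases = ["Hola, mundo!", "Bonjour le monde!"]

for phrase in phrases:
    # Each call returns a plain string translation
    print(lm.do(f"Translate to English: {phrase}"))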

Chat

>>> lm.chat('''
...      System: Respond as a helpful assistant.
...
...      User: What time is it?
...
...      Assistant:
...      ''')
"I'm sorry, but as an AI language model, I don't have access to real-time information. Please provide me with the specific time you are asking for so that I can assist you better."

External Retrieval

Helper functions are provided to retrieve text from external sources that can be used to augment prompt context.

>>> import languagemodels as lm

>>> lm.get_wiki('Chemistry')
'Chemistry is the scientific study...'

>>> lm.get_weather(41.8, -87.6)
'Partly cloudy with a chance of rain...'

>>> lm.get_date()
'Friday, May 12, 2023 at 09:27AM'

Here's an example showing how this can be used (compare to the previous chat example):

>>> lm.chat(f'''
...      System: Respond as a helpful assistant. It is {lm.get_date()}
...
...      User: What time is it?
...
...      Assistant:
...      ''')
'It is currently Wednesday, June 07, 2023 at 12:53PM.'
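
The same pattern works with the other helpers and with do. The sketch below augments a prompt with Wikipedia text; the topic and question are arbitrary examples:

import languagemodels as lm

# Retrieve background text to ground the answer
context = lm.get_wiki("Chemistry")

prompt = f"Answer using the context below.\n\nContext: {context}\n\nQuestion: What does chemistry study?"
print(lm.do(prompt))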

Semantic Search

Semantic search is included to retrieve documents from a document store that may supply helpful context.

>>> import languagemodels as lm

>>> lm.store_doc("Mars is a planet")
>>> lm.store_doc("The sun is hot")
>>> lm.load_doc("What is Mars?")
'Mars is a planet'

This can also be used to get a blend of context from stored documents:

>>> import languagemodels as lm
>>> lm.store_doc(lm.get_wiki("Python"), "Python")
>>> lm.store_doc(lm.get_wiki("C language"), "C")
>>> lm.store_doc(lm.get_wiki("Javascript"), "Javascript")
>>> lm.get_doc_context("What does it mean for batteries to be included in a language?")
'Python: It is often described as a "batteries included" language due to its comprehensive standard library.Guido van Rossum began working on Python in the late 1980s as a successor to the ABC programming language and first released it in 1991 as Python 0.9.

C: It was designed to be compiled to provide low-level access to memory and language constructs that map efficiently to machine instructions, all with minimal runtime support.

C: The book The C Programming Language, co-authored by the original language designer, served for many years as the de facto standard for the language.'
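
Combining get_doc_context with do gives a simple form of document question answering. This is a sketch built only from the functions shown above:

import languagemodels as lm

# Store a few documents to search over
lm.store_doc(lm.get_wiki("Python"), "Python")
lm.store_doc(lm.get_wiki("C language"), "C")

question = "What does it mean for batteries to be included in a language?"

# Retrieve a blend of relevant context, then pass it to the model
context = lm.get_doc_context(question)
print(lm.do(f"Context: {context}\n\nQuestion: {question}"))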

Performance

The models used by this package are 1000x smaller than the largest models in use today. They are useful as learning tools, but if you are expecting ChatGPT or similar performance, you will be very disappointed.

The base model should work on any system with 512MB of memory, but this memory limit can be increased to load larger models. Setting the limit higher will use more memory and generate results more slowly, but the results should be superior. Here's an example:

>>> import languagemodels as lm
>>> lm.do("If I have 7 apples then eat 5, how many apples do I have?")
'You have 8 apples.'
>>> lm.set_max_ram('4gb')
4.0
>>> lm.do("If I have 7 apples then eat 5, how many apples do I have?")
'I have 2 apples left.'

Full documentation

Commercial Use

This package itself is licensed for commercial use, but the models used may not be compatible with commercial use. In order to use this package commercially, you may want to filter models by their license type using the require_model_license function.

>>> import languagemodels as lm
>>> lm.do("What is your favorite animal.")
>>> "As an AI language model, I don't have preferences or emotions."
>>> lm.require_model_license("apache.*|mit")
>>> lm.do("What is your favorite animal.")
'Lion.'

The commercially licensed models may not perform as well as the default models. It is recommended to confirm that the models you use meet the licensing requirements for your software.

Project Ideas

This package can be used to do the heavy lifting for a number of learning projects:

  • CLI Chatbot (see examples/chat.py)
  • Streamlit chatbot (see examples/streamlitchat.py)
  • Chatbot with information retrieval
  • Chatbot with access to real-time information
  • Tool use
  • Text classification
  • Extractive question answering
  • Semantic search over documents
  • Document question answering

Several example programs and notebooks are included in the examples directory.
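
As a starting point for the chatbot ideas above, here is a minimal command-line loop that combines chat with the get_date helper. It is only a sketch (each turn is handled independently, with no conversation history):

import languagemodels as lm

while True:
    user_input = input("You: ")
    if not user_input:
        break

    # Build a prompt in the format shown in the Chat section above
    response = lm.chat(f"""
        System: Respond as a helpful assistant. It is {lm.get_date()}

        User: {user_input}

        Assistant:
        """)
    print("Assistant:", response)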

Attribution
