
Nano Llama

Project description

nanollama32

A compact and efficient single-file implementation of Llama 3.2 with minimal dependencies: no transformers library required, even for tokenization.

Overview

nanollama32 provides a lightweight and straightforward implementation of the Llama model. It features:

  • Minimal dependencies
  • Easy-to-use interface
  • Efficient performance suitable for various applications

Quick Start

To get started, install the package from PyPI:

pip install nanollama

Alternatively, clone this repository and install its dependencies manually.

Here’s a quick example of how to use nanollama32:

>>> from nanollama32 import Chat

# Initialize the chat instance
>>> chat = Chat()

# Start a conversation
>>> chat("What's the weather like in Busan?")
# Llama responds with information about the weather

# Follow-up question that builds on the previous context
>>> chat("And how about the temperature?")
# Llama responds with the temperature, remembering the previous context

# Another follow-up, further utilizing context
>>> chat("What should I wear?")
# Llama suggests clothing based on the previous responses

Command-Line Interface

You can also run nanollama32 from the command line:

nlm how to create a new conda env
# Llama responds with ways to create a new conda environment and prompts for follow-up questions

Managing Chat History

  • --history: Specify the path to the JSON file where chat history will be saved and/or loaded from. If the file does not exist, a new one will be created.
  • --resume: Use this option to resume the conversation from a specific point in the chat history.

For example, you can specify 0 to resume from the most recent entry:

nlm "and to list envs?" --resume 0

Or, you can resume from a specific entry in history:

nlm "and to delete env?" --resume 20241026053144
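Assuming the two options can be combined, you can also keep a dedicated history file per topic and continue from its most recent entry (the path below is illustrative):

```shell
# Save this conversation to its own history file and resume from the latest entry
nlm "and how do I activate it?" --history ~/chats/conda.json --resume 0
```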

Adding Text from Files

You can include text from any number of external files by using the {...} syntax in your input. For example, if you have a text file named langref.rst, you can include its content in your input like this:

nlm to create reddit bots {langref.rst}
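Conceptually, this amounts to substituting each placeholder with the named file's contents before the prompt is sent to the model. A minimal sketch of such an expansion (a hypothetical helper for illustration, not nanollama32's actual code):

```python
import re
from pathlib import Path

def expand_files(prompt: str) -> str:
    """Replace each {path} placeholder in the prompt with that file's contents.

    Hypothetical illustration of the {...} syntax; not nanollama32's real implementation.
    """
    return re.sub(
        r"\{([^{}]+)\}",                         # match a {path} placeholder
        lambda m: Path(m.group(1)).read_text(),  # splice in the file's text
        prompt,
    )
```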

License

This project is licensed under the MIT License. See the LICENSE file for more details.

Acknowledgements

This project builds upon the MLX implementation and Karpathy's llm.c implementation of the Llama model. Special thanks to the contributors of both projects for their outstanding work and inspiration.

Contributing

Contributions are welcome! Feel free to submit issues or pull requests.

Download files

Download the file for your platform.

Source Distribution

nanollama-0.0.3b0.tar.gz (7.9 kB)


Built Distribution


nanollama-0.0.3b0-py3-none-any.whl (8.1 kB)


File details

Details for the file nanollama-0.0.3b0.tar.gz.

File metadata

  • Download URL: nanollama-0.0.3b0.tar.gz
  • Size: 7.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for nanollama-0.0.3b0.tar.gz:

  • SHA256: 9dcf85c5d84dba804b6174e6933bcf5aad0186e3f06603bc3804f8ff43c7addb
  • MD5: 9aa743b9f789df0d6e34b6fe4dd8ee04
  • BLAKE2b-256: 5d54f1779579ef59a0928ae21536abecd2a34a9ae1d0c9be4976296a1ec5038d

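You can verify a downloaded file against the published digests above; here is a minimal sketch using Python's standard hashlib:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in 8 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the published digest, e.g. for the sdist:
# sha256_of("nanollama-0.0.3b0.tar.gz") should equal the SHA256 value listed above
```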

File details

Details for the file nanollama-0.0.3b0-py3-none-any.whl.

File metadata

  • Download URL: nanollama-0.0.3b0-py3-none-any.whl
  • Size: 8.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for nanollama-0.0.3b0-py3-none-any.whl:

  • SHA256: c33b60d4215dee374738d6aefa361aeee370a9f97a9f3c241f1685b0f0df19f7
  • MD5: 985781e842c4998bb5df2e95e6b2f217
  • BLAKE2b-256: 29debc124d56cd782559b6cbf780749d7d47a4cb484834c03e9b18612e0c1a93

