A Python package to interact with Llama 3 locally using Ollama.

Llama3 Package

Overview

The Llama3 package allows you to interact with Meta's Llama 3 model locally using Ollama. The package automatically handles the installation and setup of Ollama and the Llama 3 model, allowing you to start using it with minimal effort.

Installation

Step 1: Install the Llama3 Package

You can install the Llama3 package using pip:

pip install llama3_package

Usage

The Llama3 package automatically installs Ollama, starts the Ollama server, pulls the Llama 3 model, and runs the model. You can interact with the model using the Llama3Model class.

Example

Here's a quick example to get you started:

from llama3 import Llama3Model

# Initialize the model
model = Llama3Model()

# Send a prompt to the model
response = model.prompt("5+5=")
print("Prompt Response:", response)

# Stream a response to a prompt, chunk by chunk
for chunk in model.stream_prompt("Tell me a joke"):
    print(chunk, end="", flush=True)

How It Works

  1. Automatic Installation of Ollama: If Ollama is not installed on your system, the package will automatically download and install it.
    • On Linux, it uses the command: curl -fsSL https://ollama.com/install.sh | sh
    • On macOS, it uses the command: brew install ollama
  2. Starting Ollama Server: The package starts the Ollama server in the background and verifies it is running.
  3. Pulling the Llama 3 Model: The package ensures the Llama 3 model is pulled and ready to use.
  4. Running the Model: Once the server is up and the model is pulled, prompts sent through the Llama3Model class are forwarded to the model via the running Ollama service.
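The steps above can be sketched roughly as follows. This is a simplified illustration, not the package's actual implementation; `ollama_install_command` and `ensure_ollama` are hypothetical names used here for clarity:

```python
import shutil
import subprocess
import sys


def ollama_install_command(platform):
    """Return the install command for Ollama on the given platform."""
    if platform.startswith("linux"):
        # Same command the package uses on Linux.
        return ["sh", "-c", "curl -fsSL https://ollama.com/install.sh | sh"]
    if platform == "darwin":
        # Same command the package uses on macOS.
        return ["brew", "install", "ollama"]
    raise RuntimeError("Unsupported platform: %s" % platform)


def ensure_ollama(model="llama3"):
    """Install Ollama if missing, start the server, and pull the model."""
    if shutil.which("ollama") is None:
        subprocess.run(ollama_install_command(sys.platform), check=True)
    # Start the server in the background; the package also verifies
    # that it is reachable before continuing.
    subprocess.Popen(["ollama", "serve"])
    subprocess.run(["ollama", "pull", model], check=True)
```

Because `ensure_ollama` shells out to `curl`, `brew`, and `ollama`, it only succeeds on a machine where those tools are available and the network is reachable.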

Configuration

You can configure the model using environment variables. For example, to use a different version of the Llama 3 model, you can set the LLAMA3_MODEL_NAME environment variable:

export LLAMA3_MODEL_NAME="llama3-70B"
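For illustration, the variable can be read the way the package might read it. This is a hypothetical sketch; `resolve_model_name` is not part of the package's public API:

```python
import os


def resolve_model_name(default="llama3"):
    """Return the model name from LLAMA3_MODEL_NAME, falling back to a default."""
    return os.environ.get("LLAMA3_MODEL_NAME", default)
```

Whatever value you set must correspond to a model tag that Ollama can actually pull.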

Troubleshooting

If you encounter any issues with the package, please ensure that:

  • You have an active internet connection for downloading and pulling the model.
  • Your system meets the requirements for running Ollama.

For further assistance, please open an issue on our GitHub repository.

Example Test Script

You can also use the following test script to verify the functionality:

import sys
import os
import unittest

# Make the package importable when running this script from the tests directory.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../src')))

from llama3.model import Llama3Model

class TestLlama3Model(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.model = Llama3Model()

    @classmethod
    def tearDownClass(cls):
        del cls.model

    def test_prompt(self):
        response = self.model.prompt("5+5=")
        print("Prompt Response:", response)
        self.assertIn("10", response.lower())

if __name__ == "__main__":
    unittest.main()

Contributing

We welcome contributions! Please see the CONTRIBUTING.md for more details.

License

This project is licensed under the MIT License.

Download files

Download the file for your platform.

Source Distribution

llama3_package-0.3.0.tar.gz (5.3 kB)

Uploaded Source

Built Distribution

llama3_package-0.3.0-py3-none-any.whl (4.9 kB)

Uploaded Python 3

File details

Details for the file llama3_package-0.3.0.tar.gz.

File metadata

  • Download URL: llama3_package-0.3.0.tar.gz
  • Upload date:
  • Size: 5.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.8.18

File hashes

Hashes for llama3_package-0.3.0.tar.gz:

  • SHA256: 457769b3307b71a8a756d6b4826d636e56291c4f8dd9f7a1e0c31feb637beb7e
  • MD5: 366556bd114054eca2c1957947fd32d7
  • BLAKE2b-256: d97a02a408a924ba9316eaea7a0c46ca2d4d2161d88eb305ed8c597a4bb3fad3

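To check a downloaded archive against the digests above, a short sketch using Python's standard hashlib module (the helper name `sha256_hex` is illustrative):

```python
import hashlib


def sha256_hex(path):
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

Compare the result against the SHA256 value listed above before installing from a downloaded archive.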

File details

Details for the file llama3_package-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for llama3_package-0.3.0-py3-none-any.whl:

  • SHA256: ebec6fd963b04b3cbf8549bd23f4516c2a3c985c7ee51ba9188fa6ff92b5e76f
  • MD5: 64dd1e69c5fdd5bc3db01d929f01453a
  • BLAKE2b-256: 3adcf1c26749c77c12f07034be46f2f9060c571a231e5d10bb6d7e6d83b0d998

