llama-index-llms-groq

Project description

LlamaIndex Llms Integration: Groq

Welcome to Groq! 🚀 At Groq, we've developed the world's first Language Processing Unit™, or LPU. The Groq LPU has a deterministic, single core streaming architecture that sets the standard for GenAI inference speed with predictable and repeatable performance for any given workload.

Beyond the architecture, our software is designed to empower developers like you with the tools you need to create innovative, powerful AI applications. With Groq as your engine, you can:

  • Achieve uncompromised low latency and performance for real-time AI and HPC inference 🔥
  • Know the exact performance and compute time for any given workload 🔮
  • Take advantage of our cutting-edge technology to stay ahead of the competition 💪

Want more Groq? Check out our website for more resources and join our Discord community to connect with our developers!

Develop

To create a development environment, install Poetry, then run:

poetry install --with dev

Testing

To test the integration, first enter the poetry venv:

poetry shell

Tests can then be run with make:

make test

Integration tests

Integration tests are skipped unless an API key is provided. API keys can be created at the Groq console. Once created, store the API key in an environment variable and run the tests:

export GROQ_API_KEY=<your key here>
make test

Linting and Formatting

Linting and code formatting can be executed with make:

make format
make lint

Project details


Download files

Download the file for your platform.

Source Distribution

llama_index_llms_groq-0.5.0.tar.gz (4.2 kB)

Built Distribution

llama_index_llms_groq-0.5.0-py3-none-any.whl (3.7 kB)

File details

Details for the file llama_index_llms_groq-0.5.0.tar.gz.

File metadata

  • Download URL: llama_index_llms_groq-0.5.0.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 {"installer":{"name":"uv","version":"0.10.9","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_llms_groq-0.5.0.tar.gz:

  • SHA256: 54754dda36a7a9eed4c3650b6af2a3c7ebd69481de0b8fc11de2811e2d8ab938
  • MD5: bb52c241a5d895ed0e7e4d391b4710e5
  • BLAKE2b-256: ee2cc5b71db6f81e6a65a435a40de81a0f3ea96040e8acb131243b3352d70529

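To check a downloaded archive against the published digest, something like the following works (a sketch assuming GNU coreutils `sha256sum`; run it in the directory containing the download):

```shell
# Published SHA256 digest for the sdist, copied from the listing above.
expected="54754dda36a7a9eed4c3650b6af2a3c7ebd69481de0b8fc11de2811e2d8ab938"
file="llama_index_llms_groq-0.5.0.tar.gz"

if [ -f "$file" ]; then
  # sha256sum --check compares the file's digest against the expected value.
  echo "$expected  $file" | sha256sum --check
else
  echo "download $file first"
fi
```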

File details

Details for the file llama_index_llms_groq-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: llama_index_llms_groq-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 3.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 {"installer":{"name":"uv","version":"0.10.9","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for llama_index_llms_groq-0.5.0-py3-none-any.whl:

  • SHA256: b1b2df6541e716a9688c856b036917ffac93454ffa1efe80b8ac6eda61cf02ae
  • MD5: 7fd3ad5dd3e715c359301330a36cd208
  • BLAKE2b-256: 0f379bf6a201f19bebf153a77612c2e7947713d94bcbf14ee96ad173cd4664dd

