
cLLM

cLLM is an open-source library built on top of llama-cpp-python and llama.cpp. It provides both low-level and high-level APIs so developers can work with llama.cpp models in a more Pythonic way.
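
As a rough illustration of the kind of low-level access cLLM builds on, the sketch below calls the underlying llama-cpp-python API directly; the model path is a placeholder, and cLLM's own high-level interface may differ.

```python
# Minimal sketch using the underlying llama-cpp-python API that cLLM wraps.
# The model path is a placeholder; cLLM's own interfaces may differ.
from llama_cpp import Llama

llm = Llama(
    model_path="models/model.gguf",  # local GGUF model file (placeholder)
    n_ctx=2048,                      # context window size
)

output = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,
    stop=["\n"],                     # stop generating at the end of the line
)
print(output["choices"][0]["text"])
```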

Features 🔮

  • llama.cpp GGML backend: cLLM is built on the C++ llama.cpp framework (GGML), which provides efficient inference performance.

  • EasyDeL platform: the provided open-source models have been trained with the EasyDeL platform, ensuring high-quality and accurate assistance.

  • Customized Models: Users can access models customized for their specific needs, such as coding assistance, grammar correction, and more.

  • OpenAI-compatible API: the API planned for upcoming versions will follow the structure of the OpenAI API (see the sketch below).
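
Since the OpenAI-compatible API is only planned, the following is a hypothetical sketch of what such a call could look like from the official OpenAI Python client; the server address, port, and model name are assumptions, not the released cLLM interface.

```python
# Hypothetical sketch of calling a future OpenAI-compatible cLLM server.
# The base_url, port, and model name are assumptions; the actual endpoint
# may differ once this feature is released.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local server address
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",                  # placeholder model identifier
    messages=[{"role": "user", "content": "Fix the grammar: 'he go home'"}],
)
print(response.choices[0].message.content)
```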

Installation with Specific Hardware Acceleration (BLAS, CUDA, Metal, etc.)

[!TIP] By default, llama.cpp is built for CPU only on Linux and Windows and uses Metal on macOS. However, llama.cpp supports several hardware acceleration backends, including OpenBLAS, cuBLAS, CLBlast, hipBLAS, and Metal.

To install with a specific hardware acceleration backend, you can set the CMAKE_ARGS environment variable before installing. Here are the instructions for different backends:

Building for OpenBLAS

CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install cLLM-python

Building for cuBLAS

CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install cLLM-python

Building for Metal

CMAKE_ARGS="-DLLAMA_METAL=on" pip install cLLM-python

Building for CLBlast

CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install cLLM-python

Building for hipBLAS

CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install cLLM-python

Set the CMAKE_ARGS environment variable according to your hardware acceleration requirements before installing, so that the bundled llama.cpp is built with the appropriate backend.
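
Once installed with a GPU backend such as cuBLAS or Metal, model layers can be offloaded to the accelerator through the underlying llama-cpp-python bindings. This is a sketch only: the model path is a placeholder, and n_gpu_layers=-1 simply requests that as many layers as possible be offloaded.

```python
# Sketch: load a GGUF model with GPU offload after a cuBLAS/Metal build.
# The model path is a placeholder; adjust n_gpu_layers to fit your VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="models/model.gguf",
    n_gpu_layers=-1,   # offload all layers to the GPU (-1 = as many as possible)
    n_ctx=4096,
)
```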

Contributing

If you would like to contribute to cLLM, please follow the guidelines outlined in the CONTRIBUTING.md file in the repository.

License

cLLM is licensed under the MIT License. See the LICENSE.md file for more details.

Support

For any questions or issues, please get in touch with me at erfanzare810@gmail.com.

Thank you for using cLLM! We hope it helps you run language models on your own computer.
