# Transformer-based NLP Models

A repository for building transformer-based NLP models.
## Run Llama2 on a consumer CPU

### Run the chat UI on CPU

```shell
cd pipelines/nlp_models/
streamlit run app.py
```

### Run chat from the command line on CPU

```shell
llm_app chat -s 'hi there'
```
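The command-line chat above boils down to a read-generate loop. Here is a minimal sketch in plain Python; `generate` is a hypothetical stub standing in for the quantized Llama2 backend that the real `llm_app` CLI would call:

```python
# Minimal chat-loop sketch; `generate` is a hypothetical stand-in for
# the quantized Llama2 backend a real chat app would call.
def generate(prompt: str) -> str:
    # Placeholder response; a real backend runs the language model here.
    return f"model reply to: {prompt!r}"

def chat(message: str, history: list) -> str:
    """Send one message, record the exchange, and return the reply."""
    reply = generate(message)
    history.append((message, reply))
    return reply

history = []
print(chat("hi there", history))  # prints the stub's reply
```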
## Models

- `bert_classifier`: a wrapper package around BERT-based classification models
- `multi_task_model`: an implementation of a multi-task model built on encoder models
- GPT-2
- Falcon 7B
- Quantized Llama2 models
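To illustrate the idea behind `multi_task_model`, here is a small NumPy sketch of one shared encoder feeding two task-specific heads. All layer sizes and weights are invented for the example; the package itself builds on transformer encoders rather than a single linear layer:

```python
import numpy as np

# Illustrative multi-task layout: one shared encoder, several heads.
rng = np.random.default_rng(0)

d_in, d_hidden = 16, 8
W_enc = rng.normal(size=(d_in, d_hidden))  # shared encoder weights
W_cls = rng.normal(size=(d_hidden, 3))     # head 1: 3-way classification
W_reg = rng.normal(size=(d_hidden, 1))     # head 2: scalar regression

def forward(x: np.ndarray) -> tuple:
    h = np.tanh(x @ W_enc)        # shared representation
    return h @ W_cls, h @ W_reg   # per-task outputs

x = rng.normal(size=(4, d_in))    # batch of 4 inputs
logits, value = forward(x)
print(logits.shape, value.shape)  # (4, 3) (4, 1)
```

Both heads are trained jointly against their own losses, so the shared encoder learns features useful for every task.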
## Installation

### Install from PyPI

```shell
pip install nlp-models
```

### Install from source

```shell
git clone git@github.com:minggnim/nlp-models.git
pip install -r requirements
```