Transformer-based NLP models
NLP Models
A repository for building transformer-based NLP models
Installation
Install from PyPI
pip install nlp-models
Install from source
git clone git@github.com:minggnim/nlp-models.git
pip install -r requirements.txt
Quantized Llama2 models on consumer CPUs
Run chat applications on CPU
- Streamlit UI: cd apps && streamlit run chat.py
- Command line: llm_app chat -s 'hi there'
Run Q&A application on CPU
- Streamlit UI: cd apps && streamlit run qa.py
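The chat commands above feed user input to a quantized Llama2 chat model. Llama2's chat variants expect messages wrapped in a specific instruction template; the sketch below shows that template as a standalone helper (the function name is hypothetical, and the actual llm_app CLI may format prompts differently):

```python
def build_llama2_prompt(user_message: str,
                        system_prompt: str = "You are a helpful assistant.") -> str:
    """Wrap a user message in the Llama2 chat instruction template.

    Illustrative helper only; not part of the nlp-models package API.
    """
    return (
        f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_prompt("hi there")
print(prompt)
```

The `[INST] ... [/INST]` markers delimit the user turn, and the `<<SYS>>` block carries the system prompt that steers the model's behavior.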
Models
- bert_classifier: a wrapper package around BERT-based classification models
- multi_task_model: an implementation of a multi-task model built on encoder models
- GPT-2
- Falcon 7B
- Quantized Llama2 models
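The multi_task_model listed above attaches several task-specific heads to a shared encoder. A minimal numpy sketch of that idea follows; all names, shapes, and the toy "encoder" are illustrative and do not reflect the package's actual API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared "encoder": one projection standing in for a transformer encoder.
d_model, hidden = 16, 32
W_enc = rng.normal(size=(d_model, hidden))

# Two task-specific heads that share the encoder output.
W_cls = rng.normal(size=(hidden, 3))   # e.g. 3-way classification head
W_reg = rng.normal(size=(hidden, 1))   # e.g. scalar regression head

def encode(x):
    # Shared representation used by every task.
    return np.tanh(x @ W_enc)

def forward(x):
    h = encode(x)
    return h @ W_cls, h @ W_reg

x = rng.normal(size=(4, d_model))      # batch of 4 pooled "sentence embeddings"
logits, score = forward(x)
print(logits.shape, score.shape)       # (4, 3) (4, 1)
```

Sharing the encoder lets both tasks train on one representation, which is the main efficiency argument for multi-task setups over separate per-task models.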