Mistral JAX
JAX implementation of the Mistral model.
Usage
Install from PyPI:
pip install mistral-jax
Import MistralLMParams and MistralModelParams.
from mistral import MistralLMParams, MistralModelParams
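The parameter classes above are containers for the model weights. Since the install steps below pull in PyTorch and Hugging Face transformers, a natural workflow is to start from the reference checkpoint and convert its weights into JAX arrays. The sketch below illustrates that conversion only; the checkpoint name and the plain-dict layout are assumptions for illustration, since the constructors of MistralLMParams and MistralModelParams are not documented on this page.

```python
import jax.numpy as jnp
from transformers import AutoModelForCausalLM

# Assumed checkpoint name; substitute the checkpoint you actually use.
model = AutoModelForCausalLM.from_pretrained('mistralai/Mistral-7B-v0.1')

# Convert every PyTorch tensor in the state dict to a JAX array.
# A plain dict is used here for illustration instead of MistralLMParams.
params = {
    name: jnp.asarray(tensor.detach().cpu().numpy())
    for name, tensor in model.state_dict().items()
}

print(params['model.embed_tokens.weight'].shape)  # (32000, 4096)
```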
Roadmap
- Model architecture
- Publish a Python library
- Model parallelism
- Generation
  - KV cache
  - Sampling
- Training
Install
This project requires Python 3.11 and JAX 0.4.20.
Create and activate a venv:
python3.11 -m venv venv
. venv/bin/activate
Install dependencies:
CPU:
pip install -U pip
pip install -U wheel
pip install "jax[cpu]"
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cpu
pip install git+https://github.com/huggingface/transformers
pip install -r requirements.txt
CUDA 11:
pip install -U pip
pip install -U wheel
pip install "jax[cuda11_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cu118
pip install git+https://github.com/huggingface/transformers
pip install -r requirements.txt
TPU VM:
pip install -U pip
pip install -U wheel
pip install "jax[tpu]" -f https://storage.googleapis.com/jax-releases/libtpu_releases.html
pip install --pre torch --index-url https://download.pytorch.org/whl/nightly/cpu
pip install git+https://github.com/huggingface/transformers
pip install -r requirements.txt
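Whichever variant you install, a quick sanity check (not part of the original instructions) is to confirm that JAX sees the expected backend and devices:
python -c "import jax; print(jax.default_backend(), jax.devices())"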
Model architecture
MistralForCausalLM(
  (model): MistralModel(
    (embed_tokens): Embedding(32000, 4096)
    (layers): ModuleList(
      (0-31): 32 x MistralDecoderLayer(
        (self_attn): MistralAttention(
          (q_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (k_proj): Linear(in_features=4096, out_features=1024, bias=False)
          (v_proj): Linear(in_features=4096, out_features=1024, bias=False)
          (o_proj): Linear(in_features=4096, out_features=4096, bias=False)
          (rotary_emb): MistralRotaryEmbedding()
        )
        (mlp): MistralMLP(
          (gate_proj): Linear(in_features=4096, out_features=14336, bias=False)
          (up_proj): Linear(in_features=4096, out_features=14336, bias=False)
          (down_proj): Linear(in_features=14336, out_features=4096, bias=False)
          (act_fn): SiLUActivation()
        )
        (input_layernorm): MistralRMSNorm()
        (post_attention_layernorm): MistralRMSNorm()
      )
    )
    (norm): MistralRMSNorm()
  )
  (lm_head): Linear(in_features=4096, out_features=32000, bias=False)
)
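The printout above (from the Hugging Face PyTorch implementation) shows grouped-query attention (queries are projected to 4096 = 32 heads x 128, while keys and values are projected to 1024 = 8 shared heads x 128) and a SwiGLU-style MLP (gate_proj/up_proj to 14336, SiLU, then down_proj back to 4096). The following is a minimal jax.numpy sketch of those shapes, not the library's actual code; the head counts assume the standard head dimension of 128, and rotary embeddings, the attention computation itself, and RMSNorm are omitted.

```python
import jax
import jax.numpy as jnp

d_model, d_ff = 4096, 14336
n_q_heads, n_kv_heads, d_head = 32, 8, 128  # 32*128 = 4096, 8*128 = 1024 (assumed head dim)

def mlp(x, w_gate, w_up, w_down):
    # MistralMLP: down_proj(silu(gate_proj(x)) * up_proj(x)); all projections are bias-free.
    return (jax.nn.silu(x @ w_gate) * (x @ w_up)) @ w_down

def attention_projections(x, w_q, w_k, w_v):
    # q_proj: 4096 -> 4096 (32 query heads); k_proj/v_proj: 4096 -> 1024 (8 shared KV heads).
    seq_len = x.shape[0]
    q = (x @ w_q).reshape(seq_len, n_q_heads, d_head)
    k = (x @ w_k).reshape(seq_len, n_kv_heads, d_head)
    v = (x @ w_v).reshape(seq_len, n_kv_heads, d_head)
    return q, k, v  # rotary embedding and the attention computation itself are omitted

# Shape check with random weights.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (5, d_model))
w_q = jax.random.normal(key, (d_model, n_q_heads * d_head))
w_k = jax.random.normal(key, (d_model, n_kv_heads * d_head))
w_v = jax.random.normal(key, (d_model, n_kv_heads * d_head))
q, k, v = attention_projections(x, w_q, w_k, w_v)
print(q.shape, k.shape, v.shape)  # (5, 32, 128) (5, 8, 128)

w_gate = jax.random.normal(key, (d_model, d_ff))
w_up = jax.random.normal(key, (d_model, d_ff))
w_down = jax.random.normal(key, (d_ff, d_model))
print(mlp(x, w_gate, w_up, w_down).shape)  # (5, 4096)
```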