
Project description


  • License: apache-2.0
  • Pipeline tag: text-generation
  • Language: en
  • Tags: pretrained
  • Inference parameters: temperature: 0.7

Model Card for Mistral-7B-v0.1

The Mistral-7B-v0.1 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters. Mistral-7B-v0.1 outperforms Llama 2 13B on all benchmarks we tested.

For full details of this model, please read our paper and release blog post.

Model Architecture

Mistral-7B-v0.1 is a transformer model, with the following architecture choices:

  • Grouped-Query Attention
  • Sliding-Window Attention
  • Byte-fallback BPE tokenizer
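Sliding-Window Attention limits each position to attending over the most recent window of tokens rather than the full sequence. A minimal illustrative sketch of the resulting attention mask in plain Python (the function name and the tiny window size are ours, chosen for clarity; they are not the actual implementation):

```python
# Illustrative sliding-window causal mask: position i may attend to
# position j only when j <= i and i - j < window. A small window is
# used here purely for readability.
def sliding_window_mask(seq_len: int, window: int) -> list[list[int]]:
    return [
        [1 if j <= i and i - j < window else 0 for j in range(seq_len)]
        for i in range(seq_len)
    ]

for row in sliding_window_mask(5, 3):
    print(row)
```

Compared with full causal attention, each row has at most `window` ones, so attention cost per token stays bounded as the sequence grows.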

Troubleshooting

If you see either of the following errors:

  • KeyError: 'mistral'
  • NotImplementedError: Cannot copy out of meta tensor; no data!

Ensure you are using a stable version of Transformers (4.34.0 or newer).
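These errors typically come from Transformers releases that predate support for the mistral architecture. A quick way to sanity-check a version string against the 4.34.0 minimum, sketched with a plain-Python comparison (the helper name is ours; for anything beyond simple "X.Y.Z" strings, prefer packaging.version):

```python
# Compare a Transformers version string against the 4.34.0 minimum.
# Assumes simple dotted "X.Y.Z" version strings (illustrative helper only).
def supports_mistral(version: str, minimum: str = "4.34.0") -> bool:
    def parse(v: str) -> tuple:
        return tuple(int(p) for p in v.split(".")[:3])
    return parse(version) >= parse(minimum)

# Typical use:
#   import transformers
#   assert supports_mistral(transformers.__version__)
```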

Notice

Mistral 7B is a pretrained base model and therefore does not have any moderation mechanisms.

The Mistral AI Team

Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Florian Bressand, Gianna Lengyel, Guillaume Lample, Lélio Renard Lavaud, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jota_os-0.1.0.tar.gz (5.0 kB)

Uploaded Source

Built Distribution

jota_os-0.1.0-py3-none-any.whl (6.4 kB)

Uploaded Python 3

File details

Details for the file jota_os-0.1.0.tar.gz.

File metadata

  • Download URL: jota_os-0.1.0.tar.gz
  • Upload date:
  • Size: 5.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.10.12

File hashes

Hashes for jota_os-0.1.0.tar.gz
  • SHA256: f8593a321ba6bb400c77e86bf8ec4a833b082c3a18d7f06180c2c955906e8e31
  • MD5: 32cd68d18a20d841d77dfb8dfcc6f72e
  • BLAKE2b-256: 5fef457a07654b06c683b4f933b3c2b9cc665e778d8bfae803efb63712c72337

See more details on using hashes here.
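The published digests can be verified locally after downloading a file. A minimal sketch using only the standard library's hashlib (the file path in the usage comment is an assumption about where you saved the sdist):

```python
# Compute the hex SHA256 digest of raw bytes with the stdlib hashlib.
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Typical use after downloading, comparing against the digest listed above:
#   with open("jota_os-0.1.0.tar.gz", "rb") as f:
#       assert sha256_hex(f.read()) == (
#           "f8593a321ba6bb400c77e86bf8ec4a833b082c3a18d7f06180c2c955906e8e31"
#       )
```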

File details

Details for the file jota_os-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: jota_os-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 6.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.10.12

File hashes

Hashes for jota_os-0.1.0-py3-none-any.whl
  • SHA256: 5568d422017d705b3d883eaecee999872479f6fbb893e9eb9c9fc4732e900f2b
  • MD5: 585b2e9fce315e4aa5b971d3934ba5c2
  • BLAKE2b-256: cc84c5532a9f3654e5fff6103c6863a6f96cc7759abe4f0f5931f8ccc09dc8a5

See more details on using hashes here.
