
Integration utility for the Mistral AI API to provide GPT-based functionality.

Project description


MistralGPTIntegration

MistralGPTIntegration is a Python package that provides GPT-style text generation through the Mistral AI API. It lets users quickly obtain comprehensive, context-aware responses from the model.

Installation

To install MistralGPTIntegration, you can use pip:

pip install MistralGPTIntegration

Usage

After installation, MistralGPTIntegration can be used in your Python scripts.

Example:

from mistralgptintegration import MistralGPTIntegration

api_key = "<your_api_key>"
mistral = MistralGPTIntegration(api_key)
prompt = "Once upon a time"
response = mistral.query_gpt(prompt)
print(response)

MistralGPTIntegration accepts the following parameters:

  • api_key: Your Mistral API key.
  • model_name: The name of the Mistral model to use. Defaults to mistral-tiny.
  • temperature: The temperature to use for the model. Defaults to 0.1.
  • top_p: The top_p to use for the model. Defaults to 1.0.
  • max_tokens: The maximum number of tokens to generate. Defaults to 150.

Customizing Your Queries

You can customize the behavior of MistralGPTIntegration by adjusting parameters such as temperature, top_p, and max_tokens to fit the specific needs of your queries or to tweak how the Mistral model responds.
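
For example, assuming the parameters listed above are passed as keyword arguments to the constructor (a reasonable reading of the documented defaults, but an assumption about the package's API rather than a confirmed signature), a customized client might look like this:

from mistralgptintegration import MistralGPTIntegration

# Hypothetical sketch: parameter names mirror the list above; passing them
# to the constructor is an assumption about the package's API.
mistral = MistralGPTIntegration(
    api_key="<your_api_key>",
    model_name="mistral-small",  # switch from the default "mistral-tiny"
    temperature=0.7,             # higher temperature -> more varied output
    top_p=0.9,
    max_tokens=300,              # allow longer completions than the default 150
)

response = mistral.query_gpt("Write a short poem about the sea.")
print(response)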

Output Example

When you query the model, it processes your prompt and returns a response. Here is an example of the output:

{
  "id": "63213d34c61f4d96b893d7b1afc2b893",
  "object": "chat.completion",
  "created": 1706372087,
  "model": "mistral-small",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "some text"
      },
      "finish_reason": "stop",
      "index": 0
    }
  ],
  "usage": {
    "prompt_tokens": 318,
    "total_tokens": 622,
    "completion_tokens": 304
  }
}
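
If query_gpt returns this payload as a parsed Python dictionary (an assumption; it may also return a response object or only the message text), the assistant's reply sits in the first choice:

reply = response["choices"][0]["message"]["content"]
print(reply)                              # "some text"
print(response["usage"]["total_tokens"])  # 622 in the example above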

Contributing

Contributions, issues, and feature requests are welcome! Feel free to check the issues page.

License

MIT

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

MistralGPTIntegration-0.0.3.tar.gz (4.2 kB)

Uploaded Source

Built Distribution

MistralGPTIntegration-0.0.3-py3-none-any.whl (5.1 kB)

Uploaded Python 3

File details

Details for the file MistralGPTIntegration-0.0.3.tar.gz.

File metadata

  • Download URL: MistralGPTIntegration-0.0.3.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.9

File hashes

Hashes for MistralGPTIntegration-0.0.3.tar.gz
  • SHA256: dd1b31412422672c92214412e843aeeafab6a8bbcbca139f3890bba912183515
  • MD5: 9cc975e60e1961cc9137cf609bc65ba7
  • BLAKE2b-256: ea9c3ce95bb59c78752f7ae230ffa622d485b22a5124b2bf1d8633a717041a2f
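
As an illustration (not part of the package), the SHA256 digest above can be verified locally after downloading the sdist:

import hashlib

# Expected digest copied from the hash list above.
expected = "dd1b31412422672c92214412e843aeeafab6a8bbcbca139f3890bba912183515"

with open("MistralGPTIntegration-0.0.3.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, "downloaded file does not match the published SHA256 digest"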


File details

Details for the file MistralGPTIntegration-0.0.3-py3-none-any.whl.

File metadata

File hashes

Hashes for MistralGPTIntegration-0.0.3-py3-none-any.whl
  • SHA256: 8bf6a077d3bda8d185d33a386cb000d6f38f7e859bf2cd75dbf3fbeccb6e91ad
  • MD5: e5e6a7414d3bebced5b0123c0997770e
  • BLAKE2b-256: a8ff389736d14e199378608bee9d20a3fd9d949ec7c3dc20298e8651f97d5f04

