
LLM plugin to access Google's Gemini family of models

Project description

llm-gemini


API access to Google's Gemini models

Installation

Install this plugin in the same environment as LLM.

llm install llm-gemini

Usage

Configure the plugin by setting a key called "gemini" to your API key:

llm keys set gemini
<paste key here>

Now run the model using -m gemini-1.5-pro-latest, for example:

llm -m gemini-1.5-pro-latest "A joke about a pelican and a walrus"

A pelican walks into a seafood restaurant with a huge fish hanging out of its beak. The walrus, sitting at the bar, eyes it enviously.

"Hey," the walrus says, "That looks delicious! What kind of fish is that?"

The pelican taps its beak thoughtfully. "I believe," it says, "it's a billfish."
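
You can do the same thing from Python using LLM's programmatic interface. This is a rough sketch rather than plugin-specific documentation; it assumes the plugin is installed and the "gemini" key has been set as above:

import llm

# The plugin registers the Gemini models with LLM, so they can be looked up by ID.
model = llm.get_model("gemini-1.5-pro-latest")
# model.key = "..."  # only needed if the stored "gemini" key is not picked up automatically
response = model.prompt("A joke about a pelican and a walrus")
print(response.text())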

To chat interactively with the model, run llm chat:

llm chat -m gemini-1.5-pro-latest
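
In Python, a multi-turn exchange maps onto a conversation object. Again, a hedged sketch using LLM's general Python API (not specific to this plugin):

import llm

model = llm.get_model("gemini-1.5-pro-latest")
# A conversation carries previous prompts and responses, so follow-ups have context.
conversation = model.conversation()
print(conversation.prompt("Tell me a joke about a pelican").text())
print(conversation.prompt("Now tell it from the walrus's point of view").text())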

Other models are:

  • gemini-1.5-flash-latest
  • gemini-1.5-flash-8b-latest - the least expensive

Gemini models are multi-modal. You can provide images, audio or video files as input like this:

llm -m gemini-1.5-flash-latest 'extract text' -a image.jpg

Or with a URL:

llm -m gemini-1.5-flash-8b-latest 'describe image' \
  -a https://static.simonwillison.net/static/2024/pelicans.jpg
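
Attachments can also be passed from Python. The sketch below assumes a recent LLM release with attachment support, where llm.Attachment accepts either a local path or a URL:

import llm

model = llm.get_model("gemini-1.5-flash-latest")

# Attach a local image file to the prompt.
response = model.prompt("extract text", attachments=[llm.Attachment(path="image.jpg")])
print(response.text())

# Attach a remote image by URL instead.
response = model.prompt(
    "describe image",
    attachments=[llm.Attachment(url="https://static.simonwillison.net/static/2024/pelicans.jpg")],
)
print(response.text())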

Embeddings

The plugin also adds support for the text-embedding-004 embedding model.

Run that against a single string like this:

llm embed -m text-embedding-004 -c 'hello world'

This returns a JSON array of 768 numbers.
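
The Python equivalent looks something like this (a sketch using LLM's embeddings API, assuming the plugin and key are configured as above):

import llm

embedding_model = llm.get_embedding_model("text-embedding-004")
vector = embedding_model.embed("hello world")
# vector is a plain Python list of 768 floats
print(len(vector))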

This command will embed every README.md file in child directories of the current directory and store the results in a SQLite database called embed.db in a collection called readmes:

llm embed-multi readmes --files . '*/README.md' -d embed.db -m text-embedding-004

You can then run similarity searches against that collection like this:

llm similar readmes -c 'upload csvs to stuff' -d embed.db
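
Collections can be built and queried from Python as well. This is an approximate sketch using LLM's Collection class together with sqlite-utils; the names follow the CLI example above, but treat the details as an assumption rather than plugin documentation:

import llm
import sqlite_utils

db = sqlite_utils.Database("embed.db")
collection = llm.Collection("readmes", db, model_id="text-embedding-004")

# Store one embedding under an ID, keeping the original text alongside it.
collection.embed("llm-gemini", "API access to Google's Gemini models", store=True)

# Return the stored entries closest to a query string.
for entry in collection.similar("upload csvs to stuff", number=3):
    print(entry.id, entry.score)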

See the LLM embeddings documentation for further details.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-gemini
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

pytest


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_gemini-0.3a0.tar.gz (8.4 kB view details)

Uploaded Source

Built Distribution

llm_gemini-0.3a0-py3-none-any.whl (8.8 kB view details)

Uploaded Python 3

File details

Details for the file llm_gemini-0.3a0.tar.gz.

File metadata

  • Download URL: llm_gemini-0.3a0.tar.gz
  • Upload date:
  • Size: 8.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.13.0

File hashes

Hashes for llm_gemini-0.3a0.tar.gz

  • SHA256: bfd129c122c2aed3ab8c7eaab45b12111a06fdd28bd7a8835e1bb944a9cb01c3
  • MD5: c5d77ffdf528fdd3ccc877ccd7e55168
  • BLAKE2b-256: c5f68cc851d3431926ee4225c04f3c2f0662fb37096cbf0c688c901de9ab4b13

See more details on using hashes here.

File details

Details for the file llm_gemini-0.3a0-py3-none-any.whl.

File metadata

  • Download URL: llm_gemini-0.3a0-py3-none-any.whl
  • Upload date:
  • Size: 8.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.13.0

File hashes

Hashes for llm_gemini-0.3a0-py3-none-any.whl

  • SHA256: b6b7669e4305219499547f35812715c42be9c2de297c548984c1bf9ba76d65fa
  • MD5: 2427f0ee9366c60ae00fa7cdae0bc52d
  • BLAKE2b-256: 827eacb13df1fe78e2cfbef66e0f5b7ce16612fb0d0f2bae6388df89fea09dc5

See more details on using hashes here.
