llm-gemini
LLM plugin providing API access to Google's Gemini family of models
Installation
Install this plugin in the same environment as LLM.
llm install llm-gemini
Usage
Configure the model by setting a key called "gemini" to your API key:
llm keys set gemini
<paste key here>
Now run the model using -m gemini-pro, for example:
llm -m gemini-pro "A joke about a pelican and a walrus"
Why did the pelican get mad at the walrus?
Because he called him a hippo-crit.
To chat interactively with the model, run llm chat:
llm chat -m gemini-pro
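The model can also be used from LLM's Python API. A minimal sketch, assuming the plugin is installed and the "gemini" key has been configured as shown above:

```python
import llm

# Look up the Gemini model registered by this plugin
model = llm.get_model("gemini-pro")

# Send a prompt and print the response text
response = model.prompt("A joke about a pelican and a walrus")
print(response.text())
```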
Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
cd llm-gemini
python3 -m venv venv
source venv/bin/activate
Now install the dependencies and test dependencies:
llm install -e '.[test]'
To run the tests:
pytest