
Google Gemini API for the terminal (CLI)

Project description

Gemini CLI


Gemini is a command-line client for the Google Generative AI API. It streams responses from Google's generative models, giving you a straightforward, efficient way to generate rich, contextually relevant content. With Gemini CLI, you can prompt the AI to craft stories, generate ideas, or compose detailed texts, all from the comfort of your terminal.

Getting Started

Installation

To install Gemini, run:

pip install gemini-cli

Usage

To use Gemini, simply run the gemini-cli command followed by the prompt you want to send to the model. For example:

gemini-cli "Write a story about a robot who falls in love with a human."

This will send the prompt "Write a story about a robot who falls in love with a human." to the model, and the model's response will be printed to the terminal.

You can also provide a token for authentication using the --token flag. This is required if you want to use a model that is not available for public use. To get a token, follow the instructions on the Google Generative AI website.

Prompt Input (param or stdin):
    Use Case: You're a writer looking for creative inspiration to start a new story. You can use the --prompt option to feed the AI a starting point, like --prompt "In a world where dragons are pets,", and let the AI generate an intriguing storyline continuation.
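Both input styles can be sketched as follows. This is illustrative only: the `--prompt` flag is taken from the option above, stdin support is assumed, and the call is guarded so it runs only where the CLI is installed and configured with a token.

```shell
prompt="In a world where dragons are pets,"

# Run only if gemini-cli is on PATH (it also needs a valid API token).
if command -v gemini-cli >/dev/null 2>&1; then
    gemini-cli --prompt "$prompt"          # prompt passed as a parameter
    printf '%s\n' "$prompt" | gemini-cli   # prompt piped via stdin
fi
```

The stdin form is handy for feeding the model the output of other commands in a pipeline.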

API Token (--token):
    Use Case: As a developer working in different environments (development, staging, production), you might need to use different API tokens. The --token option allows you to specify the token directly in the command line for quick switches without changing the configuration file, like --token "your_api_token_here".
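One way to switch tokens per environment is to keep them in environment variables and select one before invoking the CLI. The variable names (`DEPLOY_ENV`, `PROD_TOKEN`, and so on) are hypothetical, and the call is guarded so it runs only where the CLI is installed:

```shell
# Pick the token for the current environment (names are illustrative).
DEPLOY_ENV="${DEPLOY_ENV:-development}"
case "$DEPLOY_ENV" in
    production) GEMINI_TOKEN="${PROD_TOKEN:-}" ;;
    staging)    GEMINI_TOKEN="${STAGING_TOKEN:-}" ;;
    *)          GEMINI_TOKEN="${DEV_TOKEN:-}" ;;
esac

if command -v gemini-cli >/dev/null 2>&1; then
    gemini-cli --token "$GEMINI_TOKEN" "Summarize today's standup notes"
fi
```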

Configuration File (-f/--config-file):
    Use Case: You're managing multiple projects with different configuration needs. The -f/--config-file option allows you to specify a custom path to a configuration file, enabling you to maintain separate configurations for each project. For instance, you can use --config-file "path/to/project_a_config.toml" for one project and switch to another with --config-file "path/to/project_b_config.toml".
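A per-project wrapper can derive the config path from the project name. The paths and prompt below are illustrative, and the invocation is guarded so it runs only where the CLI is installed:

```shell
# Derive the config path from the project name (paths are illustrative).
project="project_a"
config="path/to/${project}_config.toml"

if command -v gemini-cli >/dev/null 2>&1; then
    gemini-cli --config-file "$config" "Draft release notes for this project"
fi
```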

generation_config in TOML File:

    top_p, top_k, candidate_count, max_output_tokens, stop_sequences:
        Use Case: You're fine-tuning the AI's content generation for a chatbot application. You need the responses to be concise and contextually appropriate without veering off-topic. By adjusting top_p, top_k, candidate_count, max_output_tokens, and stop_sequences in the gemini.toml file, you can control the randomness, length, and termination of the AI-generated responses to fit the chatbot's conversational flow.
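A minimal `gemini.toml` sketch using the keys listed above. The key names come from this document, but the exact table layout and the values shown are assumptions to illustrate a concise chatbot configuration:

```toml
[generation_config]
top_p = 0.9                # nucleus sampling: sample from the top 90% probability mass
top_k = 40                 # consider only the 40 most likely tokens at each step
candidate_count = 1        # generate a single response
max_output_tokens = 256    # keep chatbot replies short
stop_sequences = ["\n\n"]  # stop at the first blank line
```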

Examples

Here are a few examples of how you can use Gemini:

  • Write a story about a robot who falls in love with a human.
  • Generate a poem about the beauty of nature.
  • Translate a sentence from English to Spanish.
  • Summarize a news article.
  • Write a song about a lost love.


Features

  • Stream responses: Gemini streams responses from the model, so you can see the model's output as it is being generated.
  • Authentication: Gemini supports authentication with a token, so you can use models that are not available for public use.
  • Command-line interface: Gemini provides a simple command-line interface that is easy to use.

Contributing

Gemini is open source and contributions are welcome. To contribute, please read the contributing guidelines.

License

Gemini is licensed under the MIT License.

Project details


Download files

Download the file for your platform.

Source Distribution

gemini_cli-0.2.5.tar.gz (5.1 kB)

Uploaded Source

Built Distribution

gemini_cli-0.2.5-py3-none-any.whl (6.5 kB)

Uploaded Python 3

File details

Details for the file gemini_cli-0.2.5.tar.gz.

File metadata

  • Download URL: gemini_cli-0.2.5.tar.gz
  • Upload date:
  • Size: 5.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for gemini_cli-0.2.5.tar.gz:

  • SHA256: 353085aad5286bcba5dc4855aa5c95e34cc22b7ea9562362dfc32f8c864dd02d
  • MD5: 819d10c8af87da3057b24d04b0da65e1
  • BLAKE2b-256: 099eda607b592f8d3d56caeb3e74b76e6a2e6b14096ca1e9dedecddf3628bae8


File details

Details for the file gemini_cli-0.2.5-py3-none-any.whl.

File metadata

  • Download URL: gemini_cli-0.2.5-py3-none-any.whl
  • Upload date:
  • Size: 6.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for gemini_cli-0.2.5-py3-none-any.whl:

  • SHA256: 5fb84af47b09266b1c794425386cbcc35d11f8d3c3af1ee2794b9e30174673e0
  • MD5: b617c67ac9eac008f4fbf35a40dde6f1
  • BLAKE2b-256: 8f65953aa5b1a935944360e5738cc103874aabb21fa52b1fdd34478cbe0af763

