An LLM assistant magic command for Google Colab notebooks.

colab-ask 🪄

(Demo animation of an example %%ask session)

Colab-Ask is a magic command that sends your code, outputs, and even generated images (matplotlib, etc.) directly to models like Claude or GPT-4 for context-aware help.

Why use this?

  • Context Aware: It sees your text cells, code cells, outputs, and error traces.
  • Vision Capable: It can see embedded images and graphs.
  • Native Integration: Lives inside Colab. No alt-tabbing to ChatGPT.
  • Privacy First: Your data goes straight to the API. No middleman servers.

Prerequisites (Before you start)

  1. Google Colab: This extension is designed specifically for the Colab environment.
  2. API Keys: You need an API key for your preferred provider (OpenAI, Anthropic, Gemini, etc.).
    • Click the Key Icon (Secrets) on the left sidebar in Colab.
    • Add a new secret (e.g., ANTHROPIC_API_KEY or OPENAI_API_KEY).
    • Toggle "Notebook access" ON.
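colab-ask reads these secrets itself, but if you want to access the same key from your own code, here is a minimal sketch. The `get_api_key` helper is hypothetical (not part of colab-ask); it uses Colab's `google.colab.userdata` API when available and falls back to environment variables elsewhere:

```python
import os

def get_api_key(name):
    """Return the secret `name` from Colab's Secrets panel, or fall back
    to an environment variable when not running inside Colab."""
    try:
        # Only importable inside a Colab runtime.
        from google.colab import userdata
        return userdata.get(name)
    except ImportError:
        return os.environ.get(name)

# e.g. key = get_api_key("ANTHROPIC_API_KEY")
```

Note that inside Colab, `userdata.get` raises an error for missing or non-shared secrets, so wrap calls accordingly if a key might be absent.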

Quick Start

Run this in a cell to install and load the extension:

!pip install colab-ask
%load_ext colab_ask

1. The Basic Ask

Use %%ask to chat about your current notebook state.

%%ask
My training loop is stalling at epoch 5. Based on the logs above, why?

2. The Vision Ask

Since colab-ask sees images, you can ask about plots:

# (After generating a matplotlib chart)
%%ask
Look at the plot above. Is the model overfitting?
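For instance, a cell like the following produces the kind of chart you might then ask about. This is an illustrative sketch with synthetic data (the loss curves are made up to mimic overfitting, where validation loss turns upward while training loss keeps falling):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; Colab configures rendering itself
import matplotlib.pyplot as plt

epochs = np.arange(1, 21)
train_loss = 1.0 / epochs                              # keeps improving
val_loss = 1.0 / epochs + 0.02 * np.clip(epochs - 8, 0, None)  # turns upward

fig, ax = plt.subplots()
ax.plot(epochs, train_loss, label="train")
ax.plot(epochs, val_loss, label="validation")
ax.set_xlabel("epoch")
ax.set_ylabel("loss")
ax.legend()
plt.show()
```

Run `%%ask` in the next cell and the rendered figure is included in the request, so the model can comment on the diverging curves directly.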

Configuration

Change Model: colab-ask uses LiteLLM under the hood, so any LiteLLM-supported model string works. See https://docs.litellm.ai/docs/providers for the full list.

%set_model claude-3-5-sonnet-20241022
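A few illustrative model strings in LiteLLM's format (these names are examples and may go stale as providers update their lineups; check the LiteLLM providers page for current identifiers):

```
%set_model gpt-4o
%set_model claude-3-5-sonnet-20241022
%set_model gemini/gemini-1.5-pro
```

Make sure the matching API key secret is set in Colab before switching providers.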

Set System Prompt: Want a specific teaching style?

%%set_sys
You are a senior Python engineer. Be concise. 
Focus on performance optimization and vectorized operations.

Default System Prompt

By default, colab-ask uses this system prompt:

You are an AI assistant inside a Google Colab notebook.
In your response, craft guidance on the next step, whether code related, or more strategy related.
Don't spew out all the steps at once, the user wants to go slow, they will ask for more if they need it.
The user is interested in improving their coding, and may choose to make code blocks in response to your input.

This prompt is inspired by SolveIt (the fast.ai dialog engineering environment), which emphasizes:

  • Small steps: Breaking down problems incrementally
  • Interactive learning: Waiting for user feedback before proceeding
  • Code-focused: Encouraging hands-on implementation

Privacy & Data

  • Direct Communication: Notebook data is sent directly from your browser/runtime to the LLM provider (OpenAI/Anthropic/Google).
  • Zero Logs: We do not run a middleware server. We do not store your code.

