An LLM assistant magic command for Google Colab notebooks.

Project description

colab-ask 🪄

[Animation: the %%ask magic in action]

Colab-Ask is a magic command that sends your code, outputs, and even generated images (matplotlib, etc.) directly to models like Claude or GPT-4 for context-aware help.

Why use this?

  • Context Aware: It sees your text and code cells, and error traces.
  • Vision Capable: It can see embedded images and graphs.
  • Native Integration: Lives inside Colab. No alt-tabbing to ChatGPT.
  • Privacy First: Your data goes straight to the API. No middleman servers.

Prerequisites (Before you start)

  1. Google Colab: This extension is designed specifically for the Colab environment.
  2. API Keys: You need an API key for your preferred provider (OpenAI, Anthropic, Gemini, etc.).
    • Click the Key Icon (Secrets) on the left sidebar in Colab.
    • Add a new secret (e.g., ANTHROPIC_API_KEY or OPENAI_API_KEY).
    • Toggle "Notebook access" ON.
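If you want to confirm a secret is actually reachable from the runtime before loading the extension, here is a minimal check. The `google.colab.userdata` API is Colab's own secrets interface; the helper name and the environment-variable fallback are our own additions for running the same code outside Colab:

```python
import os

def get_api_key(name="ANTHROPIC_API_KEY"):
    """Return the secret `name` from Colab Secrets, or from env vars outside Colab."""
    try:
        # google.colab is only importable inside a Colab runtime.
        from google.colab import userdata
        return userdata.get(name)
    except ImportError:
        return os.environ.get(name)

# Once "Notebook access" is toggled ON, this should return your key, not None.
```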

Quick Start

Run this in a cell to install and load the extension:

!pip install colab-ask
%load_ext colab_ask

1. The Basic Ask

Use %%ask to chat about your current notebook state.

%%ask
My training loop is stalling at epoch 5. Based on the logs above, why?

2. The Vision Ask

Since colab-ask sees images, you can ask about plots:

# (After generating a matplotlib chart)
%%ask
Look at the plot above. Is the model overfitting?

Configuration

Change Model: colab-ask uses LiteLLM under the hood, so any LiteLLM-supported model string works. See https://docs.litellm.ai/docs/providers for the full list.

%set_model claude-3-5-sonnet-20241022
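A few example model strings in LiteLLM's format, with a provider prefix where LiteLLM requires one (the specific models here are illustrative, not an endorsement):

```
%set_model gpt-4o
%set_model claude-3-5-sonnet-20241022
%set_model gemini/gemini-1.5-flash
%set_model ollama/llama3
```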

Set System Prompt: Want a specific teaching style?

%%set_sys
You are a senior Python engineer. Be concise. 
Focus on performance optimization and vectorized operations.

Default System Prompt

By default, colab-ask uses this system prompt:

You are an AI assistant inside a Google Colab notebook.
In your response, craft guidance on the next step, whether code related, or more strategy related.
Don't spew out all the steps at once, the user wants to go slow, they will ask for more if they need it.
The user is interested in improving their coding, and may choose to make code blocks in response to your input.

This prompt is inspired by SolveIt (the fast.ai dialog engineering environment), which emphasizes:

  • Small steps: Breaking down problems incrementally
  • Interactive learning: Waiting for user feedback before proceeding
  • Code-focused: Encouraging hands-on implementation

Privacy & Data

  • Direct Communication: Notebook data is sent directly from your browser/runtime to the LLM provider (OpenAI/Anthropic/Google).
  • Zero Logs: We do not run a middleware server. We do not store your code.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

colab_ask-0.1.6.tar.gz (7.3 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

colab_ask-0.1.6-py3-none-any.whl (7.5 kB)

Uploaded Python 3

File details

Details for the file colab_ask-0.1.6.tar.gz.

File metadata

  • Download URL: colab_ask-0.1.6.tar.gz
  • Upload date:
  • Size: 7.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for colab_ask-0.1.6.tar.gz

  • SHA256: 23a93186d31b69bfea68d8ed35e426b7187dfc899a4a94fe3641cc21e0c3e1ab
  • MD5: 09d91a4245af03dabd13dc4185712e9c
  • BLAKE2b-256: 6bd35b90e45e0bfd0430e3c64966be621252b0f3504586a44d484031e25b792c

See more details on using hashes here.
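To verify a downloaded archive against the SHA256 digest above, a small stdlib sketch (the filename argument is whatever file you downloaded):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 digest published above, e.g.:
# sha256_of("colab_ask-0.1.6.tar.gz") == "23a93186d31b69bfea68d8ed35e426b7..."
```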

File details

Details for the file colab_ask-0.1.6-py3-none-any.whl.

File metadata

  • Download URL: colab_ask-0.1.6-py3-none-any.whl
  • Upload date:
  • Size: 7.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.9

File hashes

Hashes for colab_ask-0.1.6-py3-none-any.whl

  • SHA256: 60b727c38b701ce46fa81578859575b2e49c65861d3960733ca5b6babe6745b9
  • MD5: f0015e644d4164e54e8347cd46209baf
  • BLAKE2b-256: 02919a8675f7942a6bf052a4e66937406e42626504e3034e3e0a5a87eadfd31e

See more details on using hashes here.
