wiki3-ai
Python wrapper for Chrome's built-in AI Prompt API, enabling direct access to browser-provided language models from Jupyter notebooks.
Features
- 🤖 Direct access to Chrome's built-in AI via the Prompt API
- 📓 Seamless Jupyter integration using AnyWidget
- 🔄 Bidirectional communication via traitlets
- 🎯 Type-safe Python API matching the Web IDL specification
- 🌊 Support for streaming responses
- 🛠️ Tool use capabilities
- 🖼️ Multimodal input support (text, images, audio)
- 📊 Session management and cloning
- 🔒 Privacy-focused on-device processing
Installation
```bash
pip install wiki3-ai
```
Requirements
- Python 3.10+
- Chrome browser with Prompt API enabled
- Running in a Jupyter environment (JupyterLab, Jupyter Notebook, etc.)
Quick Start
```python
from wiki3_ai import LanguageModel

# Create a language model session
session = await LanguageModel.create()

# Simple prompt
result = await session.prompt("Write me a poem about Python.")
print(result)

# Streaming response
async for chunk in session.prompt_streaming("Write me a long story."):
    print(chunk, end="", flush=True)
```
Usage Examples
System Prompts
```python
from wiki3_ai import LanguageModel, LanguageModelMessage, LanguageModelMessageRole

session = await LanguageModel.create({
    "initialPrompts": [
        {
            "role": "system",
            "content": "You are a helpful Python programming assistant."
        }
    ]
})

response = await session.prompt("How do I read a file in Python?")
print(response)
```
Checking Availability
```python
from wiki3_ai import LanguageModel, Availability

# Check if the API is available
availability = await LanguageModel.availability()
print(f"Model availability: {availability}")

if availability == Availability.AVAILABLE:
    session = await LanguageModel.create()
    # Use the session...
```
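The availability states are not enumerated above; the sketch below assumes the four states defined by the Prompt API specification (`unavailable`, `downloadable`, `downloading`, `available`) and mirrors `wiki3_ai.Availability` with a local stub enum, plus a hypothetical helper showing when a session can be created (`create()` can trigger a download when the model is merely downloadable):

```python
from enum import Enum


class Availability(str, Enum):
    """Local stub mirroring wiki3_ai.Availability; values assumed from the Prompt API spec."""
    UNAVAILABLE = "unavailable"
    DOWNLOADABLE = "downloadable"
    DOWNLOADING = "downloading"
    AVAILABLE = "available"


def can_create_session(state: Availability) -> bool:
    """True when a session can be created, possibly after an on-demand model download."""
    return state in (
        Availability.AVAILABLE,
        Availability.DOWNLOADABLE,
        Availability.DOWNLOADING,
    )
```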
Configuring Temperature and Top-K
```python
# Get default parameters
params = await LanguageModel.params()
print(f"Default temperature: {params.default_temperature}")
print(f"Max temperature: {params.max_temperature}")

# Create session with custom parameters
session = await LanguageModel.create({
    "temperature": 0.8,
    "topK": 40
})
```
Session Management
```python
# Create a session
session = await LanguageModel.create()

# Use the session
result1 = await session.prompt("Tell me about AI.")

# Clone for different conversation branches
session2 = await session.clone()
result2 = await session2.prompt("Now tell me about machine learning.")

# Destroy when done
await session.destroy()
```
Measuring Token Usage
```python
session = await LanguageModel.create()

# Check current usage
print(f"Current usage: {session.input_usage}/{session.input_quota}")

# Measure potential usage before prompting
usage = await session.measure_input_usage("This is my prompt")
print(f"This prompt would use: {usage} tokens")

# Prompt if there's enough quota
if session.input_usage + usage < session.input_quota:
    result = await session.prompt("This is my prompt")
```
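The quota check above can be factored into a small pure function (a hypothetical helper, not part of wiki3-ai), which keeps the arithmetic in one tested place:

```python
def fits_in_quota(input_usage: int, input_quota: int, needed: int) -> bool:
    """Return True if a prompt needing `needed` tokens fits in the remaining quota."""
    return input_usage + needed <= input_quota
```

It would be called as `fits_in_quota(session.input_usage, session.input_quota, usage)` before prompting.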
Structured Output with JSON Schema
```python
import json

schema = {
    "type": "object",
    "required": ["rating"],
    "properties": {
        "rating": {
            "type": "number",
            "minimum": 0,
            "maximum": 5
        }
    }
}

result = await session.prompt(
    "Rate this: The food was excellent!",
    {"responseConstraint": schema}
)

data = json.loads(result)  # json.parse is JavaScript; Python uses json.loads
print(f"Rating: {data['rating']}")
```
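The response arrives as a JSON string, so it can be worth re-checking the schema's bounds after parsing. A minimal stdlib-only helper for this particular schema (the function name and checks are illustrative, not part of wiki3-ai):

```python
import json


def parse_rating(raw: str, minimum: float = 0, maximum: float = 5) -> float:
    """Parse the model's JSON response and verify the rating lies within the schema's bounds."""
    data = json.loads(raw)
    rating = data["rating"]
    if not minimum <= rating <= maximum:
        raise ValueError(f"rating {rating} outside [{minimum}, {maximum}]")
    return rating
```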
API Reference
LanguageModel
Main class for interacting with the language model.
Class Methods
- `create(options=None)` - Create a new language model session
- `availability(options=None)` - Check model availability
- `params()` - Get model parameters
Instance Methods
- `prompt(input, options=None)` - Send a prompt and get the response
- `prompt_streaming(input, options=None)` - Send a prompt and stream the response
- `append(input, options=None)` - Append messages without getting a response
- `measure_input_usage(input, options=None)` - Measure token usage
- `clone(options=None)` - Clone the session
- `destroy()` - Destroy the session
Properties
- `input_usage` - Current token usage
- `input_quota` - Maximum token quota
- `top_k` - Top-K sampling parameter
- `temperature` - Temperature sampling parameter
Data Models
- `LanguageModelMessage` - A message in the conversation
- `LanguageModelMessageContent` - Content with type and value
- `LanguageModelMessageRole` - Message role (system/user/assistant)
- `LanguageModelMessageType` - Content type (text/image/audio)
- `LanguageModelCreateOptions` - Options for creating sessions
- `LanguageModelPromptOptions` - Options for prompting
- `LanguageModelParams` - Model parameters
- `Availability` - Availability status enum
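Following the Prompt API's Web IDL, a message pairs a role with a list of typed content parts, so a multimodal user message can be written as plain dicts (the field names follow the spec; the empty bytes stand in for real image data):

```python
# A user message mixing text and image content, expressed as plain dicts.
message = {
    "role": "user",
    "content": [
        {"type": "text", "value": "Describe this image."},
        {"type": "image", "value": b""},  # raw image bytes go here
    ],
}
```

The same fields map onto `LanguageModelMessage` and `LanguageModelMessageContent`.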
Architecture
This package uses:
- AnyWidget for Jupyter integration
- Traitlets for bidirectional Python ↔ JavaScript communication
- Chrome Prompt API for accessing built-in language models
The communication flow:
1. Python code calls methods on `LanguageModel`
2. Requests are serialized via traitlets to JavaScript
3. JavaScript calls Chrome's native Prompt API
4. Results are sent back via traitlets to Python
5. Python code receives the async results
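As a simplified illustration of that round trip, each call can be thought of as a JSON-serializable request envelope matched to a response by id (the envelope shape below is hypothetical, not wiki3-ai's actual wire format):

```python
import json

# A method call becomes a serializable request envelope...
request = {"id": 1, "method": "prompt", "args": ["Write me a poem about Python."]}
wire = json.dumps(request)  # traitlets syncs JSON-compatible state to the front end

# ...and JavaScript answers with a response envelope carrying the same id.
response = json.loads('{"id": 1, "result": "A poem..."}')
assert response["id"] == request["id"]  # correlate the async result with its call
```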
Specifications
This implementation follows the Prompt API specification and its Web IDL definitions for Chrome's built-in AI.
License
Apache License 2.0 - See LICENSE file for details
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Download files
File details
Details for the file wiki3_ai-0.5.1.tar.gz.
File metadata
- Download URL: wiki3_ai-0.5.1.tar.gz
- Upload date:
- Size: 56.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.2.1 CPython/3.14.0 Linux/6.11.0-1018-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `b05dce6080ac5e4de1e351d17c4c3a98dcc9401f1406cfc9f97276e54f515c2a` |
| MD5 | `13fde19bf062ffdf57c700dd3b543f3a` |
| BLAKE2b-256 | `9a6bd6f15a197062b45b6ec91d332b5d727ede13e3e9985762a7f06f2ff03bc1` |
File details
Details for the file wiki3_ai-0.5.1-py3-none-any.whl.
File metadata
- Download URL: wiki3_ai-0.5.1-py3-none-any.whl
- Upload date:
- Size: 16.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.2.1 CPython/3.14.0 Linux/6.11.0-1018-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `ccff044f7e575d1ea7e26b9bb3c12c2784a0dbb3ab9ae659b1ef23c09785337c` |
| MD5 | `db990576caba43049eaf26a5903dce60` |
| BLAKE2b-256 | `58ef86abfa01b45fe29802c78ccc59736dc666ebe5e235b3d91c643d0581ee60` |