Index
Index is the SOTA open-source browser agent for autonomously executing complex tasks on the web.
- Powered by reasoning LLMs with vision capabilities:
  - Claude 3.7 Sonnet with extended thinking (top-performing model)
  - OpenAI o4-mini
  - Gemini models (upcoming)
- `pip install lmnr-index` and use it in your project
- `index run` to run the agent in the interactive CLI
- Index is also available as a serverless API.
- You can also try out Index via the Chat UI, or fully self-host the chat UI.
- Supports advanced browser agent observability powered by the open-source platform Laminar.
prompt: go to ycombinator.com. summarize first 3 companies in the W25 batch and make new spreadsheet in google sheets.
Index API
The easiest way to use Index in production is via the serverless API. The Index API manages remote browser sessions, agent infrastructure, and browser observability. To get started, sign up and create a project API key. Read the docs to learn more.
Install Laminar
```shell
pip install lmnr
```
Use Index via API
```python
import asyncio

from lmnr import Laminar, AsyncLaminarClient

# You can also set the LMNR_PROJECT_API_KEY environment variable
# instead of passing the key explicitly.

# Initialize tracing
Laminar.initialize(project_api_key="your_api_key")

# Initialize the client
client = AsyncLaminarClient(api_key="your_api_key")

async def main():
    response = await client.agent.run(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    )
    print(response.result)

if __name__ == "__main__":
    asyncio.run(main())
```
Local Quick Start
Install dependencies
```shell
pip install lmnr-index

# Install playwright
playwright install chromium
```
Run the agent with CLI
You can run Index via interactive CLI. It features:
- Browser state persistence between sessions
- Follow-up messages with support for "give human control" action
- Real-time streaming updates
- Beautiful terminal UI using Textual
You can run the agent with the following command. Remember to set the API key for the selected model in the .env file.

```shell
index run
```
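For reference, a minimal `.env` might look like the sketch below. The variable names are the providers' conventional ones and are an assumption here — check the Index docs for the exact names it reads:

```
# .env — API key for whichever model you select in the CLI
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
```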
Output will look like this:
```
Loaded existing browser state
╭───────────────────── Interactive Mode ─────────────────────╮
│ Index Browser Agent Interactive Mode │
│ Type your message and press Enter. The agent will respond. │
│ Press Ctrl+C to exit. │
╰────────────────────────────────────────────────────────────╯
Choose an LLM model:
1. Claude 3.7 Sonnet (default)
2. OpenAI o4-mini
Select model [1/2] (1): 2
Using OpenAI model: o4-mini
Loaded existing browser state
Your message: go to lmnr.ai, summarize pricing page
Agent is working...
Step 1: Opening lmnr.ai
Step 2: Opening Pricing page
Step 3: Scrolling for more pricing details
Step 4: Scrolling back up to view pricing tiers
Step 5: Provided concise summary of the three pricing tiers
```
Run the agent with code
```python
import asyncio

from index import Agent, AnthropicProvider

async def main():
    llm = AnthropicProvider(
        model="claude-3-7-sonnet-20250219",
        enable_thinking=True,
        thinking_token_budget=2048,
    )
    # You can also use OpenAI models:
    # from index import OpenAIProvider
    # llm = OpenAIProvider(model="o4-mini")
    agent = Agent(llm=llm)
    output = await agent.run(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())
```
Stream the agent's output
```python
# Inside an async function, with `agent` constructed as above:
async for chunk in agent.run_stream(
    prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
):
    print(chunk)
```
Enable browser agent observability
To trace the Index agent's actions and record the browser session, simply initialize Laminar tracing before running the agent:

```python
from lmnr import Laminar

Laminar.initialize(project_api_key="your_api_key")
```

Then you get full observability into the agent's actions, synced with the browser session, in the Laminar platform.
Run with remote CDP url
```python
import asyncio

from index import Agent, AnthropicProvider, BrowserConfig

async def main():
    # Configure the browser to connect to an existing Chrome DevTools Protocol endpoint
    browser_config = BrowserConfig(
        cdp_url="<cdp_url>"
    )
    llm = AnthropicProvider(
        model="claude-3-7-sonnet-20250219",
        enable_thinking=True,
        thinking_token_budget=2048,
    )
    agent = Agent(llm=llm, browser_config=browser_config)
    output = await agent.run(
        prompt="Navigate to news.ycombinator.com and find the top story"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())
```
Customize browser window size
```python
import asyncio

from index import Agent, AnthropicProvider, BrowserConfig

async def main():
    # Configure the browser with a custom viewport size
    browser_config = BrowserConfig(
        viewport_size={"width": 1200, "height": 900}
    )
    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219")
    agent = Agent(llm=llm, browser_config=browser_config)
    output = await agent.run(
        "Navigate to a responsive website and capture how it looks in full HD resolution"
    )
    print(output.result)

if __name__ == "__main__":
    asyncio.run(main())
```
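Since `viewport_size` is a plain dict of pixel dimensions, you can derive it from common resolution presets with ordinary Python. The helper below is purely illustrative and not part of Index:

```python
# Map common resolution presets to the {"width": ..., "height": ...}
# dict shape used for viewport_size above.
PRESETS = {
    "hd": (1280, 720),
    "full-hd": (1920, 1080),
    "qhd": (2560, 1440),
}

def viewport(preset: str) -> dict:
    width, height = PRESETS[preset]
    return {"width": width, "height": height}

print(viewport("full-hd"))  # {'width': 1920, 'height': 1080}
```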
Made with ❤️ by the Laminar team
Project details
File details
Details for the file lmnr_index-0.1.6.tar.gz.
File metadata
- Download URL: lmnr_index-0.1.6.tar.gz
- Upload date:
- Size: 2.3 MB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 155a873d375bfce55803350a1ca13184d6c2a0023915052e2440ac58da40e9d9 |
| MD5 | f2c8d9ae180a6d59e99d960d7c5147f5 |
| BLAKE2b-256 | 00c0d97cb1d215081ac493293924db097c9007d1d75e9b00eff33dc0539d5ec3 |
Provenance

The following attestation bundles were made for lmnr_index-0.1.6.tar.gz:

Publisher: publish.yml on lmnr-ai/index

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: lmnr_index-0.1.6.tar.gz
- Subject digest: 155a873d375bfce55803350a1ca13184d6c2a0023915052e2440ac58da40e9d9
- Sigstore transparency entry: 199724526
- Sigstore integration time:
- Permalink: lmnr-ai/index@6687a7f483d505a42809d76c5f9a6718514560a0
- Branch / Tag: refs/tags/v0.1.6
- Owner: https://github.com/lmnr-ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@6687a7f483d505a42809d76c5f9a6718514560a0
- Trigger Event: push
File details
Details for the file lmnr_index-0.1.6-py3-none-any.whl.
File metadata
- Download URL: lmnr_index-0.1.6-py3-none-any.whl
- Upload date:
- Size: 1.3 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 13415a42b22f1ea8339b375a3f9c133d5cef614a6fb0e7c8bb6d715a86992317 |
| MD5 | f4854f45333bf60af18fd19d9167aac2 |
| BLAKE2b-256 | ec76c2592201435ef8c691e2516ae919561579e335a2fd158f7aa23b4a159b0d |
Provenance

The following attestation bundles were made for lmnr_index-0.1.6-py3-none-any.whl:

Publisher: publish.yml on lmnr-ai/index

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: lmnr_index-0.1.6-py3-none-any.whl
- Subject digest: 13415a42b22f1ea8339b375a3f9c133d5cef614a6fb0e7c8bb6d715a86992317
- Sigstore transparency entry: 199724527
- Sigstore integration time:
- Permalink: lmnr-ai/index@6687a7f483d505a42809d76c5f9a6718514560a0
- Branch / Tag: refs/tags/v0.1.6
- Owner: https://github.com/lmnr-ai
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@6687a7f483d505a42809d76c5f9a6718514560a0
- Trigger Event: push