
Index - SOTA browser AI agent for autonomous task execution on the web

Project description

Index

Index is a state-of-the-art browser agent that uses VLMs (vision-language models) to autonomously execute complex tasks on the web. Available as a Python package and as a hosted API.

Index API

The Index API is available on Laminar. It manages remote browser sessions and agent infrastructure, and is the recommended way to run AI browser automation in production. To get started, sign up and create a project API key.

Install Laminar

pip install lmnr
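As the example below notes in a comment, the project API key can also be supplied via the `LMNR_PROJECT_API_KEY` environment variable instead of being passed in code. A minimal sketch:

```shell
# Set the project API key for the current shell session
# (replace the placeholder with your actual key)
export LMNR_PROJECT_API_KEY="your_api_key"
```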

Use Index API

import asyncio

from lmnr import Laminar, AsyncLaminarClient
# you can also set the LMNR_PROJECT_API_KEY environment variable
# instead of passing the key explicitly

# Initialize tracing
Laminar.initialize(project_api_key="your_api_key")

# Initialize the client
client = AsyncLaminarClient(api_key="your_api_key")

async def main():

    # Run a task
    response = await client.agent.run(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    )

    # Print the result
    print(response.result)
    
if __name__ == "__main__":
    asyncio.run(main())

Local Quick Start

Install dependencies

pip install lmnr-index

# Install playwright
playwright install chromium

Run the agent

import asyncio
from lmnr_index import Agent, AnthropicProvider

async def main():
    # Initialize the LLM provider
    llm = AnthropicProvider(
            model="claude-3-7-sonnet-20250219",
            enable_thinking=True, 
            thinking_token_budget=2048)
    
    # Create an agent with the LLM
    agent = Agent(llm=llm)
    
    # Run the agent with a task
    output = await agent.run(
        "Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    )
    
    # Print the result
    print(output.result)
    
if __name__ == "__main__":
    asyncio.run(main())

Stream the agent's output

import asyncio
from lmnr_index import Agent, AnthropicProvider

async def main():
    agent = Agent(llm=AnthropicProvider(model="claude-3-7-sonnet-20250219"))

    # Stream the agent's output chunk by chunk as it works
    async for chunk in agent.run_stream(
        prompt="Navigate to news.ycombinator.com, find a post about AI, and summarize it"
    ):
        print(chunk)

if __name__ == "__main__":
    asyncio.run(main())

Run with remote CDP url

import asyncio
from lmnr_index import Agent, AnthropicProvider, Browser, BrowserConfig

async def main():
    # Configure browser to connect to an existing Chrome DevTools Protocol endpoint
    browser_config = BrowserConfig(
        cdp_url="ws://localhost:9222/devtools/browser/[session-id]"
    )
    
    # Create browser with the config
    browser = Browser(config=browser_config)
    
    # Initialize the LLM provider
    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219")
    
    # Create an agent with the LLM and browser
    agent = Agent(llm=llm, browser=browser)
    
    # Run the agent with a task
    output = await agent.run(
        "Navigate to news.ycombinator.com and find the top story"
    )
    
    # Print the result
    print(output.result)
    
if __name__ == "__main__":
    asyncio.run(main())
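To expose a CDP endpoint locally, Chrome can be launched with remote debugging enabled. A sketch, assuming a Linux-style `google-chrome` binary (the binary name and path vary by platform):

```shell
# Start Chrome headless with the DevTools Protocol on port 9222;
# the browser WebSocket URL is printed on startup and is also
# available from http://localhost:9222/json/version
google-chrome --headless=new --remote-debugging-port=9222
```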

Customize browser window size

import asyncio
from lmnr_index import Agent, AnthropicProvider, Browser, BrowserConfig

async def main():
    # Configure browser with custom viewport size
    browser_config = BrowserConfig(
        viewport_size={"width": 1920, "height": 1080}  # Full HD resolution
    )
    
    # Create browser with the config
    browser = Browser(config=browser_config)
    
    # Initialize the LLM provider
    llm = AnthropicProvider(model="claude-3-7-sonnet-20250219")
    
    # Create an agent with the LLM and browser
    agent = Agent(llm=llm, browser=browser)
    
    # Run the agent with a task
    output = await agent.run(
        "Navigate to a responsive website and capture how it looks in full HD resolution"
    )
    
    # Print the result
    print(output.result)
    
if __name__ == "__main__":
    asyncio.run(main())

Made with ❤️ by the Laminar team
