
GLLM Agents

Description

A library for managing agents in Generative AI applications.

Installation


1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-agents-binary

Using Poetry

poetry add gllm-agents-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-agents"

Managing Dependencies

  1. Go to the root folder of the gllm-agents module, e.g. cd libs/gllm-agents.
  2. Run poetry shell to spawn a shell inside the project's virtual environment.
  3. Run poetry lock to create a lock file if one does not exist yet.
  4. Run poetry install to install the gllm-agents requirements for the first time.
  5. Run poetry update after changing any dependency version in pyproject.toml.
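
The version bump in step 5 happens in pyproject.toml. An illustrative (hypothetical) dependency table might look like this; the package names and constraints below are not the actual manifest:

```toml
[tool.poetry.dependencies]
# Illustrative entries only, not the real gllm-agents manifest.
python = "^3.13"
some-dependency = "^1.2.0"  # bump this constraint, then run `poetry update`
```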

Contributing

Please refer to the Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate the pre-commit hooks with pre-commit install.
  2. Run poetry shell to spawn a shell inside the project's virtual environment.
  3. Run poetry lock to create a lock file if one does not exist yet.
  4. Run poetry install to install the gllm-agents requirements for the first time.
  5. Run which python to get the interpreter path to reference in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Try running the unit test to see if it's working:
poetry run pytest -s tests/unit_tests/
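
If the run in step 6 collects no tests, a minimal sanity test file helps verify that pytest itself is wired up. The file below is purely illustrative (the real tests live in tests/unit_tests/ and test library code, not this toy function):

```python
# tests/unit_tests/test_sanity.py (hypothetical sanity check, not part of the library)

def add(a: int, b: int) -> int:
    """Toy function standing in for library code under test."""
    return a + b

def test_add() -> None:
    # pytest discovers any function named test_* and runs its assertions.
    assert add(2, 3) == 5
```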

Hello World Examples

Prerequisites

  • Python 3.13+
  • Install the binary package:
pip install gllm-agents-binary
  • For OpenAI: Set your API key in the environment:
export OPENAI_API_KEY=your-openai-key
  • For Google ADK: Set your API key in the environment:
export GOOGLE_API_KEY=your-google-api-key
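
A quick preflight check helps catch a missing key before an example fails mid-run. This is a stdlib-only sketch, not part of the library:

```python
# Preflight check that the required API keys are exported.
import os

def missing_keys(required: list[str]) -> list[str]:
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

# OPENAI_API_KEY is needed for the OpenAI examples, GOOGLE_API_KEY for Google ADK.
absent = missing_keys(["OPENAI_API_KEY", "GOOGLE_API_KEY"])
if absent:
    print(f"Missing environment variables: {', '.join(absent)}")
```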

Run the Hello World Examples

The example scripts are located in the gllm_agents/examples directory within the library. You can run them individually or use the run_all_examples.py script.

1. Running Individual Examples:

Navigate to the library's root directory (e.g., libs/gllm-agents if you cloned the repository).

LangGraph (OpenAI):

python gllm_agents/examples/hello_world_langgraph.py

LangGraph with BOSA Connector (OpenAI):

python gllm_agents/examples/hello_world_langgraph_bosa_twitter.py

LangGraph Streaming (OpenAI):

python gllm_agents/examples/hello_world_langgraph_stream.py

LangGraph Multi-Agent Coordinator (OpenAI):

python gllm_agents/examples/hello_world_a2a_multi_agent_coordinator_server.py

Google ADK:

python gllm_agents/examples/hello_world_google_adk.py

Google ADK Streaming:

python gllm_agents/examples/hello_world_google_adk_stream.py

LangChain (OpenAI):

python gllm_agents/examples/hello_world_langchain.py

LangChain Streaming (OpenAI):

python gllm_agents/examples/hello_world_langchain_stream.py

2. Running MCP Examples

Prerequisites

Ensure you have set the environment variables for API keys:

export OPENAI_API_KEY="your-openai-key"
export GOOGLE_API_KEY="your-google-api-key"

For examples that use stateful MCP tools like browser automation, start the Playwright MCP server in a separate terminal:

npx @playwright/mcp@latest --headless --port 8931

Note: The --headless flag runs the server without a visible browser window; this is recommended when no browser is installed yet, as it avoids launch failures. To use a visible (non-headless) browser, refer to the Playwright MCP documentation.
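
Before running the client examples, it can save time to confirm the server is actually accepting connections. A stdlib-only sketch (the host/port values mirror the command above):

```python
# Check whether a local MCP server (e.g. Playwright on port 8931) is accepting
# TCP connections before launching the client examples.
import socket

def is_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP server accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if not is_listening("127.0.0.1", 8931):
        print("Playwright MCP server not reachable; start it with "
              "`npx @playwright/mcp@latest --headless --port 8931`.")
```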

Local MCP Servers

For STDIO, SSE, and HTTP transports using local servers, open a terminal in the library root (libs/gllm-agents) and run:

  • For STDIO:
poetry run python gllm_agents/examples/mcp_servers/mcp_server_stdio.py
  • For SSE:
poetry run python gllm_agents/examples/mcp_servers/mcp_server_sse.py
  • For HTTP:
poetry run python gllm_agents/examples/mcp_servers/mcp_server_http.py

Note: Start the appropriate server before running the client examples for that transport.
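
Conceptually, an MCP STDIO server is a process that reads JSON-RPC requests on stdin and writes responses to stdout. The stripped-down, stdlib-only sketch below illustrates that message loop only; the actual example servers use a real MCP implementation, and the "echo" method here is invented for illustration:

```python
# Minimal newline-delimited JSON-RPC loop, illustrating the STDIO transport idea.
import json
import sys

def handle_message(raw: str) -> str:
    """Dispatch one JSON-RPC request and return the serialized response."""
    request = json.loads(raw)
    if request.get("method") == "echo":
        result = request.get("params", {}).get("text", "")
    else:
        result = f"unknown method: {request.get('method')}"
    return json.dumps({"jsonrpc": "2.0", "id": request.get("id"), "result": result})

def serve(stdin=sys.stdin, stdout=sys.stdout) -> None:
    """Read requests line by line until EOF, writing one response per line."""
    for line in stdin:
        line = line.strip()
        if line:
            stdout.write(handle_message(line) + "\n")
            stdout.flush()
```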

Running Examples

All examples are run from the library root using poetry run python gllm_agents/examples/<file>.py. Examples support OpenAI for LangGraph/LangChain and Google ADK where specified.

LangChain Examples

STDIO Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_stdio.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_stdio_stream.py
SSE Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_sse.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_sse_stream.py
HTTP Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_http.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_http_stream.py

Google ADK Examples

STDIO Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_google_adk_mcp_stdio.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_google_adk_mcp_stdio_stream.py
SSE Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_google_adk_mcp_sse.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_google_adk_mcp_sse_stream.py
HTTP Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_google_adk_mcp_http.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_google_adk_mcp_http_stream.py

LangGraph Examples (OpenAI)

STDIO Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_langgraph_mcp_stdio.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_langgraph_mcp_stdio_stream.py
SSE Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_langgraph_mcp_sse.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_langgraph_mcp_sse_stream.py
HTTP Transport
  • Non-Streaming:
poetry run python gllm_agents/examples/hello_world_langgraph_mcp_http.py
  • Streaming:
poetry run python gllm_agents/examples/hello_world_langgraph_mcp_http_stream.py

Multi-Server Example

This LangChain example uses multiple MCP servers: Playwright (for browser actions) and a random name generator (SSE transport), maintaining persistent sessions across multiple arun calls.

  1. Start the Playwright server:
npx @playwright/mcp@latest --headless --port 8931
  2. In another terminal, start the Name Generator SSE server:
poetry run python gllm_agents/examples/mcp_servers/mcp_name.py
  3. Run the multi-server client example:
poetry run python gllm_agents/examples/hello_world_langchain_mcp_multi_server.py

3. Running Individual A2A Examples:

  • Navigate to the library's root directory (e.g., libs/gllm-agents if you cloned the repository).
  • Open a new terminal and navigate to the gllm_agents/examples directory to run the A2A server.

LangChain Server:

python hello_world_a2a_langchain_server.py
  • Open a new terminal and navigate to the gllm_agents/examples directory to run the A2A client.

LangChain Client:

python hello_world_a2a_langchain_client.py

LangChain Client Integrated with Agent Workflow:

python hello_world_a2a_langchain_client_agent.py

LangChain Client Streaming:

python hello_world_a2a_langchain_client_stream.py

Architectural Notes

Agent Interface (AgentInterface)

The gllm_agents.agent.interface.AgentInterface class defines a standardized contract for all agent implementations within the GLLM Agents ecosystem. It ensures that different agent types (e.g., LangGraph-based, Google ADK-based) expose a consistent set of methods for core operations.

Key methods defined by AgentInterface typically include:

  • arun(): For asynchronous execution of the agent that returns a final consolidated response.
  • arun_stream(): For asynchronous execution that streams back partial responses or events from the agent.

By adhering to this interface, users can interact with various agents in a uniform way, making it easier to switch between or combine different agent technologies.
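
As an illustration of this contract, the sketch below uses the two method names documented above; everything else, including the exact signatures and return types, is an assumption, and EchoAgent is a toy stand-in rather than a real agent:

```python
# Hypothetical sketch of the AgentInterface contract; real signatures may differ.
import asyncio
from abc import ABC, abstractmethod
from typing import Any, AsyncIterator

class AgentInterface(ABC):
    @abstractmethod
    async def arun(self, query: str, **kwargs: Any) -> dict[str, Any]:
        """Run the agent and return a final consolidated response."""

    @abstractmethod
    def arun_stream(self, query: str, **kwargs: Any) -> AsyncIterator[Any]:
        """Run the agent, yielding partial responses or events."""

class EchoAgent(AgentInterface):
    """Toy implementation showing the uniform calling convention."""

    async def arun(self, query: str, **kwargs: Any) -> dict[str, Any]:
        return {"output": f"echo: {query}"}

    async def arun_stream(self, query: str, **kwargs: Any) -> AsyncIterator[str]:
        for token in query.split():
            yield token

async def demo() -> list[str]:
    agent = EchoAgent()
    final = await agent.arun("hello world")
    tokens = [t async for t in agent.arun_stream("hello world")]
    return [final["output"], *tokens]
```

Any implementation honoring these two methods can be swapped in wherever an AgentInterface is expected.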

Inversion of Control (IoC) / Dependency Injection (DI)

The agent implementations (e.g., LangGraphAgent, GoogleADKAgent) utilize Dependency Injection. For instance, LangGraphAgent accepts an agent_executor (like one created by LangGraph's create_react_agent) in its constructor. Similarly, GoogleADKAgent accepts a native adk_native_agent. This allows the core execution logic to be provided externally, promoting flexibility and decoupling the agent wrapper from the specific instantiation details of its underlying engine.
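
The injection pattern can be sketched as follows. The class and parameter names mirror the description above, but the executor protocol (a simple async callable) is assumed purely for illustration; in the real library that role is played by e.g. a LangGraph graph from create_react_agent:

```python
# Hypothetical sketch of constructor-based dependency injection.
import asyncio
from typing import Any, Awaitable, Callable

Executor = Callable[[str], Awaitable[str]]  # assumed protocol for this sketch

class LangGraphAgentSketch:
    """Wrapper that receives its execution engine via the constructor (DI)."""

    def __init__(self, agent_executor: Executor) -> None:
        self._executor = agent_executor  # injected, not constructed here

    async def arun(self, query: str) -> dict[str, Any]:
        return {"output": await self._executor(query)}

async def fake_executor(query: str) -> str:
    """Stand-in engine; any conforming executor can be swapped in."""
    return query.upper()

result = asyncio.run(LangGraphAgentSketch(fake_executor).arun("hello"))
```

Because the wrapper never builds its engine itself, tests can inject a stub executor and production code can inject the real graph, without changing the wrapper.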


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

  • gllm_agents_binary-0.4.16.post1-cp313-cp313-manylinux_2_31_x86_64.whl (3.2 MB): CPython 3.13, manylinux (glibc 2.31+), x86-64
  • gllm_agents_binary-0.4.16.post1-cp313-cp313-macosx_13_0_arm64.macosx_15_0_arm64.whl (2.5 MB): CPython 3.13, macOS 13.0+ / 15.0+, ARM64
  • gllm_agents_binary-0.4.16.post1-cp312-cp312-win_amd64.whl (2.2 MB): CPython 3.12, Windows, x86-64
  • gllm_agents_binary-0.4.16.post1-cp312-cp312-manylinux_2_31_x86_64.whl (3.2 MB): CPython 3.12, manylinux (glibc 2.31+), x86-64
  • gllm_agents_binary-0.4.16.post1-cp312-cp312-macosx_13_0_x86_64.whl (3.0 MB): CPython 3.12, macOS 13.0+, x86-64
  • gllm_agents_binary-0.4.16.post1-cp311-cp311-manylinux_2_31_x86_64.whl (2.9 MB): CPython 3.11, manylinux (glibc 2.31+), x86-64

File details

Hashes for gllm_agents_binary-0.4.16.post1-cp313-cp313-manylinux_2_31_x86_64.whl:
  • SHA256: 012c7d86c6afdebc61d86017777688e911df903951325f4ca5acb4c3f134abda
  • MD5: 472cf7e99f212e39ece46adf4b2d7720
  • BLAKE2b-256: 6202576b3a4f81531ea8fdbf19a2d46c07b598ec08e708e23e7391907e5415d0

Hashes for gllm_agents_binary-0.4.16.post1-cp313-cp313-macosx_13_0_arm64.macosx_15_0_arm64.whl:
  • SHA256: 15754a73ac136feb8d83519c20066551df3324dbaac7184db16ef7beeb9876cd
  • MD5: 6caac6b9e80c4f2fad3d07fa9a3863b4
  • BLAKE2b-256: eae8e816283caa72df0c053423b93c6f63fe8824ce2ce460430191870755ba03
  • Provenance: attestation published by build-binary.yml on GDP-ADMIN/gl-sdk

Hashes for gllm_agents_binary-0.4.16.post1-cp312-cp312-win_amd64.whl:
  • SHA256: 5f12b266b5d2e3620c2df2c5985432f234330a418c3d1001972128faa2f45b72
  • MD5: e1d2fcce1bcda36b0cf95b2d76d3750a
  • BLAKE2b-256: ae145392303f2138ac32322bbf7d29296e78a66de099ff73bbc241c515cc620c
  • Provenance: attestation published by build-binary.yml on GDP-ADMIN/gl-sdk

Hashes for gllm_agents_binary-0.4.16.post1-cp312-cp312-manylinux_2_31_x86_64.whl:
  • SHA256: da4995ec7243738d277356837ed39bfa079c7cdfe7cfccfbb2b044900637c81e
  • MD5: 3861402ede73314c5cdec905ae64408a
  • BLAKE2b-256: 144e828f338573663ef365490aa82d06f216203ec3c833ce26b63aae58ac6808

Hashes for gllm_agents_binary-0.4.16.post1-cp312-cp312-macosx_13_0_x86_64.whl:
  • SHA256: d7ad3bba9b89d5464366aa253a8198693e6151d1d1abe4375a0a1491981f59ca
  • MD5: e2a43e192ac1b58fefd6f268eec3c17a
  • BLAKE2b-256: 4972c5975d11b9b4a6fd2354830ac5166894da45ac68f4ac519b3efb1c76cadf
  • Provenance: attestation published by build-binary.yml on GDP-ADMIN/gl-sdk

Hashes for gllm_agents_binary-0.4.16.post1-cp311-cp311-manylinux_2_31_x86_64.whl:
  • SHA256: 387ceb48327d98e387a1040e3795d3f0025fea71cb71d442ae4ba183c5a28ad2
  • MD5: 05d11712c0ab669f9cdfa36429c369c2
  • BLAKE2b-256: e26760cdec98e5f87a875edca9149cc3a6856cf40dcd0ec1de1036279e327d53
