
Client for joinly: Make your meetings accessible to AI Agents

Project description

joinly-client: Client for a conversational meeting agent used with joinly

Prerequisites

Set LLM API key

Export a valid API key for the LLM provider you want to use, e.g. OpenAI:

export OPENAI_API_KEY="sk-..."

Or, create a .env file in the current directory with the following content:

OPENAI_API_KEY="sk-..."

For other providers, export the corresponding environment variable(s) and select the provider and model via command-line options:

uvx joinly-client --llm-provider <provider> --llm-model <model> <MeetingUrl>

Start joinly server

Make sure you have a running joinly server. You can start it with:

docker run -p 8000:8000 ghcr.io/joinly-ai/joinly:latest

For more details on joinly, see the GitHub repository: joinly-ai/joinly.

Command line usage

We recommend using uv to run the client; you can install it with the command from its repository.

Connect to a running joinly server and join a meeting, here loading environment variables from a .env file:

uvx joinly-client --joinly-url http://localhost:8000/mcp/ --env-file .env <MeetingUrl>

Add other MCP servers using a configuration file:

{
    "mcpServers": {
        "localServer": {
            "command": "npx",
            "args": ["-y", "package@0.1.0"]
        },
        "remoteServer": {
            "url": "http://mcp.example.com",
            "auth": "oauth"
        }
    }
}

uvx joinly-client --mcp-config config.json <MeetingUrl>

You can also pass other session-specific settings to the joinly server, e.g.:

uvx joinly-client --tts elevenlabs --tts-arg voice_id=EXAVITQu4vr4xnSDxMa6 --lang de <MeetingUrl>

For a full list of command line options, run:

uvx joinly-client --help

Code usage

Direct use of run function:

import asyncio

from dotenv import load_dotenv
from joinly_client import run

load_dotenv()


async def async_run():
    await run(
        joinly_url="http://localhost:8000/mcp/",
        meeting_url="<MeetingUrl>",
        llm_provider="openai",
        llm_model="gpt-4o-mini",
        prompt="You are joinly, a...",
        name="joinly",
        name_trigger=False,
        mcp_config=None,  # MCP servers configuration (dict)
        settings=None,  # settings propagated to joinly server (dict)
    )


if __name__ == "__main__":
    asyncio.run(async_run())
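The mcp_config parameter above accepts the same structure as the JSON configuration file from the command-line section, passed as a Python dict. A minimal sketch (the server names and package spec are placeholder values taken from that example):

```python
# Same shape as the --mcp-config JSON file, expressed as a Python dict.
mcp_config = {
    "mcpServers": {
        "localServer": {
            "command": "npx",
            "args": ["-y", "package@0.1.0"],
        },
        "remoteServer": {
            "url": "http://mcp.example.com",
            "auth": "oauth",
        },
    }
}

# Passed to run() in place of mcp_config=None:
# await run(..., mcp_config=mcp_config)
print(sorted(mcp_config["mcpServers"]))  # -> ['localServer', 'remoteServer']
```

The settings parameter works the same way: a plain dict that is propagated to the joinly server.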

Or use only the client together with a custom agent:

import asyncio

from joinly_client import JoinlyClient
from joinly_client.types import TranscriptSegment


async def run():
    client = JoinlyClient(
        url="http://localhost:8000/mcp/",
        name="joinly",
        name_trigger=False,
        settings=None,
    )

    async def on_utterance(segments: list[TranscriptSegment]) -> None:
        for segment in segments:
            print(f"Received utterance: {segment.text}")
            if "marco" in segment.text.lower():
                await client.speak_text("Polo!")

    client.add_utterance_callback(on_utterance)

    async with client:
        # Optionally, load all tools from the server; these can be passed
        # to the LLM (e.g., for the LangChain MCP adapter, use client.client.session)
        tool_list = await client.client.list_tools()

        await client.join_meeting("<MeetingUrl>")
        try:
            await asyncio.Event().wait()  # wait until cancelled
        finally:
            print(await client.get_transcript())  # print the final transcript


if __name__ == "__main__":
    asyncio.run(run())
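The keyword check inside on_utterance can be factored into a small pure helper, which makes the trigger logic easy to unit-test without a running server. A sketch; should_reply is a hypothetical helper, not part of joinly_client:

```python
def should_reply(text: str, keyword: str = "marco") -> bool:
    """Return True when the keyword occurs in the utterance (case-insensitive)."""
    return keyword in text.lower()

print(should_reply("Marco?"))       # -> True
print(should_reply("Hello there"))  # -> False
```

Inside the callback, this replaces the inline check: `if should_reply(segment.text): await client.speak_text("Polo!")`.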



Download files

Download the file for your platform.

Source Distribution

joinly_client-0.1.13.tar.gz (16.4 kB)

Built Distribution

joinly_client-0.1.13-py3-none-any.whl (18.5 kB)

File details

Details for the file joinly_client-0.1.13.tar.gz.

File metadata

  • Size: 16.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: uv/0.8.13

File hashes

Hashes for joinly_client-0.1.13.tar.gz:

  Algorithm    Hash digest
  SHA256       f5cbf34a121d1d0bc53fdb22426a53275a6f02f9bfa83ece379598a4c2af5ad5
  MD5          f6c876b1e32b1e4d25b3e4ee91ea02e0
  BLAKE2b-256  c1490f5a75a7850c5c2de935e4f80a7c9a04aa8d0f1c38addf025e49cc707edc


File details

Details for the file joinly_client-0.1.13-py3-none-any.whl.

File hashes

Hashes for joinly_client-0.1.13-py3-none-any.whl:

  Algorithm    Hash digest
  SHA256       93217c30b8dbc5258b9987f65454205dae875ac598e9bd988692fe42b78fb088
  MD5          f67b57eeda9b5b764521676b0656c5ea
  BLAKE2b-256  9630f83cdaeb4c1961ce7025a777df51d046e30f8c20a8a69c9c1b5d496d6442

