
LLM agent to search within a graph

Project description

code-base-agent

Introduction

This repo introduces a method to represent a local code repository as a graph structure. The objective is to let an LLM traverse this graph to understand the code's logic and flow, giving it the power to debug, refactor, and optimize queries. However, several tasks remain unexplored.

Technology Stack

We use a combination of llama-index (its CodeHierarchy module) and tree-sitter-languages to parse code into a graph structure, Neo4j to store and query the graph data, and langchain to create the agents.
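To give a sense of the parsing stage, here is a minimal sketch (not part of this package's API, and the example functions are made up) of how tree-sitter-languages turns a Python snippet into a syntax tree; the CodeHierarchy parser builds on trees like this to produce the function and class nodes that end up in the graph:

from tree_sitter_languages import get_parser

parser = get_parser("python")
source = b"""
def run():
    return helper()

def helper():
    return 42
"""
tree = parser.parse(source)

# List the top-level function definitions; in the real pipeline these become
# graph nodes, and call sites become relationships between them.
for child in tree.root_node.children:
    if child.type == "function_definition":
        name = child.child_by_field_name("name")
        print(source[name.start_byte:name.end_byte].decode(), child.start_point)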

Installation

Install the package:

pip install blar-graph

Set the environment variables:

NEO4J_URI=neo4j+s://YOUR_NEO4J.databases.neo4j.io
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=YOUR_NEO4J_PASSWORD
OPENAI_API_KEY=YOUR_OPEN_AI_KEY

If you are new to Neo4j, you can deploy a free instance with Neo4j Aura. You can also host your own instance on AWS or GCP.
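If you keep these values in a local .env file, a common way to load them before running the examples below is python-dotenv (just a convenience, not a dependency of this package):

# Optional: load the variables above from a .env file.
# Requires `pip install python-dotenv`.
from dotenv import load_dotenv

load_dotenv()  # reads NEO4J_URI, NEO4J_USERNAME, NEO4J_PASSWORD, OPENAI_API_KEY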

Quick start guide

To build the graph, you have to instantiate the graph manager and constructor. The graph manager handles the connection with Neo4j, and the graph constructor processes the directory input to create the graph.

import traceback
import uuid

from blar_graph.db_managers import Neo4jManager
from blar_graph.graph_construction.core.graph_builder import GraphConstructor

# Identifiers for this graph; the example simply uses random UUIDs
repoId = str(uuid.uuid4())
entityId = str(uuid.uuid4())
graph_manager = Neo4jManager(repoId, entityId)

try:
    graph_constructor = GraphConstructor(graph_manager)
    # Point this at the root of the repository you want to index
    graph_constructor.build_graph("YOUR_LOCAL_DIRECTORY")
    graph_manager.close()
except Exception as e:
    print(e)
    print(traceback.format_exc())
    graph_manager.close()
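Once the build finishes, you can sanity-check the result directly in the Neo4j browser or with the official Python driver. The query below is deliberately schema-agnostic, since the exact node labels and relationship types are defined by the constructor; adjust it to what you see in your database:

# Hedged sketch: count nodes per label with the official neo4j driver
# (`pip install neo4j`), reusing the same credentials as above.
import os
from neo4j import GraphDatabase

driver = GraphDatabase.driver(
    os.environ["NEO4J_URI"],
    auth=(os.environ["NEO4J_USERNAME"], os.environ["NEO4J_PASSWORD"]),
)
with driver.session() as session:
    for record in session.run("MATCH (n) RETURN labels(n) AS labels, count(*) AS count"):
        print(record["labels"], record["count"])
driver.close()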

Now you can use our agent tools, or build your own, to create agents that resolve specific tasks. In the folder 'agents_tools' you will find all our tools (for now, just the keyword search) and examples of agent implementations. For example, a debugger agent could look like this:

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.agents.format_scratchpad.openai_tools import (
    format_to_openai_tool_messages,
)
from langchain.agents.output_parsers.openai_tools import (
    OpenAIToolsAgentOutputParser,
)
from blar_graph.agents_tools.tools.KeywordSearchTool import KeywordSearchTool
from blar_graph.db_managers.base_manager import BaseDBManager
from langchain.agents import AgentExecutor
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4-turbo-preview", temperature=0)

system_prompt = """
    You are a code debugger. Given a problem description and an initial function, you need to find the bug in the code.
    You are given a graph of code functions.
    We purposely omitted some code; such omissions are marked with the comment '# Code replaced for brevity. See node_id ..... '.
    You can traverse the graph by calling the function keyword_search.
    Prefer calling keyword_search with query = node_id; only call it with starting nodes or neighbours.
    Explain why your solution solves the bug. Extensively traverse the graph before giving an answer.
"""


prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            system_prompt,
        ),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)

tools = [KeywordSearchTool(db_manager=graph_manager)]
llm_with_tools = llm.bind_tools(tools)

agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_to_openai_tool_messages(
            x["intermediate_steps"]
        ),
    }
    | prompt
    | llm_with_tools
    | OpenAIToolsAgentOutputParser()
)

Wrap the agent in the AgentExecutor imported above so its tool calls are actually executed, then ask it to perform a debugging process.

agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

list(
    agent_executor.stream(
        {
            "input": """
            The directory nodes generate multiple connections;
            they don't distinguish between different directories, can you fix it?
            The initial function is run
            """
        }
    )
)
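In recent LangChain versions, each chunk yielded by AgentExecutor.stream carries either the intermediate actions ("actions"), the tool observations ("steps"), or the final result ("output"). A minimal sketch for printing just the final answer, reusing the agent_executor defined above:

# Print only the final answer; intermediate chunks carry "actions" and "steps".
for chunk in agent_executor.stream({"input": "What does the function run do?"}):
    if "output" in chunk:
        print(chunk["output"])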

You can find more examples in the folder 'examples'. They are comprehensive Jupyter notebooks that guide you from creating the graph to deploying the agent.

Note: the supported languages for now are Python, JavaScript, and TypeScript. We will add C and C++ (or other languages) if enough people ask for them, so don't hesitate to reach out through the issues or directly at benjamin@blar.io or jose@blar.io.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

blar_graph-1.0.3.tar.gz (22.8 kB)

Uploaded Source

Built Distribution

blar_graph-1.0.3-py3-none-any.whl (29.9 kB)

Uploaded Python 3

File details

Details for the file blar_graph-1.0.3.tar.gz.

File metadata

  • Download URL: blar_graph-1.0.3.tar.gz
  • Upload date:
  • Size: 22.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.2 CPython/3.10.12 Linux/6.5.0-41-generic

File hashes

Hashes for blar_graph-1.0.3.tar.gz

  • SHA256: 4e06225afb4d357df88c45c9cd29910e4edb74db20840bd206e690fc32200c7f
  • MD5: 126f3bcf8ca7ce8d0191e2000a7bab53
  • BLAKE2b-256: 2035910d5e62ebcca5852090c0705cdb71da84159905cee6a20f4afbebfa398a
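If you want to verify a downloaded archive against the SHA256 digest above, one way to compute it with only the standard library:

# Compute the SHA256 of the downloaded sdist and compare it with the digest listed above.
import hashlib

with open("blar_graph-1.0.3.tar.gz", "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())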


File details

Details for the file blar_graph-1.0.3-py3-none-any.whl.

File metadata

  • Download URL: blar_graph-1.0.3-py3-none-any.whl
  • Upload date:
  • Size: 29.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.2 CPython/3.10.12 Linux/6.5.0-41-generic

File hashes

Hashes for blar_graph-1.0.3-py3-none-any.whl

  • SHA256: 5e040e6b818db13a0b606b1401ac2322804249f77ae4c0273daf6d565085a290
  • MD5: 6aea44459c5e6412e6fab1c20f3a9bbc
  • BLAKE2b-256: c1cb80c560102ad3bce89008b3c9a9bd599061fa72be533a5e292c768986024b

