
The First AI-Based Medical Agent Designed to Automate All Medical Processes.

Project description

LUCI

Luci is the first AI-based medical agent designed to automate a wide range of healthcare processes. It helps professionals handle everything from generating SOAP notes to performing agentic searches, querying GPT models, and creating personalized AI models. Luci also supports advanced tasks such as generating interconnected master prompts and running custom text or image searches, all through simple Python calls.

Installation

pip install LuciAI

Usage

After installing Luci, open your terminal and use the luci-cli command.

Ask Any Query

luci-cli cerina "Your Query?"

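For example, a concrete query (the question below is only an illustrative placeholder) looks like:

luci-cli cerina "What are the common symptoms of type 2 diabetes?"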

Agentic Search

luci-cli search "COVID-19 symptoms" --type text --async-mode --max-results 5


You can switch between text and image searches, and toggle asynchronous mode with the --async-mode flag; see the example after this list.

  • Set --type to text or image
  • Add --async-mode to run the search asynchronously
  • Omit --async-mode to run it synchronously
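For instance, a synchronous image search using the same flags (the query string is just an example):

luci-cli search "chest X-ray pneumonia" --type image --max-results 3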

Models

Choose your model and find its corresponding method below.

MODEL          FUNCTION NAME      METHOD TYPE
TOGETHER       call_together      ASYNC
OPENAI         call_gpt           SYNC
OPENAI         call_async_gpt     ASYNC
AZURE OPENAI   call_azure         ASYNC
AZURE OPENAI   call_azure_sync    SYNC

from Luci.Models.model import ChatModel

# Initialize the ChatModel class
chat_model = ChatModel(
    model_name="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",  # Specify your model name
    api_key="",  # Your OpenAI API key
    prompt="What is the weather today?"
    #SysPrompt = "You can mention your system prompt here"
)

# Call the method that generates a response
result = chat_model.call_together()

# Print the result
print(result)
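If the methods marked ASYNC in the table are coroutines, they would need to be awaited. A minimal sketch for call_async_gpt under that assumption (the OpenAI model name below is a placeholder, not a package default):

import asyncio
from Luci.Models.model import ChatModel

async def main():
    chat_model = ChatModel(
        model_name="gpt-4o-mini",  # Placeholder OpenAI model name
        api_key="",                # Your OpenAI API key
        prompt="Summarize the key symptoms of influenza."
    )
    # Assumption: the ASYNC methods from the table are awaited like this
    result = await chat_model.call_async_gpt()
    print(result)

asyncio.run(main())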

Search Any Query

from Luci.Agents import Search

query = "Revmaxx LLC"
search_instance = Search(query)

results = search_instance.search_text(max_results=5)
search_instance.print_text_result(results)

Get the JSON Response of the Search Query

from Luci import search_text

query="Recent trends of AI?"

print(search_text(query, max_results=3))

Search any Image

from Luci import Search

query = "Your Query ?"
search_instance = Search(query)

results = search_instance.search_images(max_results=5)
search_instance.print_text_result(results)

Generate SOAP Notes through CLI

luci-cli soap --model "your_model" --api-key "your_api_key" --subjective "..." --objective "..." --assessment "..." --plan "..." --master-prompt "..." --connected

Generate SOAP Notes

from Luci.Agents.soap import *

model_name = "" # Mention the model name
api_key = # or pass directly from user input

t = "Fetch transcript from a audio of doctor & patient conversations"   
S = f"""Generate a subjective from the transcript {t} by mentioning the heading ## Subjective under sub headings **Chief Complaints**, **HPI** """
O = f"Find the objectives from the transcript {t} by mentioning the heading ## Objective under sub headings **Vitals** and **History**"
A = f"Find the assessment from the transcript {t} by mentioning the heading ## Assessment under a list of valid assessments"
P = f"Generate plan from the transcript {t} by mentioning the heading ## Plan under a list of valid plans"
M = "Generate a detailed SOAP note based on the inputs. Follow headings and sub-headings carefully"

    # Create an instance of the SoapAgent class
soap_agent = SoapAgent(model_name=model_name, api_key=api_key, connected=True)
soap_agent.create_soap(S, O, A, P, M)
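The connected flag mirrors the --connected option in the CLI command above. A minimal sketch of the same call without it, assuming connected=False simply generates the sections independently (not confirmed by the docs):

# Assumption: connected=False builds each SOAP section without shared context
soap_agent = SoapAgent(model_name=model_name, api_key=api_key, connected=False)
soap_agent.create_soap(S, O, A, P, M)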

Voice Documentation

The voice documentation agent provides voice dictation for clinical notes, transcribing speech to text and structuring it into standardized formats such as SOAP notes.

from Luci.Agents.voice_documentation_agent import VoiceDocumentationAgent

agent = VoiceDocumentationAgent(
    model_name='',  # Replace with your model name
    api_key='',   # Replace with your Together API key
    method='call_together'  # Mention the method here
)
agent.run()

CLI Example

luci-cli voice_doc --model-name <MODEL_NAME> --api-key <API_KEY> [--method <METHOD>] [--stop-word <STOP_WORD>]
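A concrete invocation with the placeholders filled in (the model name and stop word below are example values, not defaults):

luci-cli voice_doc --model-name "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo" --api-key "YOUR_API_KEY" --method call_together --stop-word "stop"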

Medical Search Agent

Search for new papers on PubMed through Luci!

from Luci.Agents.medica_search_agent import MedicalSearchAgent

email = "wbavishek@gmail.com"
max_results = 5
query = "diabetes treatment guidelines"

agent = MedicalSearchAgent(email=email, max_results=max_results)
articles = agent.search(query)
agent.print_results(articles)

CLI Example

luci-cli medsearch "diabetes treatment guidelines" --email "wbavishekbhattacharjee@gmail.com" --max-results 2

Run Agents

from Luci import Agent, SearchTool
import os  # Import os to set environment variables

# User-defined parameters
query = "diabetes treatment guidelines"  # Final Query
model = "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo"  # Choose a model name
method = "call_together"  # Choose a ChatModel method (corrected)
email = "wbavishek@gmail.com"

# Set the API_KEY environment variable
os.environ['API_KEY'] = ""

# Make a Research Agent
research_agent = Agent.built(
    name="ResearchAgent",
    objective="Gather the latest research on diabetes treatment guidelines.",
    task=query,  # The research task/query
    precautions="Do not hallucinate information; only use reputable medical journals and sources.",
    tool=SearchTool(email=email)
)

# Make a Writer Agent
writer = Agent.built(
    name="WriterAgent",
    objective="Compose a comprehensive summary based on the research findings.",
    task="Summarize the diabetes treatment guidelines based on the provided research.",
    precautions="Maintain medical accuracy and clarity.",
    tool=None  # No specific tool needed for writing
)

# Connect the Writer Agent to the Research Agent
research_agent.connect_agent('writer_agent', writer)

# Generate the final answer using the connected Writer Agent 
# This uses the Research Agent's task as the query for the Writer Agent
final_answer = research_agent.generate_final_answer(model, method, query)

print("Final Answer:")
print(final_answer)

CLI Example to build Optimized Agent YAML

luci-cli refined_agent --name "Research" --objective "Building an open-source AI medical agent framework and how it can help my company focused on AI medical scribing" --precautions "Avoid using outdated information" --task "AI Medical Agent framework experience"

Change the values of --name, --objective, --precautions, and --task.

CLI Example to Run Agents Automatically from YAML

luci-cli run_agent --yaml-path "agents_configs/Research_research.yaml" --model "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo" --method "call_together"

Change --yaml-path to your YAML path, --model to your model name, and --method to your method name.

Chain of Thoughts

import asyncio
from Luci import create_master_prompt

async def main():
    # Define API key and model
    api_key = ""
    model_name = "cerina"  # Could be "together" or "cerina"

    # Define your list of prompts
    prompts = [
        "Generate a summary of the latest AI research.",
        "Explain the benefits of using AI in healthcare."
    ]

    # Generate master prompt using the standalone function
    master_prompt = await create_master_prompt(
        api_key=api_key,
        model_name=model_name,
        prompts=prompts,
        connected=True,  # Set to True to enable passing info between prompts
        search=True  # Set to True to enable search functionality
    )

    print("Generated Master Prompt:", master_prompt)

# Run the async main function
if __name__ == "__main__":
    asyncio.run(main())
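The connected and search flags can also be turned off; a minimal sketch of an unconnected, search-free call, assuming the remaining arguments behave as in the example above:

import asyncio
from Luci import create_master_prompt

async def simple_example():
    master_prompt = await create_master_prompt(
        api_key="",
        model_name="together",  # "together" or "cerina", per the example above
        prompts=["Draft patient discharge instructions for mild asthma."],
        connected=False,        # Prompts handled independently (assumption)
        search=False            # No search augmentation
    )
    print(master_prompt)

asyncio.run(simple_example())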

Contributing

We welcome contributions to LuciAI! If you're interested in improving the project, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes.
  4. Create a pull request.

License

License: RevMaxx LLC Software License

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

luciai-0.3.0.tar.gz (19.2 kB)

Uploaded Source

Built Distribution

LuciAI-0.3.0-py3-none-any.whl (25.6 kB)

Uploaded Python 3

File details

Details for the file luciai-0.3.0.tar.gz.

File metadata

  • Download URL: luciai-0.3.0.tar.gz
  • Upload date:
  • Size: 19.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.8

File hashes

Hashes for luciai-0.3.0.tar.gz
Algorithm Hash digest
SHA256 fcafc06f5eb69131b3963c6c4fc05415912c3cd09acc47620945c0039c85df48
MD5 6a72e511cb051968cf5f3d492a2aca19
BLAKE2b-256 4b5dfff97cc9c5fc1de382bdcdc135f7a69bdf598b4866e2f81f1e1c812d30c5

See more details on using hashes here.

File details

Details for the file LuciAI-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: LuciAI-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 25.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.8

File hashes

Hashes for LuciAI-0.3.0-py3-none-any.whl
Algorithm Hash digest
SHA256 1cc18eca3faffae89bf1cafc8556cfb862242a242809dce979b362a930130442
MD5 baef91fd88f7cc354c1d06b849751fe9
BLAKE2b-256 ac5ecb7e6301bd854434c40db44d917ecf1883ef524bf3b31f04c26700acb39d

See more details on using hashes here.
