
Project description

  1. Install via pip: pip install crewai-logging-patch
  2. Add the import to your main crewai file. It must appear above any other crewai imports: from logger_patch import apply_monkey_patch
  3. Call the patch function in your code, also before any other crewai imports: apply_monkey_patch()
  4. Add the following directly below each of your agent instances, substituting your agent's name for <placeholder_name>: <placeholder_name>._logger = crewai.utilities.Logger(verbose_level=<placeholder_name>.verbose)
  5. Do the same below your crew instance, again substituting your crew's name for <placeholder_name>: <placeholder_name>._logger = crewai.utilities.Logger(verbose_level=<placeholder_name>.verbose)
  6. Ensure that verbose=True is set in your agent and crew instances.
  7. Below is an example crewfile with all of this implemented:
import os
from logger_patch import apply_monkey_patch

# Apply the monkey patch
apply_monkey_patch()

# Now use crewai and other imports as usual
from crewai import Agent, Crew, Task, Process
import crewai.utilities

# Setup LM Studio environment variables
os.environ['OPENAI_API_BASE'] = 'http://localhost:1234/v1'
os.environ['OPENAI_API_KEY'] = 'sk-111111111111111111111111111111111111111111111111'
os.environ['OPENAI_MODEL_NAME'] = 'Meta-Llama-3-8B-Instruct-imatrix'

# Create the agent
try:
    researcher = Agent(
        role='Researcher',
        goal='Research the topic',
        backstory='As an expert in the field of {topic}, you will research the topic and provide the necessary information',
        max_iter=3,
        max_rpm=100,
        verbose=True,
        allow_delegation=False,
    )

    # Manually set the logger to ensure it's the patched logger
    researcher._logger = crewai.utilities.Logger(verbose_level=researcher.verbose)

    # Create the task
    research_task = Task(
        description='Research the topic',
        agent=researcher,
        expected_output='5 paragraphs of information on the topic',
        output_file='research_result.txt',
    )


    # Create the crew
    crew = Crew(
        agents=[researcher],
        tasks=[research_task],
        process=Process.sequential,
        verbose=True,
        memory=False,
        cache=False,
        max_rpm=100,
    )

    # Manually set the logger for crew to ensure it's the patched logger
    crew._logger = crewai.utilities.Logger(verbose_level=crew.verbose)

    # Start the crew
    result = crew.kickoff(inputs={'topic': '70s, 80s and 90s Australian rock bands'})

except Exception as e:
    print(f"An error occurred: {e}")
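The package's own source is not shown on this page, but the pattern it relies on is standard Python monkey patching: replace a class attribute on a module before any consumer imports it, so later instantiations pick up the patched class. A minimal self-contained sketch of that shape (DummyLogger, PatchedLogger, and the stand-in module are illustrative names, not crewai or logger_patch internals):

```python
import types


class DummyLogger:
    """Stand-in for the library logger class being patched."""

    def __init__(self, verbose_level=0):
        self.verbose_level = verbose_level

    def log(self, level, message):
        return f"[{level}] {message}"


class PatchedLogger(DummyLogger):
    """Patched logger that prefixes every entry, mimicking enhanced output."""

    def log(self, level, message):
        return "PATCHED " + super().log(level, message)


def apply_monkey_patch(module):
    # Rebind the module's Logger attribute. Any code that later calls
    # module.Logger(...) will construct the patched class instead --
    # which is why the real patch must run before other crewai imports.
    module.Logger = PatchedLogger


# Stand-in for a library module holding the original class.
fake_module = types.SimpleNamespace(Logger=DummyLogger)
apply_monkey_patch(fake_module)

logger = fake_module.Logger(verbose_level=True)
print(logger.log("info", "hello"))  # goes through the patched class
```

This also illustrates why steps 4 and 5 above are needed: any Logger instance created before the patch still points at the original class, so reassigning `._logger` after the fact guarantees the patched logger is in use.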

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

crewai_logging_patch-0.5.tar.gz (3.3 kB)

Uploaded Source

Built Distribution

crewai_logging_patch-0.5-py3-none-any.whl (3.8 kB)

Uploaded Python 3

File details

Details for the file crewai_logging_patch-0.5.tar.gz.

File metadata

  • Download URL: crewai_logging_patch-0.5.tar.gz
  • Upload date:
  • Size: 3.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for crewai_logging_patch-0.5.tar.gz:

  • SHA256: ebefaee304cf2f6c0d75636c81f4012bf5e6fe59dc44d4bc6d5f45f7b1cc33eb
  • MD5: 198f3ec5f2c3dd1a62bccccc5812d3cc
  • BLAKE2b-256: 4dc49657b2c7fcc74b00e2ec7fcf29cc80308035a697f4f5aa7025b7e3469b4e

See more details on using hashes here.
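After downloading, the published digests above can be checked locally. A short sketch using Python's standard hashlib (the filename and expected digest are the ones listed on this page):

```python
import hashlib


def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA256 hex digest of a file, reading it in chunks
    so large archives don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the digest published for the source distribution:
# expected = "ebefaee304cf2f6c0d75636c81f4012bf5e6fe59dc44d4bc6d5f45f7b1cc33eb"
# assert sha256_of_file("crewai_logging_patch-0.5.tar.gz") == expected
```

pip can also enforce this automatically with hash-checking mode (`pip install --require-hashes -r requirements.txt`).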

File details

Details for the file crewai_logging_patch-0.5-py3-none-any.whl.

File metadata

File hashes

Hashes for crewai_logging_patch-0.5-py3-none-any.whl:

  • SHA256: be9a37f478ea50613baeb224722008f8c19c6ab1241e6d4ca3ca6d0fbe224224
  • MD5: ec5f9325dba579f5029032230e371f8b
  • BLAKE2b-256: 4d0649746cc0930efbb9c16673e6195e082800c3ebe0df209383f862135fa41e

See more details on using hashes here.
