
A Multi-Agent Library to Create LinkedIn Posts from Blog Posts

Project description

Co-Agent: Multi-Agent Conversational Framework


Co-Agent is a Multi-Agent Conversational Framework that automates the creation of LinkedIn-ready posts from blog content. It combines Google Gemini for natural language processing with a multi-agent system for task delegation, producing engaging, professional, and shareable output tailored for social media.


🌟 Key Features

  • Intelligent Blog Scraping
    Extracts meaningful content from blogs using a customizable scraper.

  • AI-Driven Summarization
    Summarizes blogs into concise LinkedIn posts, maintaining professional tone and format.

  • Multi-Agent Collaboration
    Deploys a dynamic agent-based system for iterative content refinement, ensuring quality and coherence.

  • Cross-Platform Support
    Features both a Streamlit-based UI and console-based interface for accessibility and user convenience.

  • Plug-and-Play Architecture
    Easily integrates with pre-trained LLMs like Google Gemini or other APIs for future scalability.

  • Preformatted Outputs
    Produces LinkedIn-ready summaries complete with hashtags, headlines, and call-to-action links.
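To make the "preformatted outputs" idea concrete, here is a minimal sketch of what a LinkedIn-ready post object could look like. The `LinkedInPost` dataclass and its `render` helper are hypothetical illustrations, not part of the Co-Agent API:

```python
from dataclasses import dataclass, field


@dataclass
class LinkedInPost:
    """Hypothetical container for a LinkedIn-ready summary."""
    headline: str
    body: str
    hashtags: list[str] = field(default_factory=list)
    cta_link: str = ""

    def render(self) -> str:
        """Assemble headline, body, hashtags, and call-to-action into one post."""
        parts = [self.headline, "", self.body]
        if self.hashtags:
            parts += ["", " ".join(f"#{tag}" for tag in self.hashtags)]
        if self.cta_link:
            parts += ["", f"Read more: {self.cta_link}"]
        return "\n".join(parts)


post = LinkedInPost(
    headline="Viola-Jones Object Detection",
    body="A boosted cascade of Haar-like features made real-time detection possible.",
    hashtags=["ComputerVision", "MachineLearning"],
    cta_link="https://example.com/blog",
)
print(post.render())
```

A structure like this keeps the headline, body, hashtags, and call-to-action separate until the final render, which makes each piece easy to refine independently during the agents' review loop.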


🚀 Quick Start

Installation

Install Co-Agent from PyPI using the following command:

pip install coagent-framework

Usage

Console Workflow

Here’s a step-by-step example for using Co-Agent via the console:

from co_agent import AssistantAgent, UserProxyAgent, llm_config
from co_agent import scraper

# Set the Google Gemini LLM API key
llm_config["api_key"] = "Your_Google_API_Key"

# Initialize the scraper and fetch blog content
blog_scraper = scraper.BlogScraper(name="blog_scraper")
blog_scraper.scrape()

# Initialize the agents (Assistant and User Proxy)
print("Multi-Agent Chat started:")

assistant = AssistantAgent(name="assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(name="user_proxy", assistant=assistant)

# Generate a LinkedIn-ready summary from the scraped blog
summary = user_proxy.initiate_postmaking_process("blog_1")
print(summary)

Streamlit UI

To use Co-Agent with an interactive interface, visit the Streamlit app here: 👉 https://co-agent.streamlit.app/

Simply enter the blog URL in the provided field, and the app will guide you through the process of generating a LinkedIn-ready post.


🛠️ How It Works

Core Components

  1. Scraper Module
    The scraper fetches content from a user-provided blog URL, stripping unnecessary formatting while retaining essential information.

  2. Multi-Agent System

    • AssistantAgent: Responsible for generating initial summaries from blog content.
    • UserProxyAgent: Reviews and refines the summary based on user feedback and iterative collaboration.
  3. LLM Integration
    Google Gemini, or a similar LLM, is used for understanding context, generating concise summaries, and formatting content.

  4. Database Storage
    Approved summaries are stored in a database for later retrieval, ensuring content reusability.

  5. Formatter
    Formats the final summary into a LinkedIn-ready post, including a headline, body, hashtags, and call-to-action links.
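The five components above can be read as one pipeline: scrape, summarize, review, format. The sketch below is illustrative only — `scrape`, `summarize`, `review`, and `format_post` are stand-in stubs for the real Co-Agent calls, not its actual API:

```python
def scrape(url: str) -> str:
    """Stand-in for the scraper: fetch and clean blog text (stubbed here)."""
    return f"Cleaned article text from {url}"


def summarize(text: str) -> str:
    """Stand-in for the AssistantAgent's LLM-backed summarization."""
    return text[:120]  # a real implementation would call the LLM


def review(summary: str) -> str:
    """Stand-in for the UserProxyAgent's iterative refinement step."""
    return summary.strip()


def format_post(summary: str, url: str) -> str:
    """Stand-in for the formatter: body plus hashtags and a call-to-action."""
    return f"{summary}\n\n#AI #Blogging\nRead more: {url}"


def run_pipeline(url: str) -> str:
    """Scrape -> summarize -> review -> format, mirroring the component list."""
    return format_post(review(summarize(scrape(url))), url)


print(run_pipeline("https://example.com/blog"))
```

In the real framework the summarize and review steps alternate between the AssistantAgent and UserProxyAgent until the summary is approved, at which point it is stored in the database and formatted.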


📂 Directory Structure

The following is the structure of the project:

coagent_framework/
├── co_agent/
│   ├── agents.py      # Multi-agent system implementation
│   ├── scraper.py     # Blog scraping functionality
│   ├── database.py    # Database utilities
│   └── config.py      # Configuration settings
├── app.py             # Main application entry point
├── pyproject.toml     # Poetry configuration for dependencies
└── README.md          # Project documentation

📊 Example Output

Input

Blog URL: Viola-Jones Algorithm Blog

Output

Headline:
Viola-Jones Object Detection: A Revolutionary Leap in Computer Vision

Body:
Remember the early 2000s when real-time object detection felt like science fiction? That all changed thanks to the groundbreaking work of Viola and Jones! Their 2001 algorithm, a marvel of machine learning, used a boosted cascade of simple features (Haar-like features and AdaBoost) to achieve incredibly efficient object detection. This clever approach prioritized relevant features and quickly discarded irrelevant ones, making real-time face detection a reality – a true game-changer!

This algorithm's impact is still felt today. Easily accessible via OpenCV, it continues to serve as a foundational element in many computer vision applications. Want to delve deeper into the magic behind this revolutionary technique?


⚖️ License

This project is licensed under the MIT License. See the LICENSE file in our GitHub repository for more details.


Project details


Download files

Download the file for your platform.

Source Distribution

coagent_framework-0.2.2.tar.gz (8.0 kB)

Uploaded Source

Built Distribution


coagent_framework-0.2.2-py3-none-any.whl (9.6 kB)

Uploaded Python 3

File details

Details for the file coagent_framework-0.2.2.tar.gz.

File metadata

  • Download URL: coagent_framework-0.2.2.tar.gz
  • Upload date:
  • Size: 8.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.4 Windows/10

File hashes

Hashes for coagent_framework-0.2.2.tar.gz:

  • SHA256: 2b6166fd589e4adde720c5f55ac8302ab0ee3971035ff75476119063eb72333a
  • MD5: 1f1397f7014de7e2209f0bb2d2c1c577
  • BLAKE2b-256: 0e1344072e1d114a3f8da57faed98c35fc4dbfde6eed3da8f87766ba81baf461


File details

Details for the file coagent_framework-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: coagent_framework-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 9.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.4 Windows/10

File hashes

Hashes for coagent_framework-0.2.2-py3-none-any.whl:

  • SHA256: a9bf785c2005644abbaeccfa31813167a0e491477dcfea0af03eb8c0522e2814
  • MD5: 9bd4de313ffcea4617a3d52eca1e6e42
  • BLAKE2b-256: 5534c69ee94967bd15776d3f61d31f7a83e1ac93ccdac26e1514ed806482068d

