
Simplify and improve your job-hunting experience by integrating LLMs to automate tasks such as resume and cover letter generation and application submission, saving users time and effort.

Project description

ResumeFlow: An LLM-facilitated Pipeline for Personalized Resume Generation and Refinement

Auto Job Aligned Personalized Resume Generation Pipeline

Project source: https://github.com/Ztrimus/job-llm

All other bug reports and fixes can be sent to the following emails with the subject "[BUG] JOB LLM". Reported bugs/fixes will be reviewed and incorporated.

Author & Contributor List

1. Introduction:

1.1. Motivation: LLMs as Components in an ML Pipeline

Solving a task using machine learning methods requires a series of steps that often demand large amounts of human effort or labor. Furthermore, there may be additional steps after training the ML model, such as evaluating it, explaining its behavior, and interpreting its outputs; many of these steps are also labor-intensive. In this project, we investigate how to effectively use Large Language Models (LLMs) to automate various aspects of this pipeline.

1.2. Our Proposal

We aim to create an automated system that makes applying for jobs a breeze. Job hunting has many stages, and we see an opportunity to automate them by integrating LLMs (Large Language Models) to make the process even smoother. We are exploring different approaches, both conventional and novel, to integrate LLMs into the job application process. The goal is to reduce how much the user has to do and let the LLM handle the rest, making the whole process easier.

1.3. Refer to the Project Report for more details.

2. Setup, Installation and Usage

2.1. Prerequisites

2.2. Package Installation - Use as Library

pip install zlm
  • Usage
from zlm import AutoApplyModel

job_llm = AutoApplyModel(
    api_key="PROVIDE_API_KEY",
    provider="ENTER PROVIDER <gemini> or <openai>",
    downloads_dir="[optional] FOLDER PATH WHERE FILES GET DOWNLOADED; defaults to the 'downloads' folder"
)

job_llm.resume_cv_pipeline(
    "ENTER_JOB_URL",
    "YOUR_MASTER_RESUME_DATA"  # .pdf or .json
)  # Returns and downloads the curated resume and cover letter.
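The master resume data can be supplied as a .pdf or .json file. Below is a minimal sketch of preparing a JSON master-data file; the field names are illustrative assumptions for this example, not a schema required by zlm (check the project's docs for the real format):

```python
import json
from pathlib import Path

# Illustrative master-data fields; the actual schema expected by zlm may differ.
master_data = {
    "name": "Jane Doe",
    "summary": "ML engineer with 5 years of experience.",
    "skills": ["Python", "SQL", "AWS"],
    "work_experience": [
        {"title": "ML Engineer", "company": "Acme", "years": "2019-2024"}
    ],
}

# Write the data to disk; this path can then be passed to resume_cv_pipeline.
path = Path("master_resume.json")
path.write_text(json.dumps(master_data, indent=2))

loaded = json.loads(path.read_text())
```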

2.3. Setup & Run Code - Use as Project

git clone https://github.com/Ztrimus/job-llm.git
cd job-llm
  1. Create and activate a Python environment (using python -m venv .env, conda, etc.) to avoid package dependency conflicts.
  2. Install Poetry (a dependency management and packaging tool)
    pip install poetry
    
  3. Install all required packages.
    • Refer to pyproject.toml or poetry.lock for the list of packages.
      poetry install
      
      OR
    • If the above command does not work, we also provide a requirements.txt file, but we recommend using Poetry.
      pip install -r resources/requirements.txt
      
  4. We also need to install the following packages to convert LaTeX to PDF
    • For linux
      sudo apt-get install texlive-latex-base texlive-fonts-recommended texlive-fonts-extra
      
      NOTE: try sudo apt-get update if the terminal is unable to locate the packages.
    • For Mac
      brew install basictex
      sudo tlmgr install enumitem fontawesome
      
  5. Run the following script to get the result
python main.py \
    --url "JOB_POSTING_URL" \
    --master_data="JSON_USER_MASTER_DATA" \
    --api_key="YOUR_LLM_PROVIDER_API_KEY" \
    --downloads_dir="DOWNLOAD_LOCATION_FOR_RESUME_CV" \
    --provider="openai"  # openai, gemini, together, or g4f; the api_key must match the chosen provider
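The flags above map naturally onto Python's argparse. The following is a minimal sketch of how such a CLI interface could be parsed; it mirrors the documented flags but is not necessarily how main.py is actually implemented (defaults and help strings here are assumptions):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the documented main.py flags; defaults are illustrative.
    parser = argparse.ArgumentParser(
        description="Generate a job-aligned resume and cover letter."
    )
    parser.add_argument("--url", required=True, help="Job posting URL")
    parser.add_argument("--master_data", required=True,
                        help="Path to master resume data (.pdf or .json)")
    parser.add_argument("--api_key", required=True,
                        help="API key matching the chosen LLM provider")
    parser.add_argument("--downloads_dir", default="downloads",
                        help="Folder where generated files are saved")
    parser.add_argument("--provider", default="openai",
                        choices=["openai", "gemini", "together", "g4f"])
    return parser

# Parse a sample command line (values are placeholders).
args = build_parser().parse_args([
    "--url", "https://example.com/job/123",
    "--master_data", "master_resume.json",
    "--api_key", "YOUR_API_KEY",
    "--provider", "gemini",
])
```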

3. References

4. Limitations and Further Growth

  • Evaluation of generated resumes; possible metrics:
    • Content preservation: overlap between keywords in the generated resume and the master data.
    • Job fit: overlap between the generated resume and the job description.
  • Streamlit app development.
  • When shipped as a package, provide options for:
    • Passing the OPENAI_API_KEY.
    • Choosing the download folder or path.
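The two overlap metrics sketched above can be prototyped with plain set operations. The following is a minimal illustration; the tokenizer and stop-word list are simplifying assumptions for this sketch, not the project's actual evaluation code:

```python
import re

STOP_WORDS = {"a", "an", "and", "the", "of", "to", "in", "for", "with", "on"}

def keywords(text: str) -> set[str]:
    # Lowercase word tokens, minus a tiny stop-word list.
    return {w for w in re.findall(r"[a-z0-9+#]+", text.lower())
            if w not in STOP_WORDS}

def overlap(a: str, b: str) -> float:
    # Fraction of a's keywords that also appear in b (0.0 when a has none).
    ka, kb = keywords(a), keywords(b)
    return len(ka & kb) / len(ka) if ka else 0.0

master = "Python, SQL, and machine learning experience with AWS deployment"
resume = "Built machine learning pipelines in Python; deployed models on AWS"
job    = "Seeking Python engineer with AWS and SQL skills"

content_preservation = overlap(resume, master)  # resume keywords found in master data
job_fit = overlap(job, resume)                  # job keywords covered by the resume
```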

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

zlm-0.1.29.tar.gz (141.0 kB)

Uploaded Source

Built Distribution

zlm-0.1.29-py3-none-any.whl (146.0 kB)

Uploaded Python 3

File details

Details for the file zlm-0.1.29.tar.gz.

File metadata

  • Download URL: zlm-0.1.29.tar.gz
  • Upload date:
  • Size: 141.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.12.1 Darwin/23.2.0

File hashes

Hashes for zlm-0.1.29.tar.gz
  • SHA256: 1e733e9c08db27af011efdf49e9bdac13c6ef29764012d583d295cef6c20284a
  • MD5: 600a829c6e4e821e4cdc432e25a3ca6b
  • BLAKE2b-256: 5ad863f952beba38bca6422f693bfaf9ccb6e678df07459fb73a99059888ccaa

See more details on using hashes here.

File details

Details for the file zlm-0.1.29-py3-none-any.whl.

File metadata

  • Download URL: zlm-0.1.29-py3-none-any.whl
  • Upload date:
  • Size: 146.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.1 CPython/3.12.1 Darwin/23.2.0

File hashes

Hashes for zlm-0.1.29-py3-none-any.whl
  • SHA256: 0bb4216297fa047937361ce9427b412060281165df0c4a9aa684ecfa8b9d5632
  • MD5: f859da4a3262cc19500f42120a56ee0f
  • BLAKE2b-256: 1334b0cf51a5940143af5685672458b6d04a3684f204077ff9da9f37373d9e14

See more details on using hashes here.
