Simplify and improve your job hunting experience by integrating LLMs to automate tasks such as resume and cover letter generation and application submission, saving you time and effort.

Project description

Streamlining Job Applications with LLM Automation Pipeline

Auto Job Apply Pipeline

Project source can be found at:

All other known bugs and fixes can be sent to the following email addresses with the subject "[BUG] JOB LLM". Reported bugs/fixes will be reviewed for correction.

Author & Contributor List

1. Introduction

1.1. Motivation: LLMs as Components in an ML Pipeline

Solving a task with machine learning involves a series of steps that often demand large amounts of human effort, and the work does not end once the model is trained: evaluation, explaining the model's behavior, interpreting its outputs, and similar steps are frequently just as labor intensive.

In this project, we investigate how to use Large Language Models (LLMs) effectively to automate various aspects of such a pipeline.

1.2. Our Proposal

We aim to build an automated system that makes applying for jobs far less tedious. Job hunting involves many stages, and we see an opportunity to automate several of them by integrating Large Language Models (LLMs) into the process. We explore a range of approaches, both conventional and novel, for plugging LLMs into the job application workflow, with the goal of minimizing how much work the user has to do and letting the LLM handle the rest.

1.3. Refer to the Project Report for more details.

2. Setup, Installation and Usage

2.1. Prerequisites

  • OS: Linux, macOS
  • Python: 3.10.12 or above
  • OpenAI API key:
    • Store it in an environment variable called OPENAI_API_KEY.
    • Or pass it as a class argument (a minimal sketch combining both options follows this list), e.g.
      from zlm import AutoApplyModel
      job_llm = AutoApplyModel("YOUR_OPENAI_API_KEY_HERE")
      
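As a minimal sketch combining the two options above (only the AutoApplyModel constructor shown above is assumed; the fallback placeholder is hypothetical), the key can be read from the environment and passed explicitly only when the variable is not set:

import os
from zlm import AutoApplyModel

# Prefer the OPENAI_API_KEY environment variable; fall back to a placeholder.
# Replace "YOUR_OPENAI_API_KEY_HERE" with a real key if the variable is unset.
api_key = os.environ.get("OPENAI_API_KEY", "YOUR_OPENAI_API_KEY_HERE")
job_llm = AutoApplyModel(api_key)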

2.2. Package Installation

pip install zlm
  • Usage
from zlm import AutoApplyModel

job_llm = AutoApplyModel("OPENAI_API_KEY")
job_llm.resume_cv_pipeline("JOB_LINK_YOU_WANT_APPLY_FOR") # Returns and downloads a curated resume and cover letter.
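As an illustrative sketch built only on the two calls shown above (the job_links list and the loop are hypothetical, not part of the package), several postings can be processed in one run:

from zlm import AutoApplyModel

job_llm = AutoApplyModel("OPENAI_API_KEY")

# Hypothetical list of postings to tailor documents for.
job_links = [
    "https://example.com/jobs/ml-engineer",
    "https://example.com/jobs/data-scientist",
]

for link in job_links:
    # Generates and downloads a curated resume and cover letter for each posting.
    job_llm.resume_cv_pipeline(link)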

2.3. Setup as a Project

  • Create and activate a Python environment (using python -m venv .env, conda, or similar) to avoid package dependency conflicts.
  • Install the Poetry package (a dependency management and packaging tool):
    pip install poetry
    
  • Install all required packages.
    • Refer to pyproject.toml or poetry.lock for the list of packages.
      poetry install
      
      OR
    • We recommend using Poetry; if the above command does not work, a requirements.txt file is also provided.
      pip install -r resources/requirements.txt
      
  • The following packages are also needed to convert LaTeX to PDF (a quick availability check is sketched after this list):
    • For Linux
      sudo apt-get install texlive-latex-base texlive-fonts-recommended texlive-fonts-extra
      
      NOTE: run sudo apt-get update if the terminal is unable to locate a package.
    • For macOS
      brew install basictex
      sudo tlmgr install enumitem fontawesome
      
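After installing the packages above, a quick way to confirm a LaTeX engine is available is to look for it on the PATH. This is a small sketch under the assumption that pdflatex is the binary those packages provide:

import shutil

# Check that a LaTeX engine is on PATH after the installs above.
if shutil.which("pdflatex") is None:
    raise SystemExit("pdflatex not found; install the TeX packages listed above.")
print("pdflatex found at:", shutil.which("pdflatex"))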

2.4. Run Code

# Pass the api_key that matches the chosen provider.
python main.py \
    --url "JOB_POSTING_URL" \
    --master_data="JSON_USER_MASTER_DATA" \
    --api_key="YOUR_LLM_PROVIDER_API_KEY" \
    --downloads_dir="DOWNLOAD_LOCATION_FOR_RESUME_CV" \
    --provider="openai" # openai, gemini, together, g4f
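The same command can be driven from Python, for example when batching runs. This sketch relies only on the flags shown above; the placeholder values must be replaced with real ones:

import subprocess

# Placeholder values; substitute a real posting URL, master data, API key, and paths.
cmd = [
    "python", "main.py",
    "--url", "JOB_POSTING_URL",
    "--master_data=JSON_USER_MASTER_DATA",
    "--api_key=YOUR_LLM_PROVIDER_API_KEY",
    "--downloads_dir=DOWNLOAD_LOCATION_FOR_RESUME_CV",
    "--provider=openai",
]
subprocess.run(cmd, check=True)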

3. References

4. Limitations and Further Growth

  • Evaluation of generated resumes; candidate metrics (a sketch of such an overlap metric follows this list):
    • Content preservation: overlap between keywords in the generated resume and the master data.
    • Job fit: overlap between the generated resume and the job description.
  • Streamlit app development
  • When shipping as a package, provide options for:
    • Passing OPENAI_API_KEY
    • Choosing the download folder or path
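As a hedged sketch of the keyword-overlap idea (the tokenization and the Jaccard score are illustrative choices, not the project's implementation):

import re

def keyword_overlap(text_a: str, text_b: str) -> float:
    """Jaccard overlap between the keyword sets of two texts."""
    tokens_a = set(re.findall(r"[a-z0-9+#.]+", text_a.lower()))
    tokens_b = set(re.findall(r"[a-z0-9+#.]+", text_b.lower()))
    if not tokens_a or not tokens_b:
        return 0.0
    return len(tokens_a & tokens_b) / len(tokens_a | tokens_b)

# Content preservation: generated resume vs. master data.
# Job fit: generated resume vs. job description.
print(keyword_overlap("Python, LLM pipelines, Streamlit", "Python developer with LLM experience"))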


