
OnlineJobs.ph CLI

A command-line tool to automate job applications and scraping on OnlineJobs.ph.

Features

  • Login: Authenticate with OnlineJobs.ph and export session cookies
  • Apply: Automatically apply to job postings with custom messages and contact info
  • Jobs: Search and scrape job listings with descriptions
  • Proxy Support: Route requests through HTTP/HTTPS proxies (with optional authentication)

Installation

Requirements

  • Python 3.11+

Option 1: Install from PyPI (Recommended)

pip install olj-cli

This will install the olj-cli command-line tool globally.

Option 2: Clone and Install from Source

git clone https://github.com/Kuugang/olj-cli.git
cd olj-cli
pip install -e .

This installs the package in editable (development) mode, so local source changes take effect without reinstalling.

Example Usage

1. Login — Get Session Cookies

Authenticate and save your session cookies for use in other commands.

COOKIES=$(olj-cli login --email you@example.com --password secret)

This prints the cookies as JSON to stdout, which you can store in the COOKIES variable.
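The exact shape of the cookies JSON is not documented here; assuming it is an object mapping cookie names to values, you could inspect it with a few lines of Python (the cookie names below are illustrative, not guaranteed to match what the tool emits):

```python
import json

# Example output captured from `olj-cli login`; names/values are made up.
cookies_json = '{"PHPSESSID": "abc123", "remember_me": "1"}'

cookies = json.loads(cookies_json)
print(sorted(cookies))  # list the cookie names returned by login
```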

2. Apply to a Job

Submit an application to a specific job posting.

olj-cli apply \
  --cookies "$COOKIES" \
  --job-url "https://www.onlinejobs.ph/jobseekers/job/1604447" \
  --subject "Applying for Senior Developer" \
  --message "I would like to apply, thank you." \
  --contact-info "Email: you@example.com | GitHub: yourhandle"

Parameters:

  • --cookies: JSON cookies string from the login command
  • --job-url: Full URL of the job posting
  • --subject: Email subject line
  • --message: Email message body
  • --contact-info: Your contact information
  • --apply-points (optional): Points to spend (default: 1)
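If you script applications (for example, over URLs collected by the jobs command), you can assemble the argument list in Python before handing it to `subprocess.run`; a minimal sketch, with placeholder values:

```python
# Sketch: build the argument list for `olj-cli apply` programmatically.
# The job URL and message texts below are placeholders.
def build_apply_cmd(cookies: str, job_url: str, subject: str,
                    message: str, contact_info: str,
                    apply_points: int = 1) -> list[str]:
    return [
        "olj-cli", "apply",
        "--cookies", cookies,
        "--job-url", job_url,
        "--subject", subject,
        "--message", message,
        "--contact-info", contact_info,
        "--apply-points", str(apply_points),
    ]

cmd = build_apply_cmd("{}", "https://www.onlinejobs.ph/jobseekers/job/1604447",
                      "Applying for Senior Developer",
                      "I would like to apply, thank you.",
                      "Email: you@example.com")
# Execute with: subprocess.run(cmd, check=True)
```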

3. Scrape Jobs

Search and scrape job listings with full descriptions.

olj-cli jobs --filter "python developer" --pages 3

Parameters:

  • --filter (optional): Keyword filter for search
  • --pages (optional): Number of pages to scrape (if not specified, scrapes until no jobs found)

Output: JSON array of jobs with url, title, posted_by, posted_on, rate, and description
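Because the output is a JSON array with those fields, it is easy to post-process in Python. A sketch that filters scraped jobs by a keyword in the description (the sample records are made up to match the documented shape):

```python
import json

# Sample of the documented output shape (values are made up)
raw = '''[
  {"url": "https://www.onlinejobs.ph/jobseekers/job/1", "title": "Python Developer",
   "posted_by": "Acme", "posted_on": "2024-01-01", "rate": "$10/hr",
   "description": "Build APIs with Django and Postgres."},
  {"url": "https://www.onlinejobs.ph/jobseekers/job/2", "title": "VA",
   "posted_by": "Beta", "posted_on": "2024-01-02", "rate": "$5/hr",
   "description": "General admin tasks."}
]'''

jobs = json.loads(raw)
# Keep only jobs whose description mentions Django
matches = [j["url"] for j in jobs if "django" in j["description"].lower()]
print(matches)
```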

Commands

All commands support the global --proxy option (passed before the subcommand) for routing requests through proxies.

login

Authenticate with OnlineJobs.ph and output session cookies as JSON.

olj-cli [--proxy <proxy>] login --email <email> --password <password>

Environment Variables:

  • OLJ_EMAIL: Account email (alternative to --email)
  • OLJ_PASSWORD: Account password (alternative to --password)

apply

Apply to a job posting using an authenticated session.

olj-cli [--proxy <proxy>] apply --cookies <JSON> --job-url <url> --subject <subject> --message <message> --contact-info <info>

jobs

Search and scrape job listings.

olj-cli [--proxy <proxy>] jobs [--filter <keyword>] [--pages <number>]

Proxy Configuration

All commands support HTTP/HTTPS proxies. Pass the --proxy argument in one of two formats:

  • Without authentication: host:port
  • With authentication: host:port:username:password
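Internally, a `host:port[:username:password]` string has to be turned into a proxy URL that an HTTP client understands. A sketch of that conversion (the `http://` scheme is an assumption, not confirmed by this document):

```python
def proxy_to_url(spec: str) -> str:
    """Convert host:port or host:port:user:pass into a proxy URL."""
    parts = spec.split(":")
    if len(parts) == 2:
        host, port = parts
        return f"http://{host}:{port}"
    if len(parts) == 4:
        host, port, user, password = parts
        return f"http://{user}:{password}@{host}:{port}"
    raise ValueError(f"unrecognized proxy spec: {spec!r}")

print(proxy_to_url("proxy.example.com:8080"))
print(proxy_to_url("10.0.0.1:3128:admin:password123"))
```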

Examples

# Login through proxy (no auth)
olj-cli --proxy "proxy.example.com:8080" login --email you@example.com --password secret

# Apply to job through authenticated proxy
olj-cli --proxy "proxy.example.com:8080:user:pass" apply \
  --cookies "$COOKIES" \
  --job-url "https://www.onlinejobs.ph/jobseekers/job/1604447" \
  --subject "Applying for Senior Developer" \
  --message "I would like to apply, thank you." \
  --contact-info "Email: you@example.com | GitHub: yourhandle"

# Scrape jobs through proxy
olj-cli --proxy "10.0.0.1:3128:admin:password123" jobs --filter "python" --pages 2

Debug

Enable debug logging for any command:

olj-cli --debug jobs --filter "react"

Example Workflow

# 1. Search for jobs (no authentication needed)
olj-cli jobs --filter "python developer" --pages 3

# 2. Login to get cookies (if you want to apply)
COOKIES=$(olj-cli login --email you@example.com --password secret)

# 3. Apply to a specific job
olj-cli apply \
  --cookies "$COOKIES" \
  --job-url "https://www.onlinejobs.ph/jobseekers/job/1604447" \
  --subject "Applying for Senior Developer" \
  --message "I would like to apply, thank you." \
  --contact-info "Email: you@example.com | GitHub: yourhandle"

How It Works

Login Flow

  1. Fetches the login page to extract CSRF token
  2. Submits credentials to authenticate endpoint
  3. Stores session cookies for subsequent requests
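Step 1 amounts to pulling a hidden token out of the login page's HTML. A sketch using only the standard library (the `_token` field name and the markup are hypothetical, for illustration only):

```python
import re

# Hypothetical login-page fragment; the real field name and markup may differ.
html = '<form><input type="hidden" name="_token" value="abc123xyz"></form>'

# Extract the value of the hidden CSRF field
match = re.search(r'name="_token"\s+value="([^"]+)"', html)
csrf_token = match.group(1) if match else None
print(csrf_token)
```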

Apply Flow

  1. Fetches the job posting page
  2. Extracts CSRF token, job ID, and other metadata
  3. Fetches the application form
  4. Submits the application with subject, message, and contact info

Jobs Scraping

  1. Fetches job listing pages with optional keyword filter
  2. Parses job cards to extract title, URL, poster, and date
  3. Fetches each job's detail page to extract full description
  4. Returns complete job data as JSON
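The "scrape until no jobs found" behavior (when --pages is omitted) can be sketched as a loop that stops at the first empty page; `fetch_page` below is a stand-in for the real HTTP fetch:

```python
def scrape_all(fetch_page):
    """Collect jobs page by page until a page comes back empty."""
    jobs, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        jobs.extend(batch)
        page += 1
    return jobs

# Fake fetcher: two pages of results, then nothing.
pages = {1: ["job-a", "job-b"], 2: ["job-c"]}
print(scrape_all(lambda p: pages.get(p, [])))
```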
