A tool to scrape information about people in a particular location based on their job title

Project description

LinkedIn Auto Scraper

LinkedIn Auto Scraper is a command-line application designed to scrape cleaned information from LinkedIn profiles based on job titles and locations. The app offers two main commands: login and scrape. Once you've logged in using the login command, you can perform continuous scraping without the need to log in again, thanks to the authentication caching feature.

Table of Contents

Installation

  1. Ensure you have Python 3.x installed.

  2. Clone this repository:

     git clone https://github.com/your-username/linkedin-auto-scraper.git

  3. Navigate to the project directory:

     cd linkedin-auto-scraper

  4. Install the required dependencies:

     pip install -r requirements.txt

Usage

Login

To use the app, you need to log in first. This step is required for authentication purposes. Run the following command:

linkedin-auto-scraper login

This command lets you log in to your LinkedIn account. Follow the prompts to provide your LinkedIn credentials. Your authentication will be cached to allow continuous scraping without repeated logins.
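The tool's caching mechanism is internal, but the general idea of persisting a session so later runs skip the login step can be pictured with this minimal sketch. The cache location, file format, and cookie names here are illustrative assumptions, not the tool's actual implementation:

```python
import json
import pathlib
from typing import Optional

# Illustrative cache location; the real tool may store its session elsewhere.
CACHE_PATH = pathlib.Path.home() / ".linkedin_auto_scraper" / "session.json"

def save_session(cookies: dict) -> None:
    """Persist session cookies so later runs can reuse the login."""
    CACHE_PATH.parent.mkdir(parents=True, exist_ok=True)
    CACHE_PATH.write_text(json.dumps(cookies))

def load_session() -> Optional[dict]:
    """Return cached cookies from a previous login, or None if absent."""
    if CACHE_PATH.exists():
        return json.loads(CACHE_PATH.read_text())
    return None
```

On each run, the scraper would first try load_session() and only prompt for credentials when it returns None.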

Scrape

Once you're logged in, you can start scraping LinkedIn profiles based on job titles and locations. Run the following command:

linkedin-auto-scraper scrape

By default, the app will search for profiles related to the HR field. You can customize the search by using the following options:

  • --search (-s): Specify the search parameter (default: "hr").
  • --location (-l): Use it to search for profiles in a specific location (default: None).
  • --excel / --no-excel: Whether to generate an Excel file with the scraped data (default: no-excel).

Example usage:

linkedin-auto-scraper scrape --search "software engineer" --location "San Francisco" --excel

The app will scrape and display cleaned information from LinkedIn profiles matching the job title and location you supplied.
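The --excel flow amounts to collecting cleaned profile rows and writing them out as a table. This sketch illustrates that shape using the standard-library csv module as a stand-in for the Excel writer; the field names and sample rows are made up for the example:

```python
import csv

# Illustrative cleaned-profile rows; the real tool scrapes these from LinkedIn.
profiles = [
    {"name": "Jane Doe", "title": "Software Engineer", "location": "San Francisco"},
    {"name": "John Roe", "title": "HR Manager", "location": "San Francisco"},
]

def write_profiles(rows, path):
    """Write scraped profile rows to a tabular file, one profile per line."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "title", "location"])
        writer.writeheader()
        writer.writerows(rows)
```

An Excel-producing version would swap the csv writer for a spreadsheet library, keeping the same row structure.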

Contributing

Contributions are welcome! If you'd like to contribute to the project, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature/bugfix: git checkout -b feature-name
  3. Commit your changes: git commit -m "Description of changes"
  4. Push to the branch: git push origin feature-name
  5. Create a pull request.

License

This project is licensed under the MIT License.

Project details


Download files

Download the file for your platform.

Source Distribution

linkedin_auto_scraper-0.2.1.tar.gz (7.8 kB)

Uploaded Source

Built Distribution

linkedin_auto_scraper-0.2.1-py3-none-any.whl (7.8 kB)

Uploaded Python 3

File details

Details for the file linkedin_auto_scraper-0.2.1.tar.gz.

File metadata

  • Download URL: linkedin_auto_scraper-0.2.1.tar.gz
  • Upload date:
  • Size: 7.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.10.6 Linux/5.15.0-73-generic

File hashes

Hashes for linkedin_auto_scraper-0.2.1.tar.gz
Algorithm Hash digest
SHA256 58ae1e5c1c4285a3518b2c993be2a5bc91d71d17bb53667a3b721db71eef4eba
MD5 21747eb64b8e2c4e333dfcc7ca899080
BLAKE2b-256 3881d3983cd9d51d88f9899c15f2cb458c71546991e9c1315309b8de777700d9

See more details on using hashes here.

File details

Details for the file linkedin_auto_scraper-0.2.1-py3-none-any.whl.

File metadata

File hashes

Hashes for linkedin_auto_scraper-0.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 d173f3e68df3b8c54914eacd2803361fd76a71bdb900933856bffc1f3e0bfb52
MD5 249eedfd22aae9b63468e123ea1112c2
BLAKE2b-256 759188bf0ccdfd76d28890c50f03af8c328c37249ee68f30fcaad3a548be18a4

See more details on using hashes here.
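To check a downloaded archive against the digests listed above, you can compute the file's SHA-256 locally. This is a generic verification sketch using Python's standard hashlib module, not part of the tool itself:

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example: compare against the published digest from the tables above.
# expected = "58ae1e5c1c4285a3518b2c993be2a5bc91d71d17bb53667a3b721db71eef4eba"
# assert sha256_of_file("linkedin_auto_scraper-0.2.1.tar.gz") == expected
```

pip performs an equivalent check automatically when hashes are pinned in a requirements file.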
