
Leveraging Artificial Intelligence for Skills Extraction and Research


[!CAUTION]

LAiSER is currently in development; features may be experimental. Use with caution!

Leveraging Artificial Intelligence for Skill Extraction & Research (LAiSER)


LAiSER is a tool that helps learners, educators, and employers share trusted and mutually intelligible information about skills.

About

LAiSER is an innovative tool that harnesses the power of artificial intelligence to simplify the extraction and analysis of skills. It is designed for learners, educators, and employers who want to gain reliable insights into skill sets, ensuring that the information shared is both trusted and mutually intelligible across various sectors.

By leveraging state-of-the-art AI models, LAiSER automates the process of identifying and classifying skills from diverse data sources. This not only saves time but also enhances accuracy, making it easier for users to discover emerging trends and in-demand skills.

The tool emphasizes standardization and transparency, offering a common framework that bridges the communication gap between different stakeholders. With LAiSER, educators can better align their teaching methods with industry requirements, and employers can more effectively identify the competencies required for their teams. The result is a more efficient and strategic approach to skill development, benefiting the entire ecosystem.

Requirements

  • Python 3.9 or later.
  • A GPU with at least 15 GB of video memory is essential for running this tool on large datasets.

Setup and Installation

  • Install LAiSER using pip:

    For GPU support (recommended if you have a CUDA-capable GPU):

    pip install laiser[gpu]
    

    For CPU-only environments:

    pip install laiser[cpu]
    

    By default, the GPU builds of the torch and vllm dependencies are included. Only when using the [cpu] extra are these GPU dependencies excluded.

NOTE: Python 3.9 or later (preferably 3.12) is expected to be installed on your system. If you don't have Python installed, you can download it from python.org.
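As a quick sanity check, you can verify your interpreter version from Python itself (a small illustrative snippet, not part of LAiSER):

```python
import sys

# LAiSER expects Python 3.9+; 3.12 is preferred.
if sys.version_info < (3, 9):
    raise RuntimeError(f"Python 3.9+ required, found {sys.version.split()[0]}")
print(f"Python {sys.version.split()[0]} is supported.")
```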

You can check if your machine has a GPU available with:

python -c "import torch; print(torch.cuda.is_available())"
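If torch is not installed yet, the one-liner above raises ImportError. A slightly more defensive variant (an illustrative sketch, assuming only that torch exposes `torch.cuda.is_available()`):

```python
def gpu_available():
    """Return True/False for CUDA availability, or None if torch is missing."""
    try:
        import torch
    except ImportError:
        # torch not installed; run pip install laiser[gpu] or laiser[cpu] first
        return None
    return torch.cuda.is_available()

status = gpu_available()
if status is None:
    print("torch is not installed yet.")
else:
    print(f"GPU available: {status}")
```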

Usage

As of now, LAiSER can be used as a Python package in Google Colab or on a local machine with GPU access. The steps to set up the tool are as follows:

Google Colab Setup

LAiSER's Jupyter notebook is currently the fastest way to get started with the tool. You can access the notebook here.

  • Once the notebook is imported into Google Colaboratory, connect to a GPU-accelerated runtime (e.g., a T4 GPU) and run the cells in the notebook.

  • Sample code to import and verify the laiser module:

    Using the new refactored API (recommended):

    from laiser.skill_extractor_refactored import SkillExtractorRefactored
    print('\n\nInitializing the Skill Extractor...')
    # Replace 'your_model_id' and 'your_hf_token' with your actual credentials.
    model_id = "your_model_id"  # e.g., "microsoft/DialoGPT-medium"
    hf_token = "your_hf_token"
    use_gpu = True  # Change to False if you are not using a GPU
    se = SkillExtractorRefactored(model_id=model_id, hf_token=hf_token, use_gpu=use_gpu)
    print('The Skill Extractor has been initialized successfully!\n')
    print("LAiSER package loaded successfully!")
    

    Legacy API (backward compatibility):

    from laiser.skill_extractor import Skill_Extractor
    print('\n\nInitializing the Skill Extractor...')
    # Replace 'your_model_id' and 'your_hf_token' with your actual credentials.
    AI_MODEL_ID = "your_model_id"  # e.g., "bert-base-uncased"
    HF_TOKEN = "your_hf_token"
    use_gpu = True  # Change to False if you are not using a GPU
    se = Skill_Extractor(AI_MODEL_ID=AI_MODEL_ID, HF_TOKEN=HF_TOKEN, use_gpu=use_gpu)
    print('The Skill Extractor has been initialized successfully!\n')
    print("LAiSER package loaded successfully!")
    

Funding

Authors

Partners


