
Leveraging Artificial Intelligence for Skills Extraction and Research


[!CAUTION]

LAiSER is currently under active development and its features may be experimental. Use with caution!

Leveraging Artificial Intelligence for Skill Extraction & Research (LAiSER)


LAiSER is a tool that helps learners, educators, and employers share trusted and mutually intelligible information about skills.

About

LAiSER is an innovative tool that harnesses the power of artificial intelligence to simplify the extraction and analysis of skills. It is designed for learners, educators, and employers who want to gain reliable insights into skill sets, ensuring that the information shared is both trusted and mutually intelligible across various sectors.

By leveraging state-of-the-art AI models, LAiSER automates the process of identifying and classifying skills from diverse data sources. This not only saves time but also enhances accuracy, making it easier for users to discover emerging trends and in-demand skills.

The tool emphasizes standardization and transparency, offering a common framework that bridges the communication gap between different stakeholders. With LAiSER, educators can better align their teaching methods with industry requirements, and employers can more effectively identify the competencies required for their teams. The result is a more efficient and strategic approach to skill development, benefiting the entire ecosystem.

Requirements

  • Python 3.9 or later.
  • A GPU with at least 15 GB of video memory is essential for running this tool on large datasets.
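A quick way to confirm that your interpreter meets the version requirement, as a minimal sketch using only the standard library:

```python
import sys

def python_ok(min_version=(3, 9)):
    """Return True if the running interpreter meets LAiSER's minimum version."""
    return sys.version_info >= min_version

if python_ok():
    print(f"Python {sys.version.split()[0]} is supported.")
else:
    print("Python 3.9+ is required; please upgrade your interpreter.")
```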

Setup and Installation

  • Install LAiSER using pip:

    For GPU support (recommended if you have a CUDA-capable GPU):

    pip install laiser[gpu]
    

    For CPU-only environments:

    pip install laiser[cpu]
    

By default, the GPU dependencies torch and vllm are included; they are excluded only when you install with the [cpu] extra.
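One way to see which set of dependencies your environment ended up with is to probe for them with importlib. This is a hedged sketch: it assumes torch and vllm are the packages pulled in by the [gpu] extra, as described above.

```python
import importlib.util

def extras_installed(packages=("torch", "vllm")):
    """Map each package name to whether it is importable in this environment."""
    return {pkg: importlib.util.find_spec(pkg) is not None for pkg in packages}

print(extras_installed())
```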

NOTE: Python 3.9 or later (preferably 3.12) is expected to be installed on your system. If you don't have Python installed, you can download it from python.org.

You can check if your machine has a GPU available with:

python -c "import torch; print(torch.cuda.is_available())"
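If torch itself might not be installed yet, a more defensive variant of the same check (a sketch) avoids an ImportError:

```python
def gpu_status():
    """Report CUDA availability, or a hint if torch is not installed yet."""
    try:
        import torch
        return f"CUDA available: {torch.cuda.is_available()}"
    except ImportError:
        return "torch is not installed; run 'pip install laiser[gpu]' first."

print(gpu_status())
```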

Usage

As of now, LAiSER can be used as a Python package in Google Colab or on a local machine with GPU access. The steps to set up the tool are as follows:

Google Colab Setup

LAiSER's Jupyter notebook is currently the fastest way to get started with the tool. You can access the notebook here.

  • Once the notebook is imported into Google Colab, connect to a GPU-accelerated runtime (e.g., T4 GPU) and run the cells in the notebook.

  • Sample code to import and verify the laiser module:

    Using the new refactored API (recommended):

    from laiser.skill_extractor_refactored import SkillExtractorRefactored
    print('\n\nInitializing the Skill Extractor...')
    # Replace 'your_model_id' and 'your_hf_token' with your actual credentials.
    model_id = "your_model_id"  # e.g., "microsoft/DialoGPT-medium"
    hf_token = "your_hf_token"
    use_gpu = True  # Change to False if you are not using a GPU
    se = SkillExtractorRefactored(model_id=model_id, hf_token=hf_token, use_gpu=use_gpu)
    print('The Skill Extractor has been initialized successfully!\n')
    print("LAiSER package loaded successfully!")
    

    Legacy API (backward compatibility):

    from laiser.skill_extractor import Skill_Extractor
    print('\n\nInitializing the Skill Extractor...')
    # Replace 'your_model_id' and 'your_hf_token' with your actual credentials.
    AI_MODEL_ID = "your_model_id"  # e.g., "bert-base-uncased"
    HF_TOKEN = "your_hf_token"
    use_gpu = True  # Change to False if you are not using a GPU
    se = Skill_Extractor(AI_MODEL_ID=AI_MODEL_ID, HF_TOKEN=HF_TOKEN, use_gpu=use_gpu)
    print('The Skill Extractor has been initialized successfully!\n')
    print("LAiSER package loaded successfully!")
    

Funding

Authors

Partners

