
BERT model fine-tuned on Chilean STEM lessons

Project description

BERT-STEM

BERT model fine-tuned on Chilean Science, Technology, Engineering, and Mathematics (STEM) lessons.

Install:

To install with pip:

pip install bertstem
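
A quick way to check that the installation worked is to import the package, mirroring the import used in the Quickstart below:

# Sanity check: this import should succeed after installing bertstem.
from BERT_STEM.BertSTEM import BertSTEM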

Quickstart

To encode sentences:

from BERT_STEM.BertSTEM import *
import pandas as pd

bert = BertSTEM()

# Example dataframe with text in Spanish
data = {'col_1': [3, 2, 1],
        'col_2': ['hola como estan', 'alumnos queridos', 'vamos a hablar de matematicas']}

df = pd.DataFrame.from_dict(data)

# Encode sentences using BertSTEM:
bert._encode_df(df, column='col_2', encoding='sum')

To classify sentences with COPUS (Classroom Observation Protocol for Undergraduate STEM) models:

from BERT_STEM.BertSTEM import *
import pandas as pd

# Download BERT for classification (guiding/presenting/administration)
bert_classification = BertSTEMForTextClassification(2, model_name='pablouribe/bertstem-copus-guiding')

# Example dataframe with text in Spanish
data = {'col_1': [3, 2, 1],
        'col_2': ['hola como estan', 'alumnos queridos', 'vamos a hablar de matematicas']}

df = pd.DataFrame.from_dict(data)

# Classify sentences using BertSTEM for COPUS (Guiding):
bert_classification.predict(df, 'col_2')
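
The COPUS checkpoints are also hosted on the HuggingFace Hub, so they can in principle be called through the plain transformers API. The snippet below is a minimal sketch under two assumptions: that the checkpoint loads as a standard sequence-classification model, and that it pairs with the same dccuchile Spanish tokenizer used in the encoder example below.

import torch
import transformers

# Assumption: the COPUS checkpoint exposes a standard sequence-classification head.
model = transformers.AutoModelForSequenceClassification.from_pretrained("pablouribe/bertstem-copus-guiding")

# Assumption: the classifier uses the same Spanish tokenizer as the encoder example below.
tokenizer = transformers.BertTokenizerFast.from_pretrained("dccuchile/bert-base-spanish-wwm-uncased",
                                                           do_lower_case=True)

inputs = tokenizer("vamos a hablar de matematicas", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Index of the highest-scoring class (e.g. guiding vs. not guiding).
predicted_class = logits.argmax(dim=-1).item()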

To use the model directly from HuggingFace:

from BERT_STEM.Encode import *
import pandas as pd
import transformers

# Download the Spanish BERT-STEM model:
model = transformers.BertModel.from_pretrained("pablouribe/bertstem")

# Download the Spanish tokenizer:
tokenizer = transformers.BertTokenizerFast.from_pretrained("dccuchile/bert-base-spanish-wwm-uncased",
                                                            do_lower_case=True, 
                                                            add_special_tokens = False)

# Example dataframe with text in Spanish
data = {'col_1': [3, 2, 1], 
        'col_2': ['hola como estan', 'alumnos queridos', 'vamos a hablar de matematicas']}
        
df = pd.DataFrame.from_dict(data)

# Encode sentences using BertSTEM:
sentence_encoder(df, model, tokenizer, column='col_2', encoding='sum')
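
For reference, the 'sum' encoding corresponds conceptually to summing the token vectors of BERT's last hidden layer. The snippet below sketches that idea with the plain transformers API, reusing the model, tokenizer, and df defined above; it illustrates the pooling step and is not necessarily identical to what sentence_encoder does internally.

import torch

# Sketch of sum pooling over the last hidden layer (assumption: this mirrors the
# 'sum' encoding option; sentence_encoder may differ in its exact details).
inputs = tokenizer(list(df['col_2']), padding=True, return_tensors='pt')
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state    # (batch, tokens, hidden_size)

# Zero out padding positions so padded tokens do not contribute to the sum.
mask = inputs['attention_mask'].unsqueeze(-1)
sentence_vectors = (hidden * mask).sum(dim=1)     # (batch, hidden_size)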

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bertstem-0.0.33.tar.gz (7.1 kB)

Uploaded Source

Built Distribution

bertstem-0.0.33-py3-none-any.whl (7.7 kB)

Uploaded Python 3

File details

Details for the file bertstem-0.0.33.tar.gz.

File metadata

  • Download URL: bertstem-0.0.33.tar.gz
  • Upload date:
  • Size: 7.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/3.10.0 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.5

File hashes

Hashes for bertstem-0.0.33.tar.gz

  • SHA256: aacc5159cd20d1f137ffd7553381268462001915cba9bf099713784995045f41
  • MD5: 12247f399dc48cfdd362df25dc78f6c1
  • BLAKE2b-256: e14485fa556d35d908304190dcf2fc5249f07f25d3d48570ce7fdf6529c29b36


File details

Details for the file bertstem-0.0.33-py3-none-any.whl.

File metadata

  • Download URL: bertstem-0.0.33-py3-none-any.whl
  • Upload date:
  • Size: 7.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/3.10.0 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.5

File hashes

Hashes for bertstem-0.0.33-py3-none-any.whl

  • SHA256: 98bf63d09ca126763da0d5a709b4cda1018173d2a25f2093b2ab9764a41483cf
  • MD5: 458df06fc5fc30101bc4e48372b1b02f
  • BLAKE2b-256: c6c12053cf4f5fff3f58debba32481ba262beac592e2997ef3aa28ee56c02faf

