Project description
ToxicityClassificator
Module for classifying text toxicity (English and Russian)
Usage example
```python
from toxicityclassifier import *

classifier = ToxicityClassificator()
text = "Some text to check"

print(classifier.predict(text))          # (0 or 1, probability)
print(classifier.get_probability(text))  # probability
print(classifier.classify(text))         # 0 or 1
```
Weights
Classification threshold: if the predicted probability >= weight, the label is 1, otherwise 0.

```python
classifier.weight = 0.5
```
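The threshold rule can be sketched as a small standalone function. This is an illustration of the behavior described above, not the library's actual implementation:

```python
def classify(probability: float, weight: float = 0.5) -> int:
    """Map a toxicity probability to a 0/1 label using a threshold.

    Mirrors the documented rule: probability >= weight => 1, else 0.
    """
    return 1 if probability >= weight else 0

print(classify(0.7))        # 1 (toxic)
print(classify(0.3))        # 0 (non-toxic)
print(classify(0.5, 0.5))   # 1 (the boundary counts as toxic)
```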
Threshold for language detection (English or Russian): if the share of Russian text >= language_weight, the Russian model is used; otherwise the English one.

```python
classifier.language_weight = 0.5
```
Hashes for toxicityclassifier-0.1.9-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 2f405bd1c2a9e091046f345b4b0383ecabad4963d2a162954327a345ca326dba
MD5 | 63cd95f0af0467f6202ac2391703d7b4
BLAKE2b-256 | eab8b8693d288713efa3acb89752f7bed271e572a3c2dec24a78355db60b9da4