ToxicityClassificator

Project description

ToxicityClassificator — a module for classifying the toxicity of text (English and Russian).
Usage example
from toxicityclassifier import *

classifier = ToxicityClassificator()
text = "your text here"

print(classifier.predict(text))          # (0 or 1, probability)
print(classifier.get_probability(text))  # probability
print(classifier.classify(text))         # 0 or 1
Weights
Weight for classification: if probability >= weight, the result is 1, otherwise 0.
classifier.weight = 0.5
Weight for language detection (English or Russian): if the percentage of Russian text is >= language_weight, the Russian model is used; otherwise the English one.
classifier.language_weight = 0.5
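The two thresholds above can be sketched as plain functions. This is an illustrative sketch only; `classify_from_probability` and `pick_model` are hypothetical helpers, not part of the toxicityclassifier API.

```python
# Hypothetical sketch of the thresholding logic described above.
# These helpers are illustrative and NOT part of the package's API.

def classify_from_probability(probability: float, weight: float = 0.5) -> int:
    """Return 1 (toxic) when probability >= weight, else 0."""
    return 1 if probability >= weight else 0

def pick_model(russian_share: float, language_weight: float = 0.5) -> str:
    """Choose the Russian model when the share of Russian text >= language_weight."""
    return "russian" if russian_share >= language_weight else "english"

print(classify_from_probability(0.73))  # 1
print(pick_model(0.2))                  # english
```

Raising `classifier.weight` makes the classifier stricter (fewer texts flagged as toxic); lowering it makes it more sensitive.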
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Hashes for toxicityclassifier-0.1.8-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 8d873b6b503688d8ebb629904a45ef3585510b737e8fd4e4ba33cef170ea3c6b
MD5 | fa72ebc697486de749ebbcc56e177b45
BLAKE2b-256 | 63365bc95b0e03019574fe359c3c33ca53dd39daaa930146caa09a4d337f7318