# Toxic Comment Classifier
A Python library for classifying toxic comments using deep learning.
It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.
## 📦 Installation

```bash
pip install toxic-comment-classifier
```
## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```
### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)
print(scores)
```

Example Output:

```python
{'toxic': 0.9889402985572815,
 'severe_toxic': 0.07256772369146347,
 'obscene': 0.620429277420044,
 'threat': 0.01934845559298992,
 'insult': 0.8664075136184692,
 'identity_hate': 0.04072948172688484}
```
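The per-label probabilities can be turned into discrete labels by thresholding. A minimal sketch using the example output above; the 0.5 cutoff is an illustrative choice, not a library default:

```python
# Example per-label scores, copied from the classify() output above
scores = {
    'toxic': 0.9889402985572815,
    'severe_toxic': 0.07256772369146347,
    'obscene': 0.620429277420044,
    'threat': 0.01934845559298992,
    'insult': 0.8664075136184692,
    'identity_hate': 0.04072948172688484,
}

# Keep only labels whose probability clears the cutoff
THRESHOLD = 0.5  # illustrative choice, not a library default
flagged = [label for label, p in scores.items() if p >= THRESHOLD]
print(flagged)  # ['toxic', 'obscene', 'insult']
```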
### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

Example Output:

```
Overall Toxicity Score: 0.4347
```
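For this sample, the overall score coincides with the arithmetic mean of the six per-label scores; this is an observation about the example output, not documented library behavior:

```python
# Per-label scores from the classify() example above
scores = {
    'toxic': 0.9889402985572815,
    'severe_toxic': 0.07256772369146347,
    'obscene': 0.620429277420044,
    'threat': 0.01934845559298992,
    'insult': 0.8664075136184692,
    'identity_hate': 0.04072948172688484,
}

# Averaging the six labels reproduces the overall score 0.4347
overall = sum(scores.values()) / len(scores)
print(f"Overall Toxicity Score: {overall:.4f}")  # Overall Toxicity Score: 0.4347
```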
### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate you so much!",
    "This is wonderful news.",
    "You're disgusting!",
    "Absolutely love your energy!",
    "You're the worst person ever!",
    "Have a nice day :)"
]

scores = model.predict_batch(texts)
for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

Example Output:

```
Text: I hate you so much! --> Toxicity Score: 0.1395
Text: This is wonderful news. --> Toxicity Score: 0.0013
Text: You're disgusting! --> Toxicity Score: 0.3110
Text: Absolutely love your energy! --> Toxicity Score: 0.0088
Text: You're the worst person ever! --> Toxicity Score: 0.0937
Text: Have a nice day :) --> Toxicity Score: 0.0115
```
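Batch scores are convenient for building a review queue. A small sketch that ranks comments most-toxic first, using the sample scores from the output above as stand-in data:

```python
# Sample (text, score) pairs copied from the batch output above
results = [
    ("I hate you so much!", 0.1395),
    ("This is wonderful news.", 0.0013),
    ("You're disgusting!", 0.3110),
    ("Absolutely love your energy!", 0.0088),
    ("You're the worst person ever!", 0.0937),
    ("Have a nice day :)", 0.0115),
]

# Sort by score, highest first, e.g. for a moderation review queue
ranked = sorted(results, key=lambda pair: pair[1], reverse=True)
for text, score in ranked[:3]:
    print(f"{score:.4f}  {text}")
```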
## File details

Details for the file `toxic_comment_classifier-0.3.0.tar.gz`.

### File metadata

- Download URL: toxic_comment_classifier-0.3.0.tar.gz
- Upload date:
- Size: 8.1 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.1

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `feed3b165a996c59558f8e810308a19cc738e5d259a3b0a571c97754c2b188b3` |
| MD5 | `33c789fb4192c3dbce08b9d9a35ea328` |
| BLAKE2b-256 | `81cb841d6b59f98c09df9bebc43e25db7131d390a2dd749986ddc6b5e3fcf05a` |
## File details

Details for the file `toxic_comment_classifier-0.3.0-py3-none-any.whl`.

### File metadata

- Download URL: toxic_comment_classifier-0.3.0-py3-none-any.whl
- Upload date:
- Size: 8.1 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.1

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e5af0f052dbb0493d585b5b486dd9b61ec13d54c0e3e2ad93c519114bb404948` |
| MD5 | `980174b366ec5bae5ca3715320abfcb4` |
| BLAKE2b-256 | `f03f1eb498ac0301ae2e56b4ff3daec7f5c892fd81ec01755fb4d0d728eacdcf` |