# Toxic Comment Classifier
A Python library for classifying toxic comments using deep learning.
It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.
## 📦 Installation

```bash
pip install toxic-comment-classifier
```
## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```
### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)
print("Toxicity Scores:", scores)
```

Example output:

```python
{
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006
}
```
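The per-label scores can be turned into binary predictions with a cutoff. A minimal sketch using the scores from the example above (the `labels_above` helper and the 0.5 threshold are illustrative, not part of the library's API):

```python
# Per-label scores as returned by model.classify() in the example above
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}

def labels_above(scores, threshold=0.5):
    """Return the labels whose score meets or exceeds the threshold."""
    return [label for label, score in scores.items() if score >= threshold]

print(labels_above(scores))  # ['toxic', 'threat', 'identity_hate']
```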
### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

Example output:

```
Overall Toxicity Score: 0.4998
```
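The overall score in this example matches the mean of the six per-label scores; whether the library actually aggregates this way is an assumption, but the arithmetic can be checked directly:

```python
# Per-label scores from the classify() example above
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}

# Hypothetical aggregation: simple mean of the six label scores
overall = sum(scores.values()) / len(scores)
print(f"Overall Toxicity Score: {overall:.4f}")  # prints 0.4998
```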
### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]
scores = model.predict_batch(texts)
for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

Example output:

```
Text: I hate this! --> Toxicity Score: 0.5002
Text: You're amazing! --> Toxicity Score: 0.5000
Text: This is the worst thing ever! --> Toxicity Score: 0.5008
```
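A common follow-up to batch classification is ranking or flagging comments by score. A small sketch operating on the batch output above (the 0.5005 flagging cutoff is illustrative, not something the library prescribes):

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!",
]
# Scores as printed by predict_batch() in the example above
scores = [0.5002, 0.5000, 0.5008]

# Rank comments from most to least toxic
ranked = sorted(zip(texts, scores), key=lambda pair: pair[1], reverse=True)
for txt, score in ranked:
    print(f"{score:.4f}  {txt}")

# Flag anything strictly above an illustrative cutoff
flagged = [txt for txt, score in zip(texts, scores) if score > 0.5005]
print("Flagged:", flagged)
```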
## File details

Details for the file `toxic_comment_classifier-0.2.1.tar.gz`.

### File metadata

- Download URL: toxic_comment_classifier-0.2.1.tar.gz
- Upload date:
- Size: 8.1 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.1

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d6461423f28594220aea4206d18189ea826859822c983b3e120cfb9d18b0ff2a` |
| MD5 | `4f70eb0f00c9d3acd3fa5e80d3166a2a` |
| BLAKE2b-256 | `76cf7e2204298a832baf700c47eca0476e2b88db6af12c4e8dc5871bd2ef8ddf` |
Details for the file `toxic_comment_classifier-0.2.1-py3-none-any.whl`.

### File metadata

- Download URL: toxic_comment_classifier-0.2.1-py3-none-any.whl
- Upload date:
- Size: 8.1 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.1

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e40de905de5b716f4e8be83528c78b56f172a9abafa7c0a5c1d88bbeda18f2b4` |
| MD5 | `3c921e98f0bf768317eb26c2b36259eb` |
| BLAKE2b-256 | `2d286eabb91bf1737520563251368ba72cc0712e08dc6c42a3cdf15fd83a44f8` |