# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning. It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.

---

## 📦 Installation

```bash
pip install toxic-comment-classifier
```

## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example output:

```python
{'toxic': 0.5003802180290222,
 'severe_toxic': 0.4986536502838135,
 'obscene': 0.4989285469055176,
 'threat': 0.5020793676376343,
 'insult': 0.49787813425064087,
 'identity_hate': 0.5006254315376282}
```
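The per-label scores are probabilities between 0 and 1, so turning them into yes/no labels is a matter of picking a threshold. The helper below is our own illustration (`labels_above_threshold` is not part of the library), applied to the example scores above with a 0.5 cutoff:

```python
def labels_above_threshold(scores: dict[str, float], threshold: float = 0.5) -> list[str]:
    """Return the toxicity labels whose score exceeds the threshold."""
    return [label for label, score in scores.items() if score > threshold]

# Example scores from the single-comment classification above
scores = {
    'toxic': 0.5003802180290222,
    'severe_toxic': 0.4986536502838135,
    'obscene': 0.4989285469055176,
    'threat': 0.5020793676376343,
    'insult': 0.49787813425064087,
    'identity_hate': 0.5006254315376282,
}

print(labels_above_threshold(scores))  # ['toxic', 'threat', 'identity_hate']
```

The right threshold depends on your tolerance for false positives; a moderation pipeline that auto-hides comments would typically use a much higher cutoff than 0.5.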

### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

Output:

```
Overall Toxicity Score: 0.4998
```
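The library does not document how `predict` condenses the six per-label scores into one number; common choices are the mean (average severity) or the maximum (worst single label). The sketch below illustrates both aggregations as an assumption on our part, not the library's actual implementation:

```python
def overall_score(scores: dict[str, float], how: str = "mean") -> float:
    """Collapse per-label toxicity scores into a single number.

    `how` may be "mean" (average severity) or "max" (worst label).
    Illustrative only -- not how the library itself aggregates.
    """
    values = list(scores.values())
    if how == "max":
        return max(values)
    return sum(values) / len(values)

# Hypothetical per-label scores, just to show the two aggregations
scores = {'toxic': 0.6, 'severe_toxic': 0.2, 'obscene': 0.3,
          'threat': 0.1, 'insult': 0.4, 'identity_hate': 0.2}

print(f"{overall_score(scores):.4f}")          # 0.3000
print(f"{overall_score(scores, 'max'):.4f}")   # 0.6000
```

Mean aggregation rewards comments that are mildly bad across the board; max aggregation catches comments that are extreme on one axis (e.g. a single explicit threat).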

### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

Output:

```
Text: I hate this! --> Toxicity Score: 0.5002
Text: You're amazing! --> Toxicity Score: 0.5000
Text: This is the worst thing ever! --> Toxicity Score: 0.5008
```
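Batch scores pair naturally with a moderation filter. The snippet below is our own sketch (`flag_toxic` is not a library function, and the 0.5005 cutoff is arbitrary), using the score values from the batch example above:

```python
def flag_toxic(texts: list[str], scores: list[float],
               threshold: float = 0.5005) -> list[str]:
    """Return the comments whose toxicity score exceeds the threshold."""
    return [text for text, score in zip(texts, scores) if score > threshold]

texts = ["I hate this!", "You're amazing!", "This is the worst thing ever!"]
scores = [0.5002, 0.5000, 0.5008]  # values from the batch example above

print(flag_toxic(texts, scores))  # ['This is the worst thing ever!']
```

In a real pipeline the flagged comments would be routed to a review queue or hidden, while the rest are published unchanged.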

## 📄 License

This project is licensed under the MIT License.


## Download files

Source Distribution: `toxic_comment_classifier-0.1.8.tar.gz` (8.1 MB)

Built Distribution: `toxic_comment_classifier-0.1.8-py3-none-any.whl` (8.1 MB, Python 3)

## File details

### Hashes for `toxic_comment_classifier-0.1.8.tar.gz`

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | dec30778165905aa1aab1616f38c16aff7d54190f60d1cd44ba2fe473c9d8076 |
| MD5 | 9254379824be502a0879edf6f9e1b293 |
| BLAKE2b-256 | bdcf66c5c537d3cfab546bd644e52bcf9b532f6537bc8bf1275a77c0ed669adb |

### Hashes for `toxic_comment_classifier-0.1.8-py3-none-any.whl`

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | e5f798a96a448b87d8d310962c8ddc69f2d074dc38718cde6a3464cccd514d2e |
| MD5 | 6939060dff46b236831dbac97f5355ad |
| BLAKE2b-256 | fb172992e92d5099fcc8e68ae3d74ade0fee11ca4a2cc84fc4b00f9897b40f11 |
