
# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning. It detects six types of toxicity: toxic, severe toxic, obscene, threat, insult, and identity hate.

---

## 📦 Installation

```bash
pip install toxic-comment-classifier
```

## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example Output:

```python
{
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006
}
```
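The per-label scores can be reduced to binary flags with a cutoff. Here is a minimal sketch assuming a 0.5 threshold; the `flag_labels` helper is illustrative and not part of the library's API:

```python
def flag_labels(scores, threshold=0.5):
    """Return the labels whose score meets or exceeds the threshold."""
    return [label for label, score in scores.items() if score >= threshold]

# Scores from the example output above
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}
print(flag_labels(scores))  # ['toxic', 'threat', 'identity_hate']
```

The right threshold depends on how much moderation tolerance your application has; a stricter setting (e.g. 0.8) would flag nothing in this example.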

### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

Example Output:

```
Overall Toxicity Score: 0.4998
```
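The example numbers are consistent with the overall score being the mean of the six per-label scores, though the library's actual aggregation is not documented here. A quick check of that assumption:

```python
# Per-label scores from the classify() example above
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}

# Mean of the six scores
overall = sum(scores.values()) / len(scores)
print(f"{overall:.4f}")  # 0.4998, matching the example output
```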

### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

Example Output:

```
Text: I hate this! --> Toxicity Score: 0.5002
Text: You're amazing! --> Toxicity Score: 0.5000
Text: This is the worst thing ever! --> Toxicity Score: 0.5008
```
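Batch scores pair naturally with their texts for building a simple review queue. A minimal sketch using the example batch output above; `needs_review` is an illustrative helper, not part of the library:

```python
def needs_review(texts, scores, threshold=0.5):
    """Return (text, score) pairs whose score meets or exceeds the threshold."""
    return [(t, s) for t, s in zip(texts, scores) if s >= threshold]

texts = ["I hate this!", "You're amazing!", "This is the worst thing ever!"]
scores = [0.5002, 0.5000, 0.5008]  # example batch output from above

# Only the highest-scoring comment clears this stricter cutoff
for text, score in needs_review(texts, scores, threshold=0.5005):
    print(f"Review: {text} ({score:.4f})")
```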

## 📄 License

This project is licensed under the MIT License.
