# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning. It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.

---

## 📦 Installation

```bash
pip install toxic-comment-classifier
```

## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example output:

```python
{
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006
}
```
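Each score is a probability between 0 and 1, one per toxicity label. A common way to turn such scores into binary labels is to apply a threshold. The helper below is an illustrative sketch, not part of the library's API, applied to the example output above:

```python
def labels_above_threshold(scores, threshold=0.5):
    """Return the toxicity labels whose score meets or exceeds the threshold."""
    return [label for label, score in scores.items() if score >= threshold]

# Example scores as returned by model.classify(text)
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}

print(labels_above_threshold(scores))
# ['toxic', 'threat', 'identity_hate']
```

The 0.5 cutoff is only a starting point; in practice you would tune it per label against labeled data.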

### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

Example output:

```text
Overall Toxicity Score: 0.4998
```
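A single overall score like this can drive a moderation decision. The function below is a sketch; the 0.5 and 0.8 thresholds are illustrative assumptions, not values defined by the library:

```python
def moderate(toxicity, review_at=0.5, block_at=0.8):
    """Map an overall toxicity score to a moderation action (thresholds are illustrative)."""
    if toxicity >= block_at:
        return "block"
    if toxicity >= review_at:
        return "review"
    return "allow"

print(moderate(0.4998))  # allow
print(moderate(0.6500))  # review
print(moderate(0.9100))  # block
```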

### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

Example output:

```text
Text: I hate this! --> Toxicity Score: 0.5002
Text: You're amazing! --> Toxicity Score: 0.5000
Text: This is the worst thing ever! --> Toxicity Score: 0.5008
```
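The batch scores line up one-to-one with the input texts, so the two lists can be zipped to partition comments. The `filter_comments` helper and hard-coded scores below are illustrative; in practice the scores would come from `model.predict_batch(texts)`:

```python
def filter_comments(texts, scores, threshold=0.5):
    """Split comments into (kept, flagged) lists using their toxicity scores."""
    kept, flagged = [], []
    for text, score in zip(texts, scores):
        if score >= threshold:
            flagged.append(text)
        else:
            kept.append(text)
    return kept, flagged

texts = ["I hate this!", "You're amazing!", "This is the worst thing ever!"]
scores = [0.5002, 0.5000, 0.5008]  # example values from the output above

kept, flagged = filter_comments(texts, scores)
print("Kept:", kept)
print("Flagged:", flagged)
```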

## 📄 License

This project is licensed under the MIT License.


---

## Download files

### Source Distribution

`toxic_comment_classifier-0.1.9.tar.gz` (8.1 MB)

### Built Distribution

`toxic_comment_classifier-0.1.9-py3-none-any.whl` (8.1 MB)

### File hashes: `toxic_comment_classifier-0.1.9.tar.gz`

| Algorithm | Hash digest |
|---|---|
| SHA256 | d6b8d510e540e7bc807136f059797e354ceaa216e779e5292e862800fb2489df |
| MD5 | 0fb9fcff2870c0364736704d9340f428 |
| BLAKE2b-256 | 607394b902ae8ce0cf00c2008e6a0fc288e8c809b53fa328ca8d7f8978dbe30a |

### File hashes: `toxic_comment_classifier-0.1.9-py3-none-any.whl`

| Algorithm | Hash digest |
|---|---|
| SHA256 | a7be8cb0426171e2e981f3793b76ce1060ce1f985071d2bfb4d213f017b7b7e1 |
| MD5 | 95cbe289d4d83a6f9cd574e5fa54e12f |
| BLAKE2b-256 | faf043e336d6d732d45d557c925b7daeba9ff52dbf78cf0e2e685038afa08d27 |
