# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning. It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.

---

## 📦 Installation

```bash
pip install toxic-comment-classifier
```

## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example output:

```python
{
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03
}
```
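Because `classify` returns a plain dictionary of per-label probabilities, you can pick out the categories that cross a moderation threshold with ordinary dict operations. A minimal sketch, assuming the example scores above; the `flag_labels` helper and the 0.5 cutoff are illustrative, not part of the library:

```python
# Illustrative helper (not part of the library): return the labels whose
# probability meets or exceeds a threshold. The 0.5 cutoff is an assumption.
def flag_labels(scores, threshold=0.5):
    return [label for label, score in scores.items() if score >= threshold]

# The example scores from the output above.
scores = {
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03,
}

print(flag_labels(scores))  # → ['toxic', 'obscene', 'insult']
```

Raising the threshold trades recall for precision; a stricter cutoff like 0.8 would flag only `toxic` and `insult` here.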

### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```
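Batch scores pair naturally with `zip` for filtering or ranking. A hedged sketch that keeps only comments below a toxicity cutoff; the score values here are made up for illustration (in practice they would come from `model.predict_batch`), and the 0.5 cutoff is an assumption:

```python
# Pair each comment with a hypothetical toxicity score and keep the ones
# under a cutoff. Real scores would come from model.predict_batch(texts).
texts = ["I hate this!", "You're amazing!", "This is the worst thing ever!"]
scores = [0.72, 0.03, 0.65]  # placeholder values for illustration

cutoff = 0.5
clean = [txt for txt, score in zip(texts, scores) if score < cutoff]
print(clean)  # → ["You're amazing!"]
```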

## 📄 License

This project is licensed under the MIT License.


---

## Project details

### Download files

| File | Type | Size |
|---|---|---|
| `toxic_comment_classifier-0.1.7.tar.gz` | Source distribution | 8.1 MB |
| `toxic_comment_classifier-0.1.7-py3-none-any.whl` | Built distribution (Python 3) | 8.1 MB |

### File hashes

Hashes for `toxic_comment_classifier-0.1.7.tar.gz`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 8a4102c9fd5e86745dd37fac6b3ab09f3ccc60c45acc0fd032dd5fa91554382f |
| MD5 | 47fd291b94101ae73cc7bcd74d437405 |
| BLAKE2b-256 | 1048b1b6f1342cb182e0f7bbfe55b1ef692233e30c83b3372c92479dd70b37fc |

Hashes for `toxic_comment_classifier-0.1.7-py3-none-any.whl`:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 142b0759e89fb66a9e83f107b25ebb387910931af5b2cf2c005bc2b39291f1fa |
| MD5 | 2b94948914620eed3b34ae9ed55e5c6c |
| BLAKE2b-256 | 0b7aa1a1ed05c01ff555552e454a1e9c347ae2fa64a0cb358b4f5084fd196cbe |
