
A Python library for classifying toxic comments using deep learning.

Project description


# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning.

## Installation

```bash
pip install toxic-comment-classifier
```

## Usage

### Initialize the Model

```python
from toxic_comment_classifier import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example output:

```python
{
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03
}
```
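Since the per-label scores come back as a plain dict of probabilities, standard Python is enough for post-processing. A minimal sketch (using the example values above) that picks the dominant label:

```python
# Per-label probabilities, as in the example output above.
scores = {
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03,
}

# The label with the highest probability.
top_label = max(scores, key=scores.get)
print(f"Dominant label: {top_label} ({scores[top_label]:.2f})")
# prints: Dominant label: insult (0.90)
```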

### Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

### Classify Multiple Comments (Batch Processing)

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

predictions = model.predict_batch(texts)

for txt, score in zip(texts, predictions):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```
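Batch scores pair naturally with a moderation cutoff. A minimal sketch, where `flag_toxic` and the 0.5 threshold are illustrative helpers (not part of the library) and the score values stand in for `model.predict_batch` output:

```python
def flag_toxic(texts, scores, threshold=0.5):
    """Return comments whose overall toxicity score exceeds the threshold."""
    return [t for t, s in zip(texts, scores) if s > threshold]

# Hypothetical scores standing in for model.predict_batch(texts) output.
texts = ["I hate this!", "You're amazing!", "This is the worst thing ever!"]
scores = [0.91, 0.02, 0.74]

print(flag_toxic(texts, scores))
# prints: ['I hate this!', 'This is the worst thing ever!']
```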

## License

This project is licensed under the MIT License.



Download files

Download the file for your platform.

Source Distribution

toxic_comment_classifier-0.1.5.tar.gz (3.1 kB)


Built Distribution


toxic_comment_classifier-0.1.5-py3-none-any.whl (3.6 kB)


File details

Details for the file toxic_comment_classifier-0.1.5.tar.gz.


File hashes

Hashes for toxic_comment_classifier-0.1.5.tar.gz:

| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | 25a274df450aca443d77934a0c57fdfc77f4de79e3c4bbb6627055009dea4ae4 |
| MD5         | 4b2a27d8a405bad88af7d6d2e5fc8451 |
| BLAKE2b-256 | 2429ceac784a981602110ac3a8154f6bef33c29736cd8e057d1d0f3f9ec3ea81 |

See the PyPI documentation for more details on using hashes.
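To check a downloaded file against the published digests, you can compute its SHA256 locally. A sketch using only the standard library (the file path is an assumption: it presumes the sdist has been downloaded to the current directory):

```python
import hashlib

def sha256_of(path):
    # Hash the file in chunks so large archives are not read into memory at once.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result with the SHA256 value published above, e.g.:
# sha256_of("toxic_comment_classifier-0.1.5.tar.gz")
```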

File details

Details for the file toxic_comment_classifier-0.1.5-py3-none-any.whl.


File hashes

Hashes for toxic_comment_classifier-0.1.5-py3-none-any.whl:

| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | 036fe7fd76ee15ed7530b2f28443e1ceabb7732886afaac3c2d886d40e430b67 |
| MD5         | ee7d41db9d1c8d157ac17c992c9b64a7 |
| BLAKE2b-256 | 73138f70f609e74309f109b17eec3b99b7660d497b432e40ab433f96ad3057d1 |

