
Project description


# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning.

## Installation

```bash
pip install toxic-comment-classifier
```

## Usage

### Initialize the Model

```python
from toxic_comment_classifier import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example output:

```python
{
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03
}
```
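The per-label scores can be reduced to a set of flagged labels with a simple cutoff. A minimal sketch using the example scores above; the 0.5 threshold is an assumption for illustration, not a value defined by the library:

```python
# Per-label scores, as in the example output above
scores = {
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03,
}

THRESHOLD = 0.5  # assumed cutoff; tune for your precision/recall trade-off

# Keep only the labels whose score meets the threshold
flagged = [label for label, score in scores.items() if score >= THRESHOLD]
print(flagged)  # ['toxic', 'obscene', 'insult']
```

In practice you would calibrate the threshold per label on a validation set rather than use a single global cutoff.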

### Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

### Classify Multiple Comments (Batch Processing)

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

predictions = model.predict_batch(texts)

for txt, score in zip(texts, predictions):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

## License

This project is licensed under the MIT License.


Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

toxic_comment_classifier-0.1.1.tar.gz (3.0 kB, source)

Built Distribution

toxic_comment_classifier-0.1.1-py3-none-any.whl (3.5 kB, Python 3)

File details

Details for the file toxic_comment_classifier-0.1.1.tar.gz.

File metadata

File hashes

Hashes for toxic_comment_classifier-0.1.1.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3ec53fe5238b4e9771a85d4cfb2f4890d313b7ff9689504b5bdcb221c31e4bc8 |
| MD5 | 77d1b3fea01b3a67a1db6a9f81236e64 |
| BLAKE2b-256 | e8c45520e2d9f29e4187e5f8b620e51b671241c1bc6e7b4748e2d82e9398eb1c |

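A downloaded archive can be checked against the SHA256 digest published above using only the standard library. A minimal sketch; the file path in the usage comment is hypothetical:

```python
import hashlib

def sha256_hex(path):
    """Return the SHA256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage, comparing against the digest from the table above:
# expected = "3ec53fe5238b4e9771a85d4cfb2f4890d313b7ff9689504b5bdcb221c31e4bc8"
# assert sha256_hex("toxic_comment_classifier-0.1.1.tar.gz") == expected
```

Reading in chunks keeps memory use constant regardless of archive size.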

File details

Details for the file toxic_comment_classifier-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for toxic_comment_classifier-0.1.1-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 9148ef74a78bbf6e42530c46daeae578fb4ffbac6e6f245fe36f2b8676915bd7 |
| MD5 | 841994d427fa98df75513a355d2cf028 |
| BLAKE2b-256 | 7eec6d5dc3e8e854b661e089667f1d4f679c42a6120a58e3077f2ddd8f5725a3 |

