# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning. It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.

---

## 📦 Installation

```bash
pip install toxic-comment-classifier
```
## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example output:

```json
{
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03
}
```
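Given a per-label score dictionary like the example output above, a common next step is to flag every label whose probability exceeds a cutoff. This is a minimal sketch; the threshold of 0.5 is an arbitrary assumption you should tune for your use case:

```python
# Per-label probabilities, copied from the example output above
scores = {
    "toxic": 0.85,
    "severe_toxic": 0.12,
    "obscene": 0.78,
    "threat": 0.05,
    "insult": 0.90,
    "identity_hate": 0.03,
}

THRESHOLD = 0.5  # arbitrary cutoff; tune for your application

# Collect labels whose score meets or exceeds the threshold
flagged = [label for label, p in scores.items() if p >= THRESHOLD]
print("Flagged labels:", flagged)  # ['toxic', 'obscene', 'insult']
```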

### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```
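If you want to present the overall score to end users, one option is to bucket it into human-readable levels. The cutoffs below are hypothetical, not part of the library:

```python
def toxicity_label(score: float) -> str:
    """Map a 0..1 toxicity score to a coarse label (hypothetical cutoffs)."""
    if score >= 0.8:
        return "high"
    if score >= 0.5:
        return "moderate"
    return "low"

print(toxicity_label(0.85))  # high
```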

### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```
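A typical follow-up to batch scoring is filtering out comments above a toxicity cutoff. The scores below are hypothetical stand-ins for what `model.predict_batch(texts)` might return, so the sketch stays self-contained:

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!",
]
# Hypothetical scores standing in for model.predict_batch(texts)
scores = [0.72, 0.03, 0.65]

CUTOFF = 0.5  # arbitrary; tune for your moderation policy

# Keep only the comments scored below the cutoff
clean = [t for t, s in zip(texts, scores) if s < CUTOFF]
print(clean)  # ["You're amazing!"]
```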

## 📄 License

This project is licensed under the MIT License.


## Project details

### Download files

- **Source distribution:** `toxic_comment_classifier-0.1.6.tar.gz` (3.1 kB)
- **Built distribution:** `toxic_comment_classifier-0.1.6-py3-none-any.whl` (3.6 kB, Python 3 wheel)

### File hashes for `toxic_comment_classifier-0.1.6.tar.gz`

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `cf6e2f17b61e849989129c58752d167732771ea9bf90c35a230daa7aeb72beb8` |
| MD5 | `ae2b51efe69cd1a986cc9cb82d243a66` |
| BLAKE2b-256 | `bbb1d1ed5d155715b9fec83370d391d474dd76ac06cdc266f6b5454c7ec32ad6` |
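To check a downloaded file against a published SHA256 digest, you can hash it with Python's standard `hashlib` module. The file path in the comment is illustrative:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Example (assumes the file has been downloaded locally):
# expected = "cf6e2f17b61e849989129c58752d167732771ea9bf90c35a230daa7aeb72beb8"
# assert sha256_of("toxic_comment_classifier-0.1.6.tar.gz") == expected
```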

### File hashes for `toxic_comment_classifier-0.1.6-py3-none-any.whl`

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `2ef5cd4f4081570e367bd97ed292ab341785b7ba03fa82cc6b4630efeb01c91a` |
| MD5 | `796810ad464283dbde292687e03689c9` |
| BLAKE2b-256 | `efffe63c9bbcfcef74915fda87e5c176fa94a4dc26b504c905ee4c3a1a75fa87` |
