
Project description

Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning. It supports detecting multiple types of toxicity, including obscene language, threats, and identity hate.

📦 Installation

pip install toxic-comment-classifier

🚀 Usage

🔹 Import and Initialize the Model

from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()

🔹 Classify a Single Comment

text = "You are so dumb and stupid!"
scores = model.classify(text)

print(scores)

Example Output:

{'toxic': 0.9889402985572815,
 'severe_toxic': 0.07256772369146347,
 'obscene': 0.620429277420044,
 'threat': 0.01934845559298992,
 'insult': 0.8664075136184692,
 'identity_hate': 0.04072948172688484}
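
The per-label scores make it easy to pick out the dominant category. Below is a minimal sketch built on the scores dict shown above; the snippet is illustrative and not part of the library's API:

# Find the highest-scoring toxicity label from the classify() output.
label, score = max(scores.items(), key=lambda item: item[1])
print(f"Dominant label: {label} ({score:.2f})")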

🔹 Get Overall Toxicity Score

toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")

Example Output:

Overall Toxicity Score: 0.4347
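
A single scalar like this is handy for simple moderation gates. Here is a minimal thresholding sketch, assuming predict() returns a float in [0, 1] as the output above suggests; the 0.5 cutoff is an illustrative choice, not a library default:

# Illustrative cutoff; tune it against your own precision/recall needs.
THRESHOLD = 0.5

def is_toxic(comment: str) -> bool:
    # Flag a comment when its overall toxicity score exceeds the threshold.
    return model.predict(comment) > THRESHOLD

print(is_toxic("You are so dumb and stupid!"))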

🔹 Classify Multiple Comments

texts = [
    "I hate you so much!",
    "This is wonderful news.",
    "You're disgusting!",
    "Absolutely love your energy!",
    "You're the worst person ever!",
    "Have a nice day :)"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")

Example Output:

Text: I hate you so much! --> Toxicity Score: 0.1395
Text: This is wonderful news. --> Toxicity Score: 0.0013
Text: You're disgusting! --> Toxicity Score: 0.3110
Text: Absolutely love your energy! --> Toxicity Score: 0.0088
Text: You're the worst person ever! --> Toxicity Score: 0.0937
Text: Have a nice day :) --> Toxicity Score: 0.0115
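
Batch scoring pairs naturally with filtering. A short sketch that keeps only the comments scoring below a cutoff, reusing the illustrative 0.5 threshold from above:

# Keep only comments whose toxicity score stays below the cutoff.
clean = [t for t, s in zip(texts, model.predict_batch(texts)) if s < 0.5]
print(clean)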

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

toxic_comment_classifier-0.3.0.tar.gz (8.1 MB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

toxic_comment_classifier-0.3.0-py3-none-any.whl (8.1 MB)


File details

Details for the file toxic_comment_classifier-0.3.0.tar.gz.

Hashes for toxic_comment_classifier-0.3.0.tar.gz:

SHA256: feed3b165a996c59558f8e810308a19cc738e5d259a3b0a571c97754c2b188b3
MD5: 33c789fb4192c3dbce08b9d9a35ea328
BLAKE2b-256: 81cb841d6b59f98c09df9bebc43e25db7131d390a2dd749986ddc6b5e3fcf05a
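
To verify a downloaded archive against the SHA256 digest above, a minimal sketch using only the Python standard library:

import hashlib

# Compare the local file's SHA256 digest with the published one.
expected = "feed3b165a996c59558f8e810308a19cc738e5d259a3b0a571c97754c2b188b3"
with open("toxic_comment_classifier-0.3.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "MISMATCH")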


File details

Details for the file toxic_comment_classifier-0.3.0-py3-none-any.whl.

Hashes for toxic_comment_classifier-0.3.0-py3-none-any.whl:

SHA256: e5af0f052dbb0493d585b5b486dd9b61ec13d54c0e3e2ad93c519114bb404948
MD5: 980174b366ec5bae5ca3715320abfcb4
BLAKE2b-256: f03f1eb498ac0301ae2e56b4ff3daec7f5c892fd81ec01755fb4d0d728eacdcf

