Project description

# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning.
It detects multiple categories of toxicity, including obscene language, threats, insults, and identity hate.

📦 Installation

pip install toxic-comment-classifier

🚀 Usage

🔹 Import and Initialize the Model

from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()

🔹 Classify a Single Comment

text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)

Example Output:

{
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006
}
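The per-category probabilities can be converted into binary labels with a cutoff. A minimal sketch using the example scores above; the `label_scores` helper and the 0.5 threshold are illustrative assumptions, not part of the library's API:

```python
def label_scores(scores, threshold=0.5):
    """Map each per-category probability to a binary flag (assumed cutoff)."""
    return {category: score >= threshold for category, score in scores.items()}

# Example scores from the output above
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}

labels = label_scores(scores)
flagged = [category for category, hit in labels.items() if hit]
print("Flagged categories:", flagged)
# Flagged categories: ['toxic', 'threat', 'identity_hate']
```

In practice you would tune the threshold per category on a validation set rather than apply a single 0.5 cutoff everywhere.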

🔹 Get Overall Toxicity Score

toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")

Example Output:

Overall Toxicity Score: 0.4998

🔹 Classify Multiple Comments

texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")

Example Output:

Text: I hate this! --> Toxicity Score: 0.5002
Text: You're amazing! --> Toxicity Score: 0.5000
Text: This is the worst thing ever! --> Toxicity Score: 0.5008
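Batch scores pair naturally with their texts for moderation decisions. A sketch over the example batch above; the `flag_toxic` helper and the strict 0.5 cutoff are illustrative assumptions (the library does not prescribe a threshold):

```python
def flag_toxic(texts, scores, threshold=0.5):
    """Return (text, score) pairs whose toxicity strictly exceeds the threshold."""
    return [(text, score) for text, score in zip(texts, scores) if score > threshold]

# Example texts and scores from the output above
texts = ["I hate this!", "You're amazing!", "This is the worst thing ever!"]
scores = [0.5002, 0.5000, 0.5008]

for text, score in flag_toxic(texts, scores):
    print(f"Flagged: {text} ({score:.4f})")
# Flagged: I hate this! (0.5002)
# Flagged: This is the worst thing ever! (0.5008)
```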

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

toxic_comment_classifier-0.2.1.tar.gz (8.1 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

toxic_comment_classifier-0.2.1-py3-none-any.whl (8.1 MB)

Uploaded Python 3

File details

Details for the file toxic_comment_classifier-0.2.1.tar.gz.

File metadata

File hashes

Hashes for toxic_comment_classifier-0.2.1.tar.gz
Algorithm Hash digest
SHA256 d6461423f28594220aea4206d18189ea826859822c983b3e120cfb9d18b0ff2a
MD5 4f70eb0f00c9d3acd3fa5e80d3166a2a
BLAKE2b-256 76cf7e2204298a832baf700c47eca0476e2b88db6af12c4e8dc5871bd2ef8ddf

See more details on using hashes here.

File details

Details for the file toxic_comment_classifier-0.2.1-py3-none-any.whl.

File metadata

File hashes

Hashes for toxic_comment_classifier-0.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 e40de905de5b716f4e8be83528c78b56f172a9abafa7c0a5c1d88bbeda18f2b4
MD5 3c921e98f0bf768317eb26c2b36259eb
BLAKE2b-256 2d286eabb91bf1737520563251368ba72cc0712e08dc6c42a3cdf15fd83a44f8

See more details on using hashes here.
