
Project description


# Toxic Comment Classifier

A Python library for classifying toxic comments using deep learning.
It supports detecting multiple types of toxicity including obscene language, threats, and identity hate.

## 📦 Installation

```bash
pip install toxic-comment-classifier
```

## 🚀 Usage

### 🔹 Import and Initialize the Model

```python
from toxic_classifier.model import ToxicCommentClassifier

# Load the classifier
model = ToxicCommentClassifier()
```

### 🔹 Classify a Single Comment

```python
text = "You are so dumb and stupid!"
scores = model.classify(text)

print("Toxicity Scores:", scores)
```

Example Output:

```python
{
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006
}
```
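The per-label values are probabilities in [0, 1]. A small helper (hypothetical, not part of the library's API) can turn such a score dict into a list of flagged labels at a chosen decision threshold:

```python
# Hypothetical helper: convert per-label probabilities (shaped like the
# dict returned by model.classify) into flagged labels at a threshold.
def flag_labels(scores, threshold=0.5):
    return [label for label, p in scores.items() if p >= threshold]

# Using the example scores shown above:
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}
print(flag_labels(scores))  # → ['toxic', 'threat', 'identity_hate']
```

The threshold is application-specific: raising it reduces false positives at the cost of missing borderline comments.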

### 🔹 Get Overall Toxicity Score

```python
toxicity = model.predict(text)
print(f"Overall Toxicity Score: {toxicity:.4f}")
```

Example Output:

```
Overall Toxicity Score: 0.4998
```
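Notably, the overall score in this example equals the arithmetic mean of the six per-label scores shown earlier. Whether `predict` actually averages `classify`'s outputs is an assumption, not documented behavior, but the numbers line up:

```python
# Assumption (unverified): the overall score is the mean of the
# per-label scores. These are the example values from model.classify above.
scores = {
    'toxic': 0.5004,
    'severe_toxic': 0.4987,
    'obscene': 0.4989,
    'threat': 0.5021,
    'insult': 0.4979,
    'identity_hate': 0.5006,
}
overall = sum(scores.values()) / len(scores)
print(f"Overall Toxicity Score: {overall:.4f}")  # → Overall Toxicity Score: 0.4998
```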

### 🔹 Classify Multiple Comments

```python
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!"
]

scores = model.predict_batch(texts)

for txt, score in zip(texts, scores):
    print(f"Text: {txt} --> Toxicity Score: {score:.4f}")
```

Example Output:

```
Text: I hate this! --> Toxicity Score: 0.5002
Text: You're amazing! --> Toxicity Score: 0.5000
Text: This is the worst thing ever! --> Toxicity Score: 0.5008
```
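A common next step is gating content on these scores. The sketch below is a hypothetical moderation filter using the example scores above rather than a live model; it keeps only texts scoring below a chosen threshold:

```python
# Hypothetical moderation gate: pair each text with its overall score
# (as model.predict_batch would produce) and keep those below a threshold.
texts = [
    "I hate this!",
    "You're amazing!",
    "This is the worst thing ever!",
]
scores = [0.5002, 0.5000, 0.5008]  # example values from above

THRESHOLD = 0.5001
allowed = [t for t, s in zip(texts, scores) if s < THRESHOLD]
print(allowed)  # → ["You're amazing!"]
```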

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

toxic_comment_classifier-0.2.4.tar.gz (8.1 MB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

toxic_comment_classifier-0.2.4-py3-none-any.whl (8.1 MB)

File details

Details for the file toxic_comment_classifier-0.2.4.tar.gz.

File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 8fe24cacec8c9886a7eee2e741e916c1931f35ec16f82b70e20b342d1182e4e7 |
| MD5 | 11f0278bf1613345dd4affaa4e704017 |
| BLAKE2b-256 | 034544b9887f385861d57c0efc9a8719dec237d6c01acae4176347400af854b9 |

File details

Details for the file toxic_comment_classifier-0.2.4-py3-none-any.whl.

File hashes

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 24d0a2737bea0f9a834599940d25433d0eae6cf55f81f572c4fd5fa6bebec48e |
| MD5 | ae7596eee6b9953cae841074a60e44ad |
| BLAKE2b-256 | 31afb5fc5bed053d428b27fe76e5a217d15f3bb5cf8303140c3e6192bed12bbc |
