NSFW pipeline that classifies prompts using a bag-of-words model.
| Feature | Description |
| --- | --- |
| Name | `en_prompt_nsfw_pipeline_bow` |
| Version | `0.1.1` |
| spaCy | `>=3.0.0,<4.0.0` |
| Default Pipeline | `textcat` |
| Components | `textcat` |
| Vectors | 0 keys, 0 unique vectors (0 dimensions) |
| Sources | n/a |
| License | UNLICENSED |
| Author | Jiayu Liu |
### Label Scheme
<details>
<summary>View label scheme (4 labels for 1 component)</summary>

| Component | Labels |
| --- | --- |
| `textcat` | `adult`, `cp`, `underage_safe`, `safe` |
</details>
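Once installed, the pipeline returns a score per label in `doc.cats`. Below is a hedged usage sketch: the sample prompt and the fallback score dictionary are invented for illustration, and `spacy.load("en_prompt_nsfw_pipeline_bow")` only works after the package has been pip-installed.

```python
from importlib.util import find_spec

def top_category(cats):
    """Return the (label, score) pair with the highest score.

    `cats` maps label -> probability, shaped like `doc.cats`
    from the textcat component.
    """
    label = max(cats, key=cats.get)
    return label, cats[label]

if find_spec("en_prompt_nsfw_pipeline_bow") is not None:
    import spacy
    # Load the installed package as a spaCy pipeline and classify a prompt.
    nlp = spacy.load("en_prompt_nsfw_pipeline_bow")
    doc = nlp("a family photo at the beach")  # hypothetical sample prompt
    print(top_category(doc.cats))
else:
    # Model not installed; demonstrate on a stand-in score dict instead.
    print(top_category({"adult": 0.02, "cp": 0.01,
                        "underage_safe": 0.07, "safe": 0.90}))
```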
### Accuracy
| Type | Score |
| --- | --- |
| CATS_SCORE | 88.95 |
| CATS_MICRO_P | 89.02 |
| CATS_MICRO_R | 89.02 |
| CATS_MICRO_F | 89.02 |
| CATS_MACRO_P | 89.27 |
| CATS_MACRO_R | 88.75 |
| CATS_MACRO_F | 88.95 |
| CATS_MACRO_AUC | 97.80 |
| TEXTCAT_LOSS | 686.95 |
### Hashes for en_prompt_nsfw_pipeline_bow-0.1.1.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `761bcb6d894fb25fcc469156a1b2bf801de96ecfc60cba6a80df534d1ed35254` |
| MD5 | `ba3a62fe8a6ecf14dee9d22fe2d52c0b` |
| BLAKE2b-256 | `d740c444bfe437f8ffd4a5bb2184e53a44b86ed4d3878ddf7bdc5791d423a542` |
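To check a downloaded sdist against the SHA256 digest above, a small `hashlib` sketch can be used; the file path in the usage comment is an assumption about where you saved the download.

```python
import hashlib

# SHA256 digest published for en_prompt_nsfw_pipeline_bow-0.1.1.tar.gz
# (copied from the hash table above).
EXPECTED_SHA256 = "761bcb6d894fb25fcc469156a1b2bf801de96ecfc60cba6a80df534d1ed35254"

def sha256_of(path, chunk_size=1 << 16):
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

# Usage (path assumes the sdist was downloaded to the current directory):
# assert sha256_of("en_prompt_nsfw_pipeline_bow-0.1.1.tar.gz") == EXPECTED_SHA256
```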