NSFW pipeline that classifies prompts using a bag-of-words model.
Project description
| Feature | Description |
| --- | --- |
| **Name** | `en_prompt_nsfw_pipeline_bow` |
| **Version** | `0.1.0` |
| **spaCy** | `>=2.0.0,<3.0.0` |
| **Default Pipeline** | `textcat` |
| **Components** | `textcat` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | n/a |
| **License** | UNLICENSED |
| **Author** | [Jiayu Liu]() |
### Label Scheme
<details>
<summary>View label scheme (4 labels for 1 component)</summary>

| Component | Labels |
| --- | --- |
| `textcat` | `adult`, `cp`, `underage_safe`, `safe` |

</details>
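The model's actual weights are learned by spaCy's `textcat` component, but the bag-of-words scoring idea behind it can be sketched in plain Python: count the words in a prompt (ignoring order), take a weighted sum per label, and softmax the scores into per-label probabilities like those spaCy exposes in `doc.cats`. The keyword weights below are made up for illustration and are not the trained model's parameters.

```python
from collections import Counter
import math

LABELS = ["adult", "cp", "underage_safe", "safe"]

# Hypothetical keyword weights per label -- illustrative only, NOT the
# trained pipeline's learned parameters.
WEIGHTS = {
    "adult": {"nude": 2.0, "explicit": 1.5},
    "cp": {},
    "underage_safe": {"child": 1.0, "kid": 1.0},
    "safe": {"landscape": 1.0, "sunset": 0.8},
}
BIAS = {"adult": -1.0, "cp": -3.0, "underage_safe": -1.0, "safe": 0.5}

def classify(prompt: str) -> dict:
    """Score a prompt with a bag-of-words linear model + softmax."""
    counts = Counter(prompt.lower().split())  # word order is ignored: a "bag" of words
    logits = {
        label: BIAS[label] + sum(WEIGHTS[label].get(w, 0.0) * n
                                 for w, n in counts.items())
        for label in LABELS
    }
    z = max(logits.values())  # subtract the max for numerical stability
    exps = {label: math.exp(v - z) for label, v in logits.items()}
    total = sum(exps.values())
    return {label: e / total for label, e in exps.items()}

cats = classify("a beautiful sunset landscape")
print(max(cats, key=cats.get))  # → safe
```

With the real package installed, the equivalent call would be loading the pipeline via `spacy.load("en_prompt_nsfw_pipeline_bow")` and reading `doc.cats` on the processed prompt.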
### Accuracy
| Type | Score |
| --- | --- |
| `CATS_SCORE` | 88.95 |
| `CATS_MICRO_P` | 89.02 |
| `CATS_MICRO_R` | 89.02 |
| `CATS_MICRO_F` | 89.02 |
| `CATS_MACRO_P` | 89.27 |
| `CATS_MACRO_R` | 88.75 |
| `CATS_MACRO_F` | 88.95 |
| `CATS_MACRO_AUC` | 97.80 |
| `TEXTCAT_LOSS` | 686.95 |
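A quick sanity check on the table: micro precision, recall, and F are all 89.02, which is consistent, since the F score is the harmonic mean of precision and recall, and the harmonic mean of two equal values is that value. Note also that the macro F (88.95) is the mean of the per-class F1 scores, not the harmonic mean of macro P and macro R (which would give ≈89.01).

```python
def f_score(precision: float, recall: float) -> float:
    """F1: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Micro scores: P == R, so F equals both.
print(round(f_score(89.02, 89.02), 2))  # → 89.02

# Harmonic mean of the macro P/R differs slightly from the reported macro F.
print(round(f_score(89.27, 88.75), 2))  # → 89.01, vs. CATS_MACRO_F = 88.95
```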
### Download files

Download the file for your platform.

#### Source Distribution
### Hashes for en_prompt_nsfw_pipeline_bow-0.1.0.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `5e336649e4a5d9d0365653b4c3edef6fbd3e0e89259aeb1a942432ae97a6c60a` |
| MD5 | `02fa425804e4002840e3f7c4a4893692` |
| BLAKE2b-256 | `e279e966cdc9385390a36a4abbb50f34b8d2476380ad907b503bd0b1409ecaad` |