
NovelAI Python Binding With Pydantic

Project description




✨ A modern, user-friendly NovelAI API Python SDK built with Pydantic.

The goal of this repository is to use Pydantic to build well-formed requests to the NovelAI API service.

Python >= 3.9 is required.
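
Since the SDK declares Python >= 3.9, a script can fail fast on an older interpreter before any import errors surface; this guard is a generic sketch, not part of the SDK:

```python
import sys

# novelai-python requires Python >= 3.9; abort early on older interpreters.
if sys.version_info < (3, 9):
    raise RuntimeError("novelai-python requires Python 3.9 or newer")
print("Python version OK")
```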

Roadmap 🚧

  • tool.random_prompt
  • tool.paint_mask
  • tool.image_metadata
  • tokenizer
  • /ai/generate-image
  • /user/subscription
  • /user/login
  • /user/information
  • /ai/upscale
  • /ai/generate-image/suggest-tags
  • /ai/generate-voice
  • /ai/generate-stream
  • /ai/generate
  • /ai/augment-image
  • /ai/annotate-image
  • /ai/classify
  • /ai/generate-prompt

GenerateImageInfer.calculate_cost is correct in most cases, but query your account information if you need exact consumption figures.

This repo is now maintained by me personally. If you have any questions, please feel free to open an issue.

Usage 🖥️

pip install -U novelai-python

More examples can be found in the playground directory; the code there serves as documentation.

import asyncio
import os

from dotenv import load_dotenv
from pydantic import SecretStr

from novelai_python import GenerateImageInfer, ImageGenerateResp, ApiCredential

load_dotenv()
enhance = "year 2023,dynamic angle,  best quality, amazing quality, very aesthetic, absurdres"
session = ApiCredential(api_token=SecretStr(os.getenv("NOVELAI_JWT")))  # pst-***


async def main():
    gen = await GenerateImageInfer.build(prompt=f"1girl,{enhance}")
    cost = gen.calculate_cost(is_opus=True)
    print(f"charge: {cost} if you are vip3")
    resp: ImageGenerateResp = await gen.request(session=session)
    print(resp.meta)
    file = resp.files[0]  # (filename, bytes) pair
    with open(file[0], "wb") as f:
        f.write(file[1])


asyncio.run(main())
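
The snippet above writes only the first returned file. Assuming each entry of resp.files is a (filename, bytes) pair, as the indexing suggests, a small stdlib helper (hypothetical, not part of the SDK) can persist all of them:

```python
import pathlib


def save_files(files, out_dir):
    """Write (filename, data) pairs, shaped like resp.files, into out_dir."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    for name, data in files:
        path = out / pathlib.Path(name).name  # drop any directory components
        path.write_bytes(data)
        written.append(path)
    return written


demo = [("image_0.png", b"\x89PNG demo"), ("image_1.png", b"\x89PNG demo")]
print([p.name for p in save_files(demo, "output")])
```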

LLM

import asyncio
import os

from dotenv import load_dotenv
from pydantic import SecretStr

from novelai_python import APIError, LoginCredential
from novelai_python.sdk.ai.generate import TextLLMModel, LLM, get_default_preset, AdvanceLLMSetting
from novelai_python.sdk.ai.generate._enum import get_model_preset

load_dotenv()
username = os.getenv("NOVELAI_USER", None)
assert username is not None
# credential = JwtCredential(jwt_token=SecretStr(jwt))
login_credential = LoginCredential(
    username=os.getenv("NOVELAI_USER"),
    password=SecretStr(os.getenv("NOVELAI_PASS"))
)


async def chat(prompt: str):
    try:
        model = TextLLMModel.ERATO  # llama3-based model
        # parameters = get_default_preset(model).parameters  # the default preset's parameters
        agent = LLM.build(
            prompt=prompt,
            model=model,
            # parameters=None,  # auto-select, or take from a preset
            parameters=get_model_preset(TextLLMModel.ERATO).get_all_presets()[0].parameters,  # select from the enum presets
            advanced_setting=AdvanceLLMSetting(
                min_length=1,
                max_length=None,  # auto
            )
        )
        # NOTE: `parameters` takes precedence over `advanced_setting` (see generate/__init__.py).
        # If you do not pass `parameters`, the default preset is used, so pass your own
        # parameters to control generation; reserve `advanced_setting` for options that
        # do not affect generation itself.
        result = await agent.request(session=login_credential)
    except APIError as e:
        raise Exception(f"Error: {e.message}") from e
    print(f"Result: \n{result.text}")


asyncio.run(chat("Hello"))
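
The precedence rule noted in the comments (explicit parameters win over advanced_setting) can be sketched as a plain dict merge; this mirrors the described behavior, not the library's actual implementation:

```python
from typing import Optional


def merge_settings(advanced: dict, parameters: Optional[dict]) -> dict:
    """Explicit parameters override advanced-setting defaults (None means unset)."""
    merged = dict(advanced)
    if parameters:
        merged.update({k: v for k, v in parameters.items() if v is not None})
    return merged


print(merge_settings({"min_length": 1, "max_length": None}, {"max_length": 2048}))
# → {'min_length': 1, 'max_length': 2048}
```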

Random Prompt

from novelai_python.tool.random_prompt import RandomPromptGenerator

prompt = RandomPromptGenerator(nsfw_enabled=False).random_prompt()
print(prompt)

Run A Server

pip install novelai-python
python3 -m novelai_python.server -h '127.0.0.1' -p 7888

Tokenizer

from novelai_python._enum import get_tokenizer_model, TextLLMModel
from novelai_python.tokenizer import NaiTokenizer

tokenizer_package = NaiTokenizer(get_tokenizer_model(TextLLMModel.ERATO))
t_text = "a fox jumped over the lazy dog"
encode_tokens = tokenizer_package.encode(t_text)
print(tokenizer_package.tokenize_text(t_text))
print(f"Tokenized text: {encode_tokens}")
print(tokenizer_package.decode(tokenizer_package.encode(t_text)))

About NSFW 🚫

You may need a way to identify NSFW content and apply a mosaic to it, to prevent operational mishaps:

https://dghs-imgutils.deepghs.org/main/api_doc/detect/nudenet.html

https://dghs-imgutils.deepghs.org/main/api_doc/operate/censor.html

Acknowledgements 🙏

  • BackEnd
  • novelai-api
  • NovelAI-API

