
NovelAI Python Binding With Pydantic

Project description




✨ A modern, user-friendly Python SDK for the NovelAI API, built with Pydantic.

The goal of this repository is to use Pydantic to build well-formed, validated requests for the NovelAI API service.

Python >= 3.9 is required.

Roadmap 🚧

  • tool.random_prompt
  • tool.paint_mask
  • tool.image_metadata
  • tokenizer
  • /ai/generate-image
  • /user/subscription
  • /user/login
  • /user/information
  • /ai/upscale
  • /ai/generate-image/suggest-tags
  • /ai/generate-voice
  • /ai/generate-stream
  • /ai/generate
  • /ai/augment-image
  • /ai/annotate-image
  • /ai/classify
  • /ai/generate-prompt

GenerateImageInfer.calculate_cost is correct in most cases, but for authoritative consumption figures, query your account information (for example, via /user/subscription).
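
As a cross-check, the estimate can be compared against the account data returned by /user/subscription. The snippet below is only an illustrative sketch: the import path novelai_python.sdk.user.subscription and the Subscription request class are assumptions about the SDK layout, so check the playground directory for the exact names and response fields.

import asyncio
import os

from dotenv import load_dotenv
from pydantic import SecretStr

from novelai_python import ApiCredential
from novelai_python.sdk.user.subscription import Subscription  # assumed import path

load_dotenv()
session = ApiCredential(api_token=SecretStr(os.getenv("NOVELAI_JWT")))


async def check_account():
    # Query /user/subscription with the same credential used for generation;
    # the response carries the authoritative Anlas balance and consumption data.
    resp = await Subscription().request(session=session)
    print(resp)


asyncio.run(check_account())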

This repository is now maintained by me personally. If you have any questions, please feel free to open an issue.

Usage 🖥️

pip install -U novelai-python

More examples can be found in the playground directory; read the code as documentation.

import asyncio
import os

from dotenv import load_dotenv
from pydantic import SecretStr

from novelai_python import GenerateImageInfer, ImageGenerateResp, ApiCredential

load_dotenv()
enhance = "year 2023,dynamic angle,  best quality, amazing quality, very aesthetic, absurdres"
session = ApiCredential(api_token=SecretStr(os.getenv("NOVELAI_JWT")))  # pst-***


async def main():
    gen = await GenerateImageInfer.build(prompt=f"1girl,{enhance}")
    cost = gen.calculate_cost(is_opus=True)
    print(f"charge: {cost} if you are vip3")
    resp: ImageGenerateResp = await gen.request(session=session)
    print(resp.meta)
    file = resp.files[0]  # each entry in resp.files is a (filename, file_bytes) tuple
    with open(file[0], "wb") as f:
        f.write(file[1])


loop = asyncio.get_event_loop()
loop.run_until_complete(main())

LLM

import asyncio
import os

from dotenv import load_dotenv
from pydantic import SecretStr

from novelai_python import APIError, LoginCredential
from novelai_python.sdk.ai.generate import TextLLMModel, LLM, get_default_preset, AdvanceLLMSetting
from novelai_python.sdk.ai.generate._enum import get_model_preset

load_dotenv()
username = os.getenv("NOVELAI_USER", None)
assert username is not None
# credential = JwtCredential(jwt_token=SecretStr(jwt))
login_credential = LoginCredential(
    username=os.getenv("NOVELAI_USER"),
    password=SecretStr(os.getenv("NOVELAI_PASS"))
)


async def chat(prompt: str):
    try:
        model = TextLLMModel.ERATO  # llama3
        parameters = get_default_preset(model).parameters
        agent = LLM.build(
            prompt=prompt,
            model=model,
            # parameters=None,  # Auto Select or get from preset
            parameters=get_model_preset(TextLLMModel.ERATO).get_all_presets()[0].parameters,  # Select from enum preset
            advanced_setting=AdvanceLLMSetting(
                min_length=1,
                max_length=None,  # Auto
            )
        )
        # NOTE: `parameters` takes precedence over `advanced_setting`; see the logic in generate/__init__.py.
        # If you do not pass `parameters`, the default preset for the model is used,
        # so pass your own parameters whenever you want to control generation.
        # Use `advanced_setting` only for options that should not affect generation itself.
        result = await agent.request(session=login_credential)
    except APIError as e:
        raise Exception(f"Error: {e.message}")
    print(f"Result: \n{result.text}")


loop = asyncio.get_event_loop()
loop.run_until_complete(chat("Hello"))

Random Prompt

from novelai_python.tool.random_prompt import RandomPromptGenerator

prompt = RandomPromptGenerator(nsfw_enabled=False).random_prompt()
print(prompt)

Run A Server

pip install novelai_python
python3 -m novelai_python.server -h '127.0.0.1' -p 7888
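
Once the server is running, it is meant to be called like the upstream API. The snippet below is a hypothetical client sketch: the endpoint path (taken from the roadmap), the payload fields, and the bearer-token auth scheme are all assumptions, so consult novelai_python.server for the actual routes and request schema.

import os

import requests  # third-party HTTP client, install with `pip install requests`

# Hypothetical call against the local proxy started above; adjust the path and
# payload to match the routes actually exposed by novelai_python.server.
resp = requests.post(
    "http://127.0.0.1:7888/ai/generate-image",
    headers={"Authorization": f"Bearer {os.getenv('NOVELAI_JWT')}"},
    json={"input": "1girl, best quality", "model": "nai-diffusion-3"},
)
print(resp.status_code)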

Tokenizer

from novelai_python._enum import get_tokenizer_model, TextLLMModel
from novelai_python.tokenizer import NaiTokenizer

tokenizer_package = NaiTokenizer(get_tokenizer_model(TextLLMModel.ERATO))
t_text = "a fox jumped over the lazy dog"
encode_tokens = tokenizer_package.encode(t_text)
print(tokenizer_package.tokenize_text(t_text))
print(f"Tokenized text: {encode_tokens}")
print(tokenizer_package.decode(tokenizer_package.encode(t_text)))

About NSFW 🚫

You may need a way to identify NSFW content and censor it (for example, with a mosaic) to prevent operational mishaps.

https://dghs-imgutils.deepghs.org/main/api_doc/detect/nudenet.html

https://dghs-imgutils.deepghs.org/main/api_doc/operate/censor.html
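
As a sketch of that workflow, the snippet below uses dghs-imgutils (pip install dghs-imgutils) to detect nude regions and cover them in a single call. The censor_nsfw helper and the 'pixelate' method are taken from the documentation linked above; verify the exact signature against the version you install.

from imgutils.operate import censor_nsfw

# Detect NSFW regions in a generated image and overlay a mosaic on them.
# 'pixelate' is one of the censor methods documented at the second link above.
censored = censor_nsfw('generated.png', method='pixelate')
censored.save('generated_censored.png')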

Acknowledgements 🙏

BackEnd

  • novelai-api
  • NovelAI-API

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

novelai_python-0.5.2.tar.gz (2.0 MB)

Uploaded Source

Built Distribution

novelai_python-0.5.2-py3-none-any.whl (2.0 MB)

Uploaded Python 3

File details

Details for the file novelai_python-0.5.2.tar.gz.

File metadata

  • Download URL: novelai_python-0.5.2.tar.gz
  • Upload date:
  • Size: 2.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.19.1 CPython/3.10.12 Linux/6.8.0-1014-azure

File hashes

Hashes for novelai_python-0.5.2.tar.gz

  • SHA256: d421ac57547cd2b4daa2bf40e59f3e1ffe79b770a014ab6d3d17f97c8471790b
  • MD5: 3019dafda7655f778b93b1049dd79109
  • BLAKE2b-256: 26604461e455a9af4779c2d0de27308b6f2d6fc7902dad17bec8e4137c1ed946


File details

Details for the file novelai_python-0.5.2-py3-none-any.whl.

File metadata

  • Download URL: novelai_python-0.5.2-py3-none-any.whl
  • Upload date:
  • Size: 2.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.19.1 CPython/3.10.12 Linux/6.8.0-1014-azure

File hashes

Hashes for novelai_python-0.5.2-py3-none-any.whl

  • SHA256: 7fae5f8084fd040ee7373da9cb0c01be0ad64afc7a24c0b1cec1daf2921c5db4
  • MD5: 732152dab58a0476dcf73ef5fb921952
  • BLAKE2b-256: ef7dbd5d60a1b4cb0d9450a01c38c67da5b2ee7778ccd00d3824fdfc21130df5

