A prompting enhancement library for transformers-type text embedding systems.
Compel
A text prompt weighting and blending library for transformers-type text embedding systems, by @damian0815.
With a flexible and intuitive syntax, you can re-weight different parts of a prompt string and thus re-weight the different parts of the embedding tensor produced from the string.
Tested and developed against Hugging Face's StableDiffusionPipeline, but it should work with any diffusers-based system that uses a Tokenizer and a Text Encoder of some kind.
Adapted from the InvokeAI prompting code (also by @damian0815). For now, the syntax is fully documented here - note, however, that cross-attention control .swap() is currently ignored by Compel.
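For illustration, a few prompt strings using the weighting syntax (the `++`/`--` and explicit-weight forms shown here follow the linked syntax documentation; see it for the full grammar):

```python
# Illustrative Compel weighting syntax; see the full syntax docs for details.
prompts = [
    "a cat playing with a ball++ in the forest",     # each '+' upweights "ball"
    "a cat playing with a ball-- in the forest",     # each '-' downweights "ball"
    "a cat playing with a (ball)1.4 in the forest",  # explicit weight of 1.4
]
```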
Installation
pip install compel
Demo
Quickstart
With Hugging Face diffusers >= 0.12:
from diffusers import StableDiffusionPipeline
from compel import Compel
pipeline = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
compel = Compel(tokenizer=pipeline.tokenizer, text_encoder=pipeline.text_encoder)
# upweight "ball"
prompt = "a cat playing with a ball++ in the forest"
conditioning = compel.build_conditioning_tensor(prompt)
# generate image
images = pipeline(prompt_embeds=conditioning, num_inference_steps=20).images
images[0].save("image.jpg")
For batched input, use torch.cat to merge multiple conditioning tensors into one:
import torch
from diffusers import StableDiffusionPipeline
from compel import Compel
pipeline = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
compel = Compel(tokenizer=pipeline.tokenizer, text_encoder=pipeline.text_encoder)
prompts = ["a cat playing with a ball++ in the forest", "a dog playing with a ball in the forest"]
prompt_embeds = torch.cat([compel.build_conditioning_tensor(prompt) for prompt in prompts])
images = pipeline(prompt_embeds=prompt_embeds).images
images[0].save("image0.jpg")
images[1].save("image1.jpg")
Changelog
0.1.9 - add support for prompts longer than the model's max token length. To enable, initialize Compel with truncate_long_prompts=False (default is True). Prompts longer than the model's max_token_length will be chunked and padded out to an integer multiple of max_token_length.
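The length calculation described above can be sketched as follows (an illustrative helper, not Compel's actual code, assuming a CLIP-style max_token_length of 77):

```python
import math

def padded_length(n_tokens: int, max_token_length: int = 77) -> int:
    # Round the token count up to the next integer multiple of
    # max_token_length, as described above.
    return math.ceil(n_tokens / max_token_length) * max_token_length

print(padded_length(100))  # a 100-token prompt is padded out to 154
```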
If you're working with a negative prompt, you will probably need to use compel.pad_conditioning_tensors_to_same_length() to avoid having the model complain about mismatched conditioning tensor lengths:
compel = Compel(..., truncate_long_prompts=False)
prompt = "a cat playing with a ball++ in the forest, amazing, exquisite, stunning, masterpiece, skilled, powerful, incredible, amazing, trending on gregstation, greg, greggy, greggs greggson, greggy mcgregface, ..." # very long prompt
negative_prompt = "dog, football, rainforest" # short prompt
conditioning = compel.build_conditioning_tensor(prompt)
negative_conditioning = compel.build_conditioning_tensor(negative_prompt)
[conditioning, negative_conditioning] = compel.pad_conditioning_tensors_to_same_length([conditioning, negative_conditioning])
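The effect of padding conditioning tensors to the same length can be sketched roughly like this (a minimal zero-padding sketch for intuition only; in practice, use Compel's own pad_conditioning_tensors_to_same_length):

```python
import torch

def pad_to_same_length(tensors):
    # Minimal sketch: zero-pad each [batch, seq_len, dim] conditioning
    # tensor along the sequence axis until all share the longest length.
    max_len = max(t.shape[1] for t in tensors)
    padded = []
    for t in tensors:
        if t.shape[1] < max_len:
            pad = torch.zeros(t.shape[0], max_len - t.shape[1], t.shape[2],
                              dtype=t.dtype, device=t.device)
            t = torch.cat([t, pad], dim=1)
        padded.append(t)
    return padded
```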
0.1.8 - downgrade Python min version to 3.7
0.1.7 - InvokeAI compatibility