Instructions to use lightx2v/Hy1.5-Quantized-Models with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Diffusers
How to use lightx2v/Hy1.5-Quantized-Models with Diffusers:
```
pip install -U diffusers transformers accelerate
```

```python
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import load_image, export_to_video

# switch to "mps" for Apple devices
pipe = DiffusionPipeline.from_pretrained(
    "lightx2v/Hy1.5-Quantized-Models", dtype=torch.bfloat16, device_map="cuda"
)

prompt = "A man with short gray hair plays a red electric guitar."
image = load_image(
    "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/guitar-man.png"
)
output = pipe(image=image, prompt=prompt).frames[0]
export_to_video(output, "output.mp4")
```

- Diffusion Single File
How to use lightx2v/Hy1.5-Quantized-Models with Diffusion Single File:
```python
# No code snippets available yet for this library.
# To use this model, check the repository files and the library's documentation.
# Want to help? PRs adding snippets are welcome at:
# https://github.com/huggingface/huggingface.js
```
- Notebooks
- Google Colab
- Kaggle
Will there be a 4-step LoRA for ComfyUI?
Looking forward to it.
Yes, there will be.
👍 Also, the ComfyUI version still doesn't work, and I really want to use your new VAE. Hunyuan's bottleneck is VAE decoding: not only does it require tiling, it's also slow.
The VAE hasn't been adapted for ComfyUI yet; we'll adapt it later. Are the fp8 weights ending in _comfyui still unusable?
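For context on the tiling complaint above: tiled VAE decoding (exposed in diffusers as `pipe.vae.enable_tiling()`) cuts peak memory by decoding the latent in overlapping spatial tiles and blending the seams, which is also part of why it is slow. A rough sketch of the tile-count arithmetic, with made-up tile and overlap sizes:

```python
import math

def num_tiles(size, tile, overlap):
    """Tiles of width `tile`, with `overlap` pixels shared between neighbors
    (stride = tile - overlap), needed to cover `size` pixels."""
    stride = tile - overlap
    return max(1, math.ceil((size - tile) / stride) + 1)

# Hypothetical numbers: a 720x1280 frame decoded in 256px tiles, 64px overlap.
rows = num_tiles(720, 256, 64)
cols = num_tiles(1280, 256, 64)
print(f"{rows} x {cols} = {rows * cols} tiles per frame")  # 4 x 7 = 28
```

Every tile is a separate VAE forward pass, and the overlap regions are decoded twice, so the per-frame decode cost grows well beyond a single full-frame pass.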
Still not working, same black screen. I tried hy15_480p_i2v_distilled_fp8_e4m3_lightx2v_comfyui with sage2. Also, could https://github.com/ModelTC/ComfyUI-Lightx2vWrapper be adapted to support hy1.5?
```
unet missing: ['time_in.in_layer.scale_weight', 'time_in.out_layer.scale_weight',
 'txt_in.input_embedder.scale_weight',
 'txt_in.t_embedder.in_layer.scale_weight', 'txt_in.t_embedder.out_layer.scale_weight',
 'txt_in.c_embedder.in_layer.scale_weight', 'txt_in.c_embedder.out_layer.scale_weight',
 'txt_in.individual_token_refiner.blocks.{0,1}.adaLN_modulation.1.scale_weight',
 'txt_in.individual_token_refiner.blocks.{0,1}.self_attn.qkv.scale_weight',
 'txt_in.individual_token_refiner.blocks.{0,1}.self_attn.proj.scale_weight',
 'txt_in.individual_token_refiner.blocks.{0,1}.mlp.{0,2}.scale_weight',
 'double_blocks.{0..22}.img_attn.qkv.weight', 'double_blocks.{0..22}.img_attn.qkv.bias',
 'double_blocks.{0..22}.img_attn.qkv.scale_weight',
 'double_blocks.{0..22}.txt_attn.qkv.weight', 'double_blocks.{0..22}.txt_attn.qkv.bias',
 'double_blocks.{0..22}.txt_attn.qkv.scale_weight', ...]
```
(log condensed: the same key names repeat for double_blocks 0 through 22; the original paste is truncated mid-list)
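The missing-key dump above is really just a handful of layer names repeated across block indices. A small helper (illustrative only, not part of ComfyUI or diffusers) collapses the indices to make that visible; the resulting pattern (per-layer fp8 `scale_weight` tensors plus fused `qkv` weights) suggests the checkpoint layout does not match what the loader expects:

```python
import re
from collections import Counter

def group_missing_keys(keys):
    """Collapse numeric block indices so repeated layer patterns show up once."""
    pat = re.compile(r"\.\d+\.")
    return Counter(pat.sub(".N.", k) for k in keys)

# A short excerpt of the missing keys from the log above.
missing = [
    "time_in.in_layer.scale_weight",
    "double_blocks.0.img_attn.qkv.weight",
    "double_blocks.0.img_attn.qkv.scale_weight",
    "double_blocks.1.img_attn.qkv.weight",
    "double_blocks.1.img_attn.qkv.scale_weight",
]
for pattern, n in group_missing_keys(missing).most_common():
    print(f"{n:3d}x {pattern}")
```

Run against the full log, this would show each unique pattern once with a count of 23 (double_blocks 0 through 22), which is much easier to compare against the loader's expected state dict.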
kj put out a T2V LoRA; it runs in I2V but produces very noticeable color shifts. https://huggingface.co/Comfy-Org/HunyuanVideo_1.5_repackaged