dverdu-freepik committed · Commit f9eaa0b · 1 Parent(s): bcf3de6
fix: Update README.md

README.md CHANGED
@@ -15,21 +15,21 @@ tags:
 
 # Flux.1 Lite
 
-We
+We are excited to announce the alpha version of our new distilled Flux.1 Lite model, an 8B-parameter transformer distilled from the original FLUX.1-dev model.
 
-Our goal is to further reduce FLUX.1-dev transformer parameters
+Our goal is to further reduce the FLUX.1-dev transformer's parameter count so that the model fits in approximately 24 GB, making it compatible with most consumer GPU cards.
 
 ![Flux.1 Lite vs FLUX.1-dev](./sample_images/models_comparison.png)
 
 ## Motivation
 
-As stated by other members of the community like [Ostris](https://ostris.com/2024/09/07/skipping-flux-1-dev-blocks/), it seems that each of the blocks of the Flux1.dev transformer is not contributing equally to the final image generation. To confirm this hypothesis, we can measure the MSE between the input and the output of each block. As we can see in the following images, the mse differs a lot between the different blocks.
+As noted by other members of the community, such as [Ostris](https://ostris.com/2024/09/07/skipping-flux-1-dev-blocks/), the blocks of the FLUX.1-dev transformer do not all contribute equally to the final image generation. To confirm this hypothesis, we can measure the Mean Squared Error (MSE) between the input and the output of each block. As the following images show, the MSE varies considerably across blocks.
 
 ![Flux.1 Lite generated image](./sample_images/skip_blocks/generated_img.png)
 ![MSE MMDIT](./sample_images/skip_blocks/mse_mmdit_img.png)
 ![MSE DIT](./sample_images/skip_blocks/mse_dit_img.png)
 
-Furthermore, as displayed in the following image, only when you skip one of the first MMDIT blocks, the performance of the model
+Furthermore, as displayed in the following images, only skipping one of the first MMDIT blocks severely impacts the model's performance.
 ![Skip one MMDIT block](./sample_images/skip_blocks/skip_one_MMDIT_block.png)
 ![Skip one DIT block](./sample_images/skip_blocks/skip_one_DIT_block.png)
 
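For readers who want to reproduce the block-wise MSE measurement described in the hunk above, the following is a minimal sketch, not code from the repo. It uses PyTorch forward hooks on the double-stream (MMDIT) blocks, which current diffusers versions expose as `pipe.transformer.transformer_blocks`; the names `mse_per_block` and `make_hook`, and the single-step probe run, are our own choices.

```python
import torch
import torch.nn.functional as F
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
).to("cuda")

mse_per_block = {}

def make_hook(name):
    # Compare each block's input hidden states with its output hidden states.
    def hook(module, args, kwargs, output):
        hidden_in = kwargs.get("hidden_states", args[0] if args else None)
        # Double-stream Flux blocks return (encoder_hidden_states, hidden_states).
        hidden_out = output[1] if isinstance(output, tuple) else output
        mse_per_block[name] = F.mse_loss(hidden_in.float(), hidden_out.float()).item()
    return hook

handles = [
    block.register_forward_hook(make_hook(f"mmdit_{i}"), with_kwargs=True)
    for i, block in enumerate(pipe.transformer.transformer_blocks)
]

# One denoising step is enough for a rough per-block importance signal.
pipe(prompt="a test prompt", num_inference_steps=1)
for handle in handles:
    handle.remove()

# Blocks with low input/output MSE barely transform the signal and are
# the natural candidates for skipping or distillation.
print(sorted(mse_per_block.items(), key=lambda kv: kv[1]))
```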
@@ -54,7 +54,7 @@ pipe = FluxPipeline.from_pretrained(
 # Inference
 prompt = "A close-up image of a green alien with fluorescent skin in the middle of a dark purple forest"
 
-guidance_scale = 3.5 #
+guidance_scale = 3.5  # Keep guidance_scale at 3.5
 n_steps = 28
 seed = 11
 
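The hunk above edits only a comment inside the README's Python usage snippet; its header shows the fragment sits below a `FluxPipeline.from_pretrained(` call. A minimal end-to-end version of that snippet might look as follows; the model id `Freepik/flux.1-lite-8B-alpha`, the dtype, and the generator handling are our assumptions rather than text from the diff.

```python
import torch
from diffusers import FluxPipeline

# Sketch of the surrounding inference code; model id and dtype are assumptions.
pipe = FluxPipeline.from_pretrained(
    "Freepik/flux.1-lite-8B-alpha", torch_dtype=torch.bfloat16
).to("cuda")

# Inference
prompt = "A close-up image of a green alien with fluorescent skin in the middle of a dark purple forest"

guidance_scale = 3.5  # Keep guidance_scale at 3.5
n_steps = 28
seed = 11

with torch.inference_mode():
    image = pipe(
        prompt=prompt,
        guidance_scale=guidance_scale,
        num_inference_steps=n_steps,
        generator=torch.Generator(device="cpu").manual_seed(seed),
    ).images[0]
image.save("output.png")
```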
@@ -78,7 +78,7 @@ We also provided a ComfyUI workflow in `comfy/flux.1-lite_workflow.json`
 * `flux.1-lite-8B-alpha.safetensors`: Transformer checkpoint, in Flux original format.
 * `transformers/`: Contains the distilled 8B transformer model, in diffusers format.
 
-## 🤗 Hugging Face
+## 🤗 Hugging Face space
 Flux.1 Lite demo hosted on [🤗 flux.1-lite](https://huggingface.co/spaces/Freepik/flux.1-lite)
 
 ## 🔥 News 🔥
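The two checkpoint formats listed in the hunk above serve different loaders: the single `.safetensors` file targets original Flux tooling such as the ComfyUI workflow named in the hunk header, while the diffusers-format folder plugs into a diffusers pipeline. A hedged sketch of the latter; the repo id is an assumption and the `transformers` subfolder name is taken from the listing above.

```python
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel

# Load only the distilled 8B transformer (diffusers format), then reuse the
# remaining FLUX.1-dev components (text encoders, VAE, scheduler) around it.
transformer = FluxTransformer2DModel.from_pretrained(
    "Freepik/flux.1-lite-8B-alpha",  # repo id is an assumption
    subfolder="transformers",        # folder name as listed in the README
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
).to("cuda")
```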