
/vt/ - Virtual Youtubers


>> No.45726097 [SPOILER]
File: 480 KB, 1400x4000, catbox_ckl5ta.jpg

Here it is: the merged bbw models. Pic related is a comparison between all of the combinations I made, using a prompt that someone might actually use with this model. The VAE is the Anything v3.0 one; I think going with a more saturated VAE is probably the better choice. Kuki still needs more work: the model I merged with needs less aggressive weighting on certain blocks of the UNet so that bbw-step isn't diminished too much.

https://gofile.io/d/AzRqHS
(All of the models in the above link still need to be pruned; see the sketch below for what that involves.)
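If you haven't pruned a checkpoint before, here is a minimal sketch of what pruning usually means in this context (dropping the EMA bookkeeping and down-casting fp32 weights to fp16), assuming a standard SD 1.x .ckpt layout. The filenames are placeholders, not the actual files in the link:

import torch

# Load a full checkpoint; "state_dict" is the usual top-level key in SD 1.x .ckpt files.
ckpt = torch.load("model-full.ckpt", map_location="cpu")
sd = ckpt.get("state_dict", ckpt)

pruned = {}
for key, tensor in sd.items():
    if key.startswith("model_ema."):
        continue  # the EMA copy roughly doubles the file size and isn't needed for inference
    if isinstance(tensor, torch.Tensor) and tensor.dtype == torch.float32:
        tensor = tensor.half()  # fp32 -> fp16
    pruned[key] = tensor

torch.save({"state_dict": pruned}, "model-pruned-fp16.ckpt")

There are also webui extensions and standalone scripts floating around that do exactly this, so use whichever you trust.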


Some general notes regarding the model:
The model is based on Anything v4.5, which means it should be compatible with most LoRAs and embeddings. If you want backgrounds that are even more detailed, the token "detailed background" will activate dpep3 much more aggressively. As for negative prompts, go with whatever you would use for an Anything v4.5 or AOM2 based model.
Some notes regarding the Cocoa and Tea merges:
These two inherit some oddities from Cocoa that cause the "masterpiece" keyword to produce weird results, though not consistently, so experiment with it both in and out of your prompt; I have had good results either way. Additionally, make sure your clip skip is set to 2, otherwise you might run into further oddities. An example settings block follows.
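If you want a concrete starting point, here is a hypothetical settings block in the style of the webui's parameters readout. The prompt tokens are made up for illustration and are not the prompt from pic related; only the clip skip, VAE, and negative-prompt style follow the notes above:

masterpiece, 1girl, solo, detailed background, city street at night
Negative prompt: (worst quality, low quality:1.4)
Steps: 28, Sampler: DPM++ 2M Karras, CFG scale: 7, Clip skip: 2, VAE: Anything-V3.0.vae.pt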


Recipe for the merged model
First, merge dpep3-chillout into bbw_step_30000k (that is, make bbw_step model A and dpep3-chillout model B)
(find dpep3-chillout here: https://huggingface.co/closertodeath/ctdmixes/blob/main/dpep3-chillout.safetensors )
with the following Merge Block Weights (you cannot use the default checkpoint merger; use supermerger or another extension that can do MBW. A sketch of what the per-block weighted sum computes follows the recipe):
1,0.9,0.7,0.5,0.3,0.1,0.4,0.4,1,1,1,1,0,0,0,0,0,0,0,0.1,0.3,0.5,0.7,0.9,1
(You can start from the half version in the above link if you want; the fp16 and the normal version did not have any hash differences)
Then merge the model you just made (I would call it bbw-dpep3-chillout) with Cocoa at the following block weights:
0.15,0.15,0.15,0.15,0.15,0.31,0.3,0.3,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15,0.15
(NOTE: FOR ANY MERGE INVOLVING THIS SPECIFIC MERGED MODEL, ONLY MERGE WITH THE FULL-SIZED VERSION, AS THE HALF-SIZED ONE HAS A DIFFERENT HASH FROM THE FULL-SIZED ONE)
Then merge the model you just made (I would call it bbw-dp3c-Cocoa) with Tea at the following block weights:
0.35,0,0,0,0,0,0.1,0.1,0.1,0,0,0,0,0,0,0,0,0,0,0,0,0.35,0.35,0.35,0.35

(You can find Cocoa and Tea here: https://huggingface.co/andite/desserts/tree/main . Cocoa is most likely a mix of YohanMix and Blossom-extract, but I have very little information on Tea.)
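For anyone curious what MBW actually does under the hood, here is a rough Python sketch of the per-block weighted sum, using step 1's weights. To be clear, this is my approximation of what supermerger computes, not its actual code; the key parsing assumes the standard SD 1.x UNet layout, the filenames are placeholders, and the base alpha for non-UNet keys (text encoder etc.) is assumed to be 0, meaning they come straight from model A:

import re
import torch
from safetensors.torch import load_file, save_file

# 25 per-block alphas, applied as result = (1 - alpha) * A + alpha * B
ALPHAS = [1, 0.9, 0.7, 0.5, 0.3, 0.1, 0.4, 0.4, 1, 1, 1, 1,  # IN00-IN11
          0,                                                 # M00
          0, 0, 0, 0, 0, 0, 0.1, 0.3, 0.5, 0.7, 0.9, 1]      # OUT00-OUT11

def block_alpha(key, base_alpha=0.0):
    m = re.match(r"model\.diffusion_model\.(input_blocks|middle_block|output_blocks)(?:\.(\d+))?", key)
    if not m:
        return base_alpha  # text encoder, VAE, time embeddings, etc.
    part, idx = m.group(1), int(m.group(2) or 0)
    if part == "input_blocks":
        return ALPHAS[idx]        # indices 0-11
    if part == "middle_block":
        return ALPHAS[12]         # index 12
    return ALPHAS[13 + idx]       # indices 13-24

a = load_file("bbw_step_30000k.safetensors")  # model A
b = load_file("dpep3-chillout.safetensors")   # model B
merged = {}
for key, ta in a.items():
    if key in b:
        alpha = block_alpha(key)
        merged[key] = (1 - alpha) * ta + alpha * b[key].to(ta.dtype)
    else:
        merged[key] = ta  # keep A's tensor where B has no matching key
save_file(merged, "bbw-dpep3-chillout.safetensors")

And if you are ever unsure whether you grabbed the full-sized or the half-sized file, comparing SHA-256 hashes settles it:

import hashlib
print(hashlib.sha256(open("bbw-dpep3-chillout.safetensors", "rb").read()).hexdigest())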

Anyways, let me know what you think. I feel like the merge really fixed a lot of the issues the base model had. My favorite is either the Cocoa or the Tea merge; I haven't done enough prompting with either to really come to a conclusion. I'm probably going to experiment with merging HLL at some point as well.
