Releases: ljleb/sd-mecha

0.0.26

06 Sep 16:21
ef4e793
Pre-release
  • allow merging in delta space by passing strict_weight_space=False to RecipeMerger.merge_and_save() (see the first sketch after this list)
  • handle unexpected float types when expecting an integer in some merge methods
  • replace the parameter no_rescale of the method ties_sum_with_dropout with rescale and use a differentiable implementation
  • use torch.Generator instead of torch.manual_seed (see the second sketch after this list)
  • revert the use of torch.svd_lowrank in rotate when alignment is fractional, as it otherwise results in NaNs during inference
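
A minimal sketch of merging in delta space with the new flag. The recipe helpers used here (sd_mecha.model, sd_mecha.subtract) and the output parameter are assumptions based on the library's recipe-style API; only RecipeMerger.merge_and_save() and strict_weight_space=False are confirmed by these notes:

```python
import sd_mecha

# Build a recipe that stays in delta space: the difference between a
# fine-tune and its base model (helper names are assumptions).
base = sd_mecha.model("base.safetensors")
tuned = sd_mecha.model("finetune.safetensors")
delta = sd_mecha.subtract(tuned, base)

merger = sd_mecha.RecipeMerger()
# strict_weight_space=False allows the result to be saved as a delta
# rather than requiring a full weight-space model.
merger.merge_and_save(delta, output="delta.safetensors", strict_weight_space=False)
```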
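
For context on the torch.Generator change: a local generator makes stochastic merge methods reproducible without mutating PyTorch's global RNG state, which torch.manual_seed does. A minimal illustration in plain PyTorch:

```python
import torch

# Global seeding: affects every subsequent random op in the process.
torch.manual_seed(42)
a = torch.rand(3)

# Local generator: reproducible randomness scoped to this generator,
# leaving the global RNG state untouched.
g = torch.Generator().manual_seed(42)
b = torch.rand(3, generator=g)

assert torch.equal(a, b)  # same seed, same sequence on CPU
```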

0.0.25

03 Aug 06:08
Pre-release
  • fix a bug where the model configurations would silently fail to parse
  • fix a None seed not being supported as a default hyper value

0.0.24

03 Aug 03:48
bdb671d
Pre-release
  • remove CWM (sd_mecha.hypers.classes)

0.0.23

02 Aug 17:53
Pre-release
  • speed up the rotate method by ~2x using torch.svd_lowrank
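
For reference, torch.svd_lowrank computes a randomized rank-q SVD, which is much faster than a full SVD when only the leading singular vectors are needed, as in an orthogonal-alignment step; the matrix size and rank below are illustrative. Note that 0.0.26 later reverts this path when alignment is fractional, since it produced NaNs during inference.

```python
import torch

W = torch.randn(768, 768)

# Full SVD: exact, but cubic in the matrix dimension.
U_full, S_full, Vh_full = torch.linalg.svd(W, full_matrices=False)

# Randomized low-rank SVD: approximates only the top-q components.
q = 64
U, S, V = torch.svd_lowrank(W, q=q, niter=2)  # note: returns V, not V^H

# Rank-q approximation of W from the low-rank factors.
W_q = U @ torch.diag(S) @ V.T
```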

0.0.21

30 Jul 07:36
Pre-release
  • add new builtin methods: n_average, geometric_median, ties_sum_extended / add_difference_ties_extended, ties_sum_with_dropout / ties_with_dare, model_stock_for_tensor / model_stock_n_models
  • add a new parameter vote_sgn to ties_sum / add_difference_ties (a usage sketch follows below)

credits to @6DammK9 for these contributions!
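
A hypothetical usage sketch of the new vote_sgn option. Apart from the method and parameter names quoted in the notes above, the helpers, arguments, and values here are assumptions rather than the documented API:

```python
import sd_mecha

# Assumed recipe helpers; only add_difference_ties and vote_sgn are
# confirmed by these release notes.
base = sd_mecha.model("base.safetensors")
a = sd_mecha.model("finetune_a.safetensors")
b = sd_mecha.model("finetune_b.safetensors")

# TIES merge of two task vectors over a shared base, with sign voting
# (vote_sgn) enabled; treating it as a flag-style hyper is an assumption.
recipe = sd_mecha.add_difference_ties(base, a, b, alpha=0.5, vote_sgn=1.0)

sd_mecha.RecipeMerger().merge_and_save(recipe, output="ties_merge.safetensors")
```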

0.0.20

07 Jul 03:05
Pre-release
  • revert sd_mecha.train_difference to the original implementation from supermerger
  • add 3 new methods: add_opposite, clamped_add_opposite, and select_max_delta (a conceptual sketch of select_max_delta follows the note below)

Note that clamped_add_opposite corresponds to the implementation of train_difference in 0.0.19 and earlier versions.
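
A conceptual sketch of the idea behind select_max_delta, assuming it keeps, per element, the task vector with the largest magnitude; this illustrates the concept only and is not the library's actual implementation:

```python
import torch

def select_max_delta_sketch(deltas: torch.Tensor) -> torch.Tensor:
    """Pick, element-wise, the delta with the largest magnitude.

    deltas: stacked task vectors of shape (n_models, *weight_shape).
    """
    idx = deltas.abs().argmax(dim=0, keepdim=True)
    return deltas.gather(0, idx).squeeze(0)

# Example: two deltas over the same weight tensor.
d1 = torch.tensor([0.3, -0.9, 0.1])
d2 = torch.tensor([-0.5, 0.2, 0.05])
print(select_max_delta_sketch(torch.stack([d1, d2])))
# tensor([-0.5000, -0.9000,  0.1000])
```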

0.0.17

14 Jun 15:54
Pre-release
  • rename the method clip to clamp to disambiguate it from the CLIP text encoder
  • I originally mislabelled the tag as 0.17 and I am too lazy to fix the commit target after creating the new 0.0.17 tag (the GitHub UI doesn't allow me to enter a commit hash), so here is the appropriate commit hash of the release: f8b0800

0.0.19

02 Jul 20:14
Pre-release
  • raise an error for non-LoRA keys when loading a lora model type in #33
  • fix a bug where keys of input models were directly used as a fallback when applying a LoRA to a composite recipe in #34

0.0.18

18 Jun 20:05
Pre-release
  • change the default value of the alpha parameter in add_difference from 0.5 to 1.0 (see the sketch below)
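
To keep the pre-0.0.18 behavior, pass alpha explicitly. A minimal sketch, assuming the recipe-style call below; the positional arguments are an assumption:

```python
import sd_mecha

a = sd_mecha.model("model_a.safetensors")
b = sd_mecha.model("model_b.safetensors")
c = sd_mecha.model("base.safetensors")

# As of 0.0.18 the default is alpha=1.0; pass alpha=0.5 explicitly
# to reproduce the old default.
recipe = sd_mecha.add_difference(a, b, c, alpha=0.5)
```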

0.0.16

12 Jun 18:35
Pre-release
  • add the t5xxl module to the sd3 architecture