- April 7, 2024
Authored by kabachuha
Add option to use Scheduled Huber Loss in all training pipelines to improve resilience to data corruption (#1228)
- add Huber loss and huber_c computation to train_util, with reduction modes
- retrieve huber_c from the timestep getter, and move timestep and huber_c retrieval into their own function
- add a conditional loss to all training scripts, including train_network
- add (scheduled) Huber loss options to the arguments; fix the duplicated timestep retrieval
- make the pseudo-Huber loss schedule depend on the noise scheduler's number of timesteps
- multiply the Huber loss by 2: the pseudo-Huber loss δ²(√(1 + (a/δ)²) − 1) expands to a²/2 near zero, which differs from the a² of the standard MSE loss, so the factor scales the two against one another
- add an option for smooth L1 (Huber / delta), unify Huber scheduling, and add an SNR Huber scheduler

Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
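To make the idea concrete, here is a minimal sketch of a scheduled pseudo-Huber loss, assuming hypothetical names (`scheduled_huber_c`, `conditional_loss`, an exponential schedule decaying to `huber_c`); the actual train_util code differs in details such as argument handling and the SNR schedule.

```python
# Sketch only: scheduled pseudo-Huber loss, not the exact train_util implementation.
import torch
import torch.nn.functional as F


def scheduled_huber_c(timesteps: torch.Tensor, num_train_timesteps: int,
                      huber_c: float = 0.1, huber_schedule: str = "exponential") -> torch.Tensor:
    """Return a per-sample delta for the pseudo-Huber loss.

    With the (assumed) exponential schedule, delta decays from 1 at timestep 0
    to huber_c at the noise scheduler's final timestep, so the schedule depends
    on num_train_timesteps as the commit describes.
    """
    if huber_schedule == "constant":
        return torch.full((timesteps.shape[0],), huber_c)
    if huber_schedule == "exponential":
        alpha = -torch.log(torch.tensor(huber_c)) / num_train_timesteps
        return torch.exp(-alpha * timesteps.float())
    raise ValueError(f"unknown huber_schedule: {huber_schedule}")


def conditional_loss(pred: torch.Tensor, target: torch.Tensor,
                     huber_c: torch.Tensor, loss_type: str = "huber") -> torch.Tensor:
    """Dispatch between MSE and pseudo-Huber loss for a batch of 4D latents."""
    if loss_type == "l2":
        return F.mse_loss(pred, target)
    # Broadcast the per-sample delta over the (C, H, W) dimensions.
    c = huber_c.to(pred.device).view(-1, 1, 1, 1)
    # Pseudo-Huber: c^2 * (sqrt(1 + ((pred - target) / c)^2) - 1).
    # Its small-residual limit is a^2 / 2, so the factor 2 rescales it to
    # match MSE's a^2, as the commit's multiplier note explains.
    loss = 2.0 * c**2 * (torch.sqrt(((pred - target) / c) ** 2 + 1.0) - 1.0)
    return loss.mean()
```

For large residuals the loss grows roughly linearly in |pred − target| rather than quadratically, which is what makes it more resilient to corrupted training samples than plain MSE.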
Authored by Kohaku-Blueleaf
[Experimental] Add cache mechanism for dataset groups to avoid long waiting times during initialization (#1178)
- support meta-cached datasets and add cache-meta scripts
- randomize ip_noise_gamma and noise_offset strengths
- use correct settings for the parser
- cache path/caption/size only
- add arguments for the meta cache
- remove the pickle implementation; revert two mistaken commits; update requirements.txt
- return sizes when the cache is enabled

Co-authored-by: Kohya S <52813779+kohya-ss@users.noreply.github.com>
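A minimal sketch of the caching idea follows: persist path/caption/size per image so later runs skip the full directory scan. The cache file name (`meta_cache.json`), helper name (`load_or_build_meta`), and JSON layout are assumptions for illustration, not the scripts' actual format.

```python
# Sketch only: a path/caption/size metadata cache for a dataset directory.
import json
import os

from PIL import Image

CACHE_FILE = "meta_cache.json"  # hypothetical cache location inside the dataset dir


def load_or_build_meta(image_dir: str) -> dict:
    cache_path = os.path.join(image_dir, CACHE_FILE)
    if os.path.exists(cache_path):
        with open(cache_path, "r", encoding="utf-8") as f:
            return json.load(f)  # fast path: no per-image I/O on startup

    meta = {}
    for name in sorted(os.listdir(image_dir)):
        if not name.lower().endswith((".png", ".jpg", ".jpeg", ".webp")):
            continue
        path = os.path.join(image_dir, name)
        # Caption lives in a sibling .txt file, per the usual sd-scripts layout.
        caption_path = os.path.splitext(path)[0] + ".txt"
        caption = ""
        if os.path.exists(caption_path):
            with open(caption_path, "r", encoding="utf-8") as f:
                caption = f.read().strip()
        with Image.open(path) as img:
            width, height = img.size  # sizes are returned so bucketing still works
        meta[name] = {"caption": caption, "size": [width, height]}

    # JSON rather than pickle, in line with the commit's "remove pickle" note.
    with open(cache_path, "w", encoding="utf-8") as f:
        json.dump(meta, f)
    return meta
```

Caching only path, caption, and size keeps the file small and cheap to validate, while removing the per-image open/read work that dominates initialization for large dataset groups.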