Hunter0726 / Stable Diffusion Webui

Commit 87cd07b3, authored 1 year ago by Nuullll
Commit message: Fix fp64
Parent: 7499148a
Changed files: 2 (3 additions, 3 deletions)
  modules/sd_samplers_timesteps_impl.py  +2 −2
  modules/xpu_specific.py                +1 −1
modules/sd_samplers_timesteps_impl.py (+2 −2)

@@ -11,7 +11,7 @@ from modules.models.diffusion.uni_pc import uni_pc
 def ddim(model, x, timesteps, extra_args=None, callback=None, disable=None, eta=0.0):
     alphas_cumprod = model.inner_model.inner_model.alphas_cumprod
     alphas = alphas_cumprod[timesteps]
-    alphas_prev = alphas_cumprod[torch.nn.functional.pad(timesteps[:-1], pad=(1, 0))].to(torch.float64 if x.device.type != 'mps' else torch.float32)
+    alphas_prev = alphas_cumprod[torch.nn.functional.pad(timesteps[:-1], pad=(1, 0))].to(torch.float64 if x.device.type != 'mps' and x.device.type != 'xpu' else torch.float32)
     sqrt_one_minus_alphas = torch.sqrt(1 - alphas)
     sigmas = eta * np.sqrt((1 - alphas_prev.cpu().numpy()) / (1 - alphas.cpu()) * (1 - alphas.cpu() / alphas_prev.cpu().numpy()))
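The `torch.nn.functional.pad(timesteps[:-1], pad=(1, 0))` indexing in this hunk shifts the timestep schedule by one position, so that `alphas_prev[i]` reads the cumulative alpha of the previous step, with a 0 prepended for the first step. A minimal pure-Python sketch of that shift (the function name and schedule values are illustrative, not from the repository):

```python
def shift_with_leading_zero(timesteps):
    # Mimics torch.nn.functional.pad(timesteps[:-1], pad=(1, 0)):
    # drop the last entry, then prepend a single 0 on the left.
    return [0] + list(timesteps[:-1])

schedule = [1, 301, 601, 901]             # hypothetical timestep schedule
print(shift_with_leading_zero(schedule))  # [0, 1, 301, 601]
```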
@@ -43,7 +43,7 @@ def ddim(model, x, timesteps, extra_args=None, callback=None, disable=None, eta=
 def plms(model, x, timesteps, extra_args=None, callback=None, disable=None):
     alphas_cumprod = model.inner_model.inner_model.alphas_cumprod
     alphas = alphas_cumprod[timesteps]
-    alphas_prev = alphas_cumprod[torch.nn.functional.pad(timesteps[:-1], pad=(1, 0))].to(torch.float64 if x.device.type != 'mps' else torch.float32)
+    alphas_prev = alphas_cumprod[torch.nn.functional.pad(timesteps[:-1], pad=(1, 0))].to(torch.float64 if x.device.type != 'mps' and x.device.type != 'xpu' else torch.float32)
     sqrt_one_minus_alphas = torch.sqrt(1 - alphas)
     extra_args = {} if extra_args is None else extra_args
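The substance of both hunks is the dtype choice: float64 is not supported on Apple's MPS backend, and this commit extends the same float32 fallback to Intel's XPU backend. A minimal sketch of that selection rule using plain device-type strings (the helper name is hypothetical, not repository code):

```python
def widest_float_dtype(device_type: str) -> str:
    # 'mps' (Apple GPU) lacks float64; the commit adds 'xpu' (Intel GPU)
    # to the same fallback. Every other backend keeps full precision.
    if device_type in ('mps', 'xpu'):
        return 'float32'
    return 'float64'

print(widest_float_dtype('cuda'))  # float64
print(widest_float_dtype('xpu'))   # float32
```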
modules/xpu_specific.py (+1 −1)

@@ -4,7 +4,7 @@ from modules.sd_hijack_utils import CondFunc
 has_ipex = False
 try:
     import torch
-    import intel_extension_for_pytorch as ipex
+    import intel_extension_for_pytorch as ipex  # noqa: F401
     has_ipex = True
 except Exception:
     pass
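The `# noqa: F401` added in `modules/xpu_specific.py` tells flake8 that the apparently unused import is intentional: importing `intel_extension_for_pytorch` is wanted for its side effects (registering the XPU backend), and the surrounding `try`/`except` sets `has_ipex` accordingly. The same optional-dependency probe pattern can be sketched generically with the standard library (the function name is an illustration, not repository code):

```python
import importlib

def module_available(name: str) -> bool:
    # Attempt the import for its side effects; any failure means the
    # optional dependency is unusable on this machine.
    try:
        importlib.import_module(name)
        return True
    except Exception:
        return False

print(module_available('json'))  # True
```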