add fallback for xformers_attnblock_forward
AUTOMATIC1111 committed Oct 8, 2022
1 parent a5550f0 commit f9c5da1
Showing 1 changed file with 4 additions and 1 deletion.
modules/sd_hijack_optimizations.py — 5 changes: 4 additions & 1 deletion
@@ -211,11 +211,14 @@ def cross_attention_attnblock_forward(self, x):
     return h3

 def xformers_attnblock_forward(self, x):
+    try:
         h_ = x
         h_ = self.norm(h_)
         q1 = self.q(h_).contiguous()
         k1 = self.k(h_).contiguous()
         v = self.v(h_).contiguous()
         out = xformers.ops.memory_efficient_attention(q1, k1, v)
         out = self.proj_out(out)
-        return x+out
+        return x + out
+    except NotImplementedError:
+        return cross_attention_attnblock_forward(self, x)
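The commit implements a simple fallback-dispatch pattern: attempt the optimized kernel first, and if the backend signals the operation is unsupported by raising `NotImplementedError`, retry with the slower reference implementation instead of crashing the forward pass. A minimal self-contained sketch of that pattern, with illustrative stand-in functions (not names from the repository):

```python
def fast_attention(q, k, v):
    """Stand-in for an optimized kernel (e.g. xformers' memory-efficient
    attention): pretend it only supports lengths that are multiples of 8."""
    if len(q) % 8 != 0:
        raise NotImplementedError("unsupported input shape")
    return [qi + ki + vi for qi, ki, vi in zip(q, k, v)]


def reference_attention(q, k, v):
    """Stand-in for the slower always-correct fallback path
    (cross_attention_attnblock_forward in the commit)."""
    return [qi + ki + vi for qi, ki, vi in zip(q, k, v)]


def attention_with_fallback(q, k, v):
    # Mirror the commit: try the fast path, and fall back only on
    # NotImplementedError so genuine bugs still propagate.
    try:
        return fast_attention(q, k, v)
    except NotImplementedError:
        return reference_attention(q, k, v)
```

Catching only `NotImplementedError` (rather than a bare `except`) keeps the fallback narrow: shapes the optimized kernel rejects are rerouted, while unrelated errors still surface to the caller.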
