I've upgraded the code to be compatible with PyTorch 2.0 and replaced the attention/cross-attention modules with PyTorch's built-in MultiheadAttention, which also supports flash attention out of the box.
Is there any interest in these updates?
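For context, here is a minimal sketch of what that swap can look like. The module and parameter names are illustrative, not the actual code in this repo:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical drop-in replacement: a hand-rolled attention block
# rewritten around nn.MultiheadAttention.
class CrossAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        # batch_first=True keeps the (batch, seq, dim) layout most
        # custom attention code already uses.
        self.mha = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # Self-attention when context is x; cross-attention otherwise.
        out, _ = self.mha(query=x, key=context, value=context, need_weights=False)
        return out

# PyTorch 2.0 also exposes the fused kernel directly; it dispatches to a
# flash-attention implementation when the hardware and dtypes allow it.
q = k = v = torch.randn(2, 4, 8, 16)  # (batch, heads, seq, head_dim)
fused = F.scaled_dot_product_attention(q, k, v)
```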