Outlook Attention
xmu-xiaoma666 committed Jun 28, 2021
1 parent fc96c23 commit 03af22b
Showing 4 changed files with 38 additions and 4 deletions.
34 changes: 34 additions & 0 deletions README.md
@@ -56,6 +56,7 @@ $ pip install dlutils_add

- [16. AFT Attention Usage](#16-AFT-Attention-Usage)

- [17. Outlook Attention Usage](#17-Outlook-Attention-Usage)

- [MLP Series](#mlp-series)

@@ -114,6 +115,9 @@ $ pip install dlutils_add

- Pytorch implementation of ["An Attention Free Transformer---ICLR2021 (a new work from Apple)"](https://arxiv.org/pdf/2105.14103v1.pdf)


- Pytorch implementation of ["VOLO: Vision Outlooker for Visual Recognition---arXiv 2021.06.24"](https://arxiv.org/abs/2106.13112)

***


@@ -482,6 +486,36 @@ print(output.shape)
```






### 17. Outlook Attention Usage

#### 17.1. Paper


[VOLO: Vision Outlooker for Visual Recognition](https://arxiv.org/abs/2106.13112)


#### 17.2. Overview
![](./img/OutlookAttention.png)

#### 17.3. Code
```python
from attention.OutlookAttention import OutlookAttention
import torch

# input layout is (batch, height, width, channels)
input=torch.randn(50,28,28,512)
outlook = OutlookAttention(dim=512)
output=outlook(input)  # shape is preserved: (50,28,28,512)
print(output.shape)
```
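The repository's `OutlookAttention` implementation is not shown in this diff. As a rough illustration of the mechanism described in the VOLO paper, here is a minimal single-head sketch: attention weights over each local k×k window are predicted directly from the center token by a linear layer (no query-key dot products), then applied to the unfolded values and folded back. The class name `SimpleOutlookAttention` and the stride-1, single-head simplifications are assumptions for illustration, not the repo's actual code.

```python
import torch
from torch import nn
from torch.nn import functional as F

class SimpleOutlookAttention(nn.Module):
    """Minimal single-head sketch of outlook attention (VOLO, arXiv 2106.13112)."""
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.k = kernel_size
        self.v = nn.Linear(dim, dim)                 # value projection
        self.attn = nn.Linear(dim, kernel_size ** 4) # (k*k) x (k*k) weights per location
        self.proj = nn.Linear(dim, dim)              # output projection

    def forward(self, x):  # x: (B, H, W, C)
        B, H, W, C = x.shape
        k = self.k
        # gather the k*k values around every spatial location (zero padding keeps H, W)
        v = self.v(x).permute(0, 3, 1, 2)            # (B, C, H, W)
        v = F.unfold(v, k, padding=k // 2)           # (B, C*k*k, H*W)
        v = v.reshape(B, C, k * k, H * W).permute(0, 3, 2, 1)  # (B, H*W, k*k, C)
        # attention weights come straight from the center feature, no query-key product
        a = self.attn(x).reshape(B, H * W, k * k, k * k).softmax(dim=-1)
        out = (a @ v).permute(0, 3, 2, 1).reshape(B, C * k * k, H * W)
        # scatter the k*k window outputs back to the feature map (overlaps sum)
        out = F.fold(out, (H, W), k, padding=k // 2)  # (B, C, H, W)
        return self.proj(out.permute(0, 2, 3, 1))     # back to (B, H, W, C)

x = torch.randn(2, 8, 8, 64)
y = SimpleOutlookAttention(dim=64)(x)
print(y.shape)  # torch.Size([2, 8, 8, 64])
```

Note that the paper's full module adds multiple heads and a stride/pooling path; this sketch keeps only the core outlook-weight idea, so the output shape matches the input.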


***

# MLP Series
Binary file not shown.
Binary file added img/OutlookAttention.png
8 changes: 4 additions & 4 deletions main.py
@@ -1,9 +1,9 @@
-from attention.AFT import AFT_FULL
+from attention.OutlookAttention import OutlookAttention
 import torch
 from torch import nn
 from torch.nn import functional as F

-input=torch.randn(50,49,512)
-aft_full = AFT_FULL(d_model=512, n=49)
-output=aft_full(input)
+input=torch.randn(50,28,28,512)
+outlook = OutlookAttention(dim=512)
+output=outlook(input)
 print(output.shape)
