attention upload
xmu-xiaoma666 committed May 18, 2021
1 parent 58393a8 commit d62f88a
Showing 14 changed files with 21 additions and 14 deletions.
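In substance, this commit moves the standalone attention modules into an `attention/` package (the six renames recorded below), updates every README code snippet to import through the new `attention.` prefix, and adds a small `main.py` that exercises `ExternalAttention` end to end.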
28 changes: 14 additions & 14 deletions README.md
@@ -22,7 +22,7 @@ Pytorch implementation of ["CBAM: Convolutional Block Attention Module---ECCV201

#### 1.3. Code
```python
-from ExternalAttention import ExternalAttention
+from attention.ExternalAttention import ExternalAttention
import torch

input=torch.randn(50,49,512)
@@ -43,7 +43,7 @@ print(output.shape)

#### 1.3. Code
```python
-from SelfAttention import ScaledDotProductAttention
+from attention.SelfAttention import ScaledDotProductAttention
import torch

input=torch.randn(50,49,512)
Expand All @@ -63,7 +63,7 @@ print(output.shape)

#### 3.3. Code
```python
-from SimplifiedSelfAttention import SimplifiedScaledDotProductAttention
+from attention.SimplifiedSelfAttention import SimplifiedScaledDotProductAttention
import torch

input=torch.randn(50,49,512)
Expand All @@ -84,7 +84,7 @@ print(output.shape)

#### 4.3. Code
```python
-from SEAttention import SEAttention
+from attention.SEAttention import SEAttention
import torch

input=torch.randn(50,512,7,7)
Expand All @@ -96,16 +96,16 @@ print(output.shape)

***

-### 4. SK Attention Usage
-#### 4.1. Paper
+### 5. SK Attention Usage
+#### 5.1. Paper
["Selective Kernel Networks"](https://arxiv.org/pdf/1903.06586.pdf)

-#### 4.2. Overview
+#### 5.2. Overview
![](./img/SK.png)

-#### 4.3. Code
+#### 5.3. Code
```python
-from SKAttention import SKAttention
+from attention.SKAttention import SKAttention
import torch

input=torch.randn(50,512,7,7)
Expand All @@ -116,18 +116,18 @@ print(output.shape)
```


-### 5. CBAM Attention Usage
-#### 5.1. Paper
+### 6. CBAM Attention Usage
+#### 6.1. Paper
["CBAM: Convolutional Block Attention Module"](https://openaccess.thecvf.com/content_ECCV_2018/papers/Sanghyun_Woo_Convolutional_Block_Attention_ECCV_2018_paper.pdf)

-#### 5.2. Overview
+#### 6.2. Overview
![](./img/CBAM1.png)

![](./img/CBAM2.png)

-#### 5.3. Code
+#### 6.3. Code
```python
-from CBAM import CBAMBlock
+from attention.CBAM import CBAMBlock
import torch

input=torch.randn(50,512,7,7)
6 files renamed without changes.
Binary file added attention/__pycache__/CBAM.cpython-38.pyc
Binary file added attention/__pycache__/SEAttention.cpython-38.pyc
Binary file added attention/__pycache__/SKAttention.cpython-38.pyc
3 more binary files added (filenames not shown in this view)
7 changes: 7 additions & 0 deletions main.py
@@ -0,0 +1,7 @@
+from attention.ExternalAttention import ExternalAttention
+import torch
+
+input=torch.randn(50,49,512)
+ea = ExternalAttention(d_model=512,S=8)
+output=ea(input)
+print(output.shape)
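For context, here is a minimal sketch of what the `ExternalAttention` module invoked by main.py might look like, following the two-linear-layer external attention formulation (learnable memory units `M_k` and `M_v` with double normalization). This is an illustrative assumption, not necessarily the exact code in `attention/ExternalAttention.py`:

```python
import torch
from torch import nn

class ExternalAttention(nn.Module):
    # Sketch only: external attention replaces self-attention's pairwise
    # similarity with two small memory units shared across all samples.
    def __init__(self, d_model, S=64):
        super().__init__()
        self.mk = nn.Linear(d_model, S, bias=False)  # memory unit M_k: d_model -> S
        self.mv = nn.Linear(S, d_model, bias=False)  # memory unit M_v: S -> d_model
        self.softmax = nn.Softmax(dim=1)             # normalize over the token axis

    def forward(self, queries):
        attn = self.mk(queries)                      # (B, N, S) attention map
        attn = self.softmax(attn)                    # softmax over N (dim=1)
        attn = attn / attn.sum(dim=2, keepdim=True)  # l1-normalize over S
        return self.mv(attn)                         # (B, N, d_model)

# Sanity check mirroring main.py: (50, 49, 512) in -> (50, 49, 512) out
x = torch.randn(50, 49, 512)
print(ExternalAttention(d_model=512, S=8)(x).shape)  # torch.Size([50, 49, 512])
```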
