
Commit
pad mask issue and embedding issue
GJ98 committed Apr 9, 2021
1 parent d117fbb commit f8ec5c5
Showing 2 changed files with 1 addition and 3 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -3,4 +3,4 @@ data/
 venv/
 .data/
 *.pt
-_pycache__
+__pycache__
2 changes: 0 additions & 2 deletions models/layers/scale_dot_product_attention.py
@@ -30,8 +30,6 @@ def forward(self, q, k, v, mask=None, e=1e-12):
         k_t = k.view(batch_size, head, d_tensor, length)  # transpose
         score = (q @ k_t) / math.sqrt(d_tensor)  # scaled dot product

-        print("score : {}" .format(score.size()))
-        print("mask : {}" .format(mask.size()))
         # 2. apply masking (opt)
         if mask is not None:
            score = score.masked_fill(mask == 0, -e)
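For context, the `forward` method touched by this diff can be sketched as a self-contained function. This is a minimal reconstruction, not the repository's exact code: the softmax and value-weighting steps are assumed from the standard scaled dot-product attention formulation (they fall outside the shown hunk), the function name mirrors the file name, and it uses `transpose(2, 3)` for the step the diff's `view(...)` call labels "transpose" (`view` reinterprets memory layout rather than permuting axes). It also fills masked positions with `-1e9` instead of the diff's `-e` (`e=1e-12`), since a large negative value is needed for the softmax to actually zero out masked positions.

```python
import math
import torch

def scale_dot_product_attention(q, k, v, mask=None):
    """Scaled dot-product attention; q, k, v: (batch, head, length, d_tensor)."""
    batch_size, head, length, d_tensor = k.size()

    # 1. compute similarity scores: transpose k on its last two axes, then scale
    k_t = k.transpose(2, 3)                   # (batch, head, d_tensor, length)
    score = (q @ k_t) / math.sqrt(d_tensor)   # scaled dot product

    # 2. apply masking (opt): a large negative value so softmax -> ~0 there
    if mask is not None:
        score = score.masked_fill(mask == 0, -1e9)

    # 3. normalize to attention weights and weight the values (assumed steps,
    # outside the shown hunk)
    attn = torch.softmax(score, dim=-1)
    return attn @ v, attn
```

The mask broadcasts against the score tensor of shape `(batch, head, length, length)`, which is why the deleted `print` statements were comparing `score.size()` and `mask.size()` while debugging the padding mask.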
