Merge pull request #5015 from ranqiu92/develop

Fix a bug in dot-product attention.
Cao Ying committed 7 years ago (via GitHub)
commit 7d653c41cd

@@ -1457,11 +1457,13 @@ def dot_product_attention(encoded_sequence,
     expanded = expand_layer(
         input=transformed_state,
-        expanded_as=encoded_sequence,
+        expand_as=encoded_sequence,
         name='%s_expand' % name)
     m = linear_comb_layer(
-        weights=expanded, vectors=encoded_sequence, name='%s_dot-product')
+        weights=expanded,
+        vectors=encoded_sequence,
+        name='%s_dot-product' % name)
     attention_weight = fc_layer(
         input=m,
