Commit Graph

74 Commits (ef56e6839a9f084f27a2857f18eaed24f317b943)

All commits below date to 8 years ago.

Author          SHA1        Message
Yibing Liu      ef56e6839a  Correct the usage of fc in the example of dynamic_lstm's doc
Yibing Liu      3b0eff6196  Format the writing in doc of dynamic_lstm
Yibing Liu      f46257fa4a  Merge branch 'develop' of upstream into add_lstm_doc
Yibing Liu      f050390754  Polish the doc of dynamic_lstm
ying            e043c2ce45  Merge branch 'develop' into wraper_for_l2_normalize
guosheng        66054984cd  Merge branch 'develop' of https://github.com/PaddlePaddle/paddle into add-dot_product_attention
guosheng        9bcb2d268e  Add python wrapper for matmul_op and dot_product_attention
ying            87a59d65d6  Merge branch 'develop' into wraper_for_l2_normalize
Yibing Liu      aab4cfeb65  Add doc for dynamic_lstm python api
caoying03       6497bff901  add python wrapper for l2 normalize.
dzhwinter       b9b75377a2  Feature/hooks (#7513)
guosheng        234013a9d7  Add python wrapper for matmul_op
guosheng        ef129718ea  Merge branch 'develop' of https://github.com/PaddlePaddle/paddle into add-python-glu
guosheng        c083a60d7a  Add python split and glu
dzhwinter       5ad1aef051  "cudnn operators change to cudnn kernel" (#6660)
ying            8ac744f372  add wrapper for elementwise math operator.
Abhinav Arora   ea782e38a6  Fix typo in batch norm bias initialization (#7449)
QI JUN          87f9b58363  set stop gradient for mask in dropout layer (#7390)
QI JUN          fe341bacde  refine batch norm python layer (#7348)
yangyaming      67b8c09210  Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into fix-7211
yangyaming      872b1c8884  Stop gradient when pool_type=='max'
Siddharth Goyal 691b5cac61  Fix equation for gru op (#7274)
Cao Ying        0b6c5e6d25  Merge pull request #7223 from PaddlePaddle/emailweixu-patch-1
Siddharth Goyal f3c42f607c  Add doc for gru_unit op (in fluid) (#7151)
emailweixu      ebc616b0f9  Fix comment to layers.fc
yangyaming      60fecce43d  Fix unit test for lstm_unit.
yangyaming      c0f6f492bc  Add shape info for arguments.
yangyaming      d6ec963047  Minor correction.
yangyaming      f0e797e5b7  Doc fix and enhancement for lstm_unit python wrapper.
Siddharth Goyal 87f46ebb36  Add squared error layers doc (#6862)
Yibing Liu      3a93fa7708  Merge pull request #7069 from kuke/fix_cross_entropy_doc
chengduo        f9a1229666  Merge pull request #6850 from chengduoZH/feature/conv2d_python_doc
chengduoZH      3d2b2d408f  refine doc
Yibing Liu      e2c2652fc0  amend comments in cross_entropy_op
Yibing Liu      4177e80545  Add line feed character in the doc of cross_entropy
Yibing Liu      c67c54a8e7  Polish the doc of cross_entropy
fengjiayi       bff0cbfcd3  Merge pull request #7025 from JiayiFeng/rename_output_of_softmax_and_activitions
fengjiayi       e41a71cea8  fix errors
guosheng        97d47ca3fc  Add python wrapper for reduce_max and reduce_min
chengduoZH      9e7c068677  fix embedding example
chengduoZH      1d936f1dfa  refine
Luo Tao         36acbba674  Merge branch 'develop' into seq_pool_doc
Guo Sheng       c5a2672a07  Merge pull request #6779 from guoshengCS/add-python-reduceMean
Luo Tao         f3fc8de1d5  add doc for sequence_first/last_step
caoying03       852cd544a9  fix latex equation in fluid fc layer.
Luo Tao         d7a9bb6e19  add python wrap for sequence_first/last_step
Cao Ying        298dc8958d  Merge pull request #6792 from lcy-seso/refine_doc
Abhinav Arora   91911f4b56  Fix documentation of embedding layer (#6854)
chengduoZH      e902c36cdf  add conv2d_python doc
caoying03       a74db488f7  follow comments.