Commit Graph

26 Commits (725e64486acb0f448ae7365a6467ad37229bab2a)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| emailweixu | 725e64486a | cumsum operator (#8288) | 7 years ago |
| Qiao Longfei | 20c4a4cb4f | Impl scalar switch case op with condition op (#8184) | 7 years ago |
| dangqingqing | 4673a24bda | Add softmax into Python API. | 7 years ago |
| Qiao Longfei | be801d6c05 | Add learning rate decay (#7892) | 7 years ago |
| ying | dcb5a1ed67 | fix ci. | 7 years ago |
| Yu Yang | c80af6ffaa | Merge pull request #7721 from reyoung/feature/rename_fluid | 7 years ago |
| fengjiayi | e8adcaf278 | update | 7 years ago |
| fengjiayi | 2b8ea2171e | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into dev_global_norm_clip | 7 years ago |
| Yang Yu | 5c26f60875 | Fix license | 7 years ago |
| Yang Yu | 3008aab6ee | Merge branch 'develop' of github.com:baidu/Paddle into feature/rename_fluid | 7 years ago |
| dzhwinter | e983cc90fc | "fix decode bug" (#7711) | 7 years ago |
| Yang Yu | 22662ae424 | Move paddle.v2.fluid.registery to layers | 7 years ago |
| fengjiayi | 773f2f735c | fix errors | 7 years ago |
| fengjiayi | 51985aa2aa | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into dev_global_norm_clip | 7 years ago |
| fengjiayi | 228e14adb7 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into dev_global_norm_clip | 7 years ago |
| fengjiayi | d15bfabbd0 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into dev_elementwise_max_min | 7 years ago |
| dzhwinter | b9b75377a2 | Feature/hooks (#7513) | 7 years ago |
| fengjiayi | ee8e5374d8 | add max min layer | 7 years ago |
| fengjiayi | adc26dffa9 | developing GradientClipByGlobalNorm | 7 years ago |
| ying | 8ac744f372 | add wrapper for elementwise math operator. | 7 years ago |
| fengjiayi | d23ea4ef8e | add gradient clip by norm | 7 years ago |
| QI JUN | 87f9b58363 | set stop gradient for mask in dropout layer (#7390) | 7 years ago |
| xuwei06 | 585dec3dc2 | Calculating gradients for partial graph | 7 years ago |
| Yang Yu | 90a5a55a6c | Expose some activations | 7 years ago |
| yangyaming | fa5cdd8f74 | Expose sequence_softmax_op. | 8 years ago |
| Yu Yang | e0698e33a8 | Make layers as a python module (#6564) | 8 years ago |