Commit Graph

251 Commits (6a3c8725b01dedbc10f99f431ba5a4541e0e431e)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| chajchaj | 113810c557 | fix bug of celoss when using ignore_index and reduction (#30180) | 4 years ago |
| XiaoguangHu | 6bfdef727e | clean redundant API alias in 2.0 - part 2 (#30013) | 4 years ago |
| Leo Chen | 8696335f86 | Fix dtype of ungenerated grad var (#28511) | 4 years ago |
| littletomatodonkey | e03171b7c7 | fix pad (#30222) | 4 years ago |
| tangwei12 | 4763e6bc4e | pre padding in dygraph (#30163) | 4 years ago |
| ceci3 | 6a19e41f1f | fix syncbn convert (#30158) | 4 years ago |
| Leo Chen | adac38c506 | add dispenable input for core.ops.reshape2/expand/slice (#30072) | 4 years ago |
| Zhou Wei | 30888ca343 | Polish and Optimize the print/repr information of Layer (#29998) | 4 years ago |
| ceci3 | a125d6331f | fix bn docs (#30096) | 4 years ago |
| ceci3 | 334247791a | add attribute for batch_norm (#29950) | 4 years ago |
| xiaoting | 4d395203a2 | Add alias for upsample (#29983) | 4 years ago |
| zhupengyang | 65d4ff753b | hardsigmoid add attr slope and offset (#29999) | 4 years ago |
| Chen Long | af37285870 | fix code bugs (#29932) | 4 years ago |
| XiaoguangHu | 726c78f293 | clean redundant API alias in 2.0 - part 1 (#29928) | 4 years ago |
| LielinJiang | 0b74428db8 | Fix Conv2DTanspose bug when padding='same' (#29915) | 4 years ago |
| Jack Zhou | 84bae27779 | fix wmt14 doc, remove backward, add bidirect direction in rnn api (#29633) | 4 years ago |
| huangxu96 | 2cb6f94888 | add float16 into adaptive_avg_pool2d check list. (#29547) | 4 years ago |
| Leo Chen | 0fdd365665 | Add fast path for dropout when p == 0 (#29553) | 5 years ago |
| huangxu96 | 576d0d938b | add fp16 check into max and avg pool (#29479) | 5 years ago |
| chajchaj | 79e6086743 | change shape of output in cross_entropy, test=develop (#29220) | 5 years ago |
| Guo Sheng | 8fc7f1b66a | Fix api docs in RNN, Transformer, layer_norm, WeightNormParamAttr (#29235) | 5 years ago |
| Chen Long | 66fd1c00a0 | fix some docs test=develop;test=document_fix (#29374) | 5 years ago |
| Feiyu Chan | f7cdcefa65 | fix multiple documentation errors, test=document_fix (#29210) | 5 years ago |
| tangwei12 | 8358791607 | fix gpu outofrange (#29238) | 5 years ago |
| Jack Zhou | cf43322139 | fix nll_loss doc;test=document_fix; (#29247) | 5 years ago |
| LielinJiang | b9f1f4343b | Move temporal_shift to paddle.nn.functional (#29261) | 5 years ago |
| furnace | 7584bb5096 | Layer norm fp16 (#29169) | 5 years ago |
| LielinJiang | 8a2dd34a1e | fix depthwise conv (#29227) | 5 years ago |
| Leo Chen | 4556ad76b4 | Upgrade string literals to raw string, part 2 (#29217) | 5 years ago |
| huangjun12 | b6a26749dc | fix doc of alpha_dropout/dropout/dropout2d/dropout3d/npair_loss (#29136) | 5 years ago |
| hong19860320 | f23665e5d5 | Refine the doc and unit test for Sigmoid and stanh (#29198) | 5 years ago |
| danleifeng | 7e7b4b9e5d | remove sampled_softmax_with_cross_entropy alias;test=develop (#29180) | 5 years ago |
| zhang wenhui | 8388abe66b | Fix api 1128 (#29174) | 5 years ago |
| 徐铭远 | 3c2a46bd7b | fix doc of erf,rank,mm,cross_entropy,pixel_shuffle,kron... (#29126) | 5 years ago |
| xiaoting | 9cc0e72619 | Fix interpolate doc (#29104) | 5 years ago |
| whs | 9b39af3f22 | Fix docs in 2.0 API (#29081) | 5 years ago |
| Noel | da71173bc9 | Fix ops doc for some ops | 5 years ago |
| whs | 7de2db4a81 | Fix grid_sample in cudnn mode (#29124) | 5 years ago |
| ceci3 | e7caf3b8d9 | fix examples, test=document_fix (#29019) | 5 years ago |
| Guanghua Yu | 47af5c3c9d | fix smooth_l1_loss en docs (#29093) | 5 years ago |
| LielinJiang | 6951052431 | add default conv init (#29092) | 5 years ago |
| GaoWei8 | a049dff78f | Modify the default setting of softmax cudnn (#28672) | 5 years ago |
| ceci3 | a3faa520ec | Fix syncbn (#29013) | 5 years ago |
| FlyingQianMM | f0e614feae | change print([.*].numpy()) to print([.*]) in example codes of sigmoid_focal_loss (#29094) | 5 years ago |
| chajchaj | dfaf6b5eea | save one name in cross_entropy and softmax_cross_entropy, test=develop (#29074) | 5 years ago |
| joejiong | 4b05a8be88 | delete axis parameter in multiply api (#28647) | 5 years ago |
| chajchaj | b52427327d | add soft_label and axis for CrossEntropyLoss and improve performance (#29024) | 5 years ago |
| Leo Chen | 3815d7aa40 | Upgrade string literals to raw string (#28989) | 5 years ago |
| wanghuancoder | 887a35113e | fix eng doc for some api (#28477) | 5 years ago |
| Leo Chen | 98adc8f054 | Dev/fix doc of some api (#28785) | 5 years ago |