Commit Graph

17 Commits (b2c1be851a3b9e6d01dab1d741fcb05e5fc4b016)

Author         SHA1        Message                                                                    Date
Zhaolong Xing  c5f0293cf3  NV jetson(nano, tx2, xavier) inference compile support (#21393)            6 years ago
peizhilin      439691f5bd  adjust the shlwapi on windows                                              7 years ago
Yu Yang        7604b1ad51  Fix Eigen macro when using GPU                                             7 years ago
Clementine     6c71c1f8f9  Add activation gelu (#14569)                                               7 years ago
dzhwinter      eca4563e5d  operators module (#12938)                                                  7 years ago
dzhwinter      39ac9e39c2  float16 type support enhance (#12181)                                      7 years ago
Kexin Zhao     64bf3df0f9  add print support to float16 (#9960)                                       8 years ago
Kexin Zhao     0f38bb4593  add fp16 support to activation op (#9769)                                  8 years ago
Kexin Zhao     b2a1c9e8b7  Add float16 support to non-cudnn softmax op on GPU (#9686)                 8 years ago
Kexin Zhao     d307b5e4a6  Merge remote-tracking branch 'upstream/develop' into elementwise_add_fp16  8 years ago
Kexin Zhao     182da95317  small fix                                                                  8 years ago
Kexin Zhao     f2bbbb2b66  fix arithmetic operator                                                    8 years ago
Kexin Zhao     18d616ed70  add float16 arithmetic operators on new GPU                                8 years ago
kexinzhao      266ccaa843  Integrate float16 into data_type_transform (#8619)                         8 years ago
Yu Yang        d50016b2a7  Remove build warnings in float16.h (#8481)                                 8 years ago
kexinzhao      74e0eb7267  make float16 a pod type (#8456)                                            8 years ago
kexinzhao      f82fa64a06  Move float16 into fluid folder (#8394)                                     8 years ago