Commit Graph

23683 Commits (54dddee37e620639610bf410421bac2a22a791a2)

Author | SHA1 | Message | Date
dongdaxiang | 53fbab5d33 | add fs_local_open example | 6 years ago
dongdaxiang | afaf937010 | add fs_local_open example | 6 years ago
dongdaxiang | cf1360643f | add printer for fetch variable | 6 years ago
dongdaxiang | d65cb13ad5 | add pslib flag on fleet_wrapper CMakefile | 6 years ago
dongdaxiang | 6de9ebc65c | refine VLOG in fleet_wrapper.h | 6 years ago
dongdaxiang | 97d5cd30f0 | make pull dense worker work | 6 years ago
dongdaxiang | 39014b9f9f | fix class register problem | 6 years ago
dongdaxiang | f0dd1201cc | fix destructor problem | 6 years ago
dongdaxiang | f2bde9c241 | fix destructor problem | 6 years ago
dongdaxiang | 54f047a126 | fix ngraph compile option | 6 years ago
dongdaxiang | dd1dc9bcf0 | add common.h.in back | 6 years ago
dongdaxiang | 378037c535 | make s_instance_ private to ensure singleton | 6 years ago
dongdaxiang | a446d26e8a | add todo for asynce executor | 6 years ago
dongdaxiang | c165012031 | refine device_worker and trainer code | 6 years ago
dongdaxiang | 8a335b50be | add downpour device_worker pb configuration | 6 years ago
dongdaxiang | 24a8001142 | make -DWITH_PSLIB=ON compilable | 6 years ago
dongdaxiang | 67b1d6d721 | add dist_multi_trainer for distributed training, add trainer_factory and device_worker_factory so that we can easily extend new training mode, add pull dense worker which is a singleton for parameter fetching | 6 years ago
dongdaxiang | caf0c10e71 | add dist_multi_trainer for distributed training, add trainer_factory and device_worker_factory so that we can easily extend new training mode, add pull dense worker which is a singleton for parameter fetching | 6 years ago
dongdaxiang | 855bf579d2 | add dist_multi_trainer for distributed training, add trainer_factory and device_worker_factory so that we can easily extend new training mode, add pull dense worker which is a singleton for parameter fetching | 6 years ago
lujun | d4f63d82ac | Merge pull request #16475 from junjun315/fix-doc-multiplex | 6 years ago
Qiao Longfei | d8974e6da0 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into add-async-ssa-graph-executor-communicator | 6 years ago
lujun | de605cc0fc | Merge pull request #16523 from junjun315/tensor_api | 6 years ago
wanghaoshuang | d41b623a72 | Merge branch 'develop' of https://github.com/PaddlePaddle/paddle into quan_ck | 6 years ago
wanghaoshuang | 6db7c2a500 | Fix checkpoint of quantization. | 6 years ago
Shixiaowei02 | bddb2cd315 | resolve conflicts with the develop branch test=develop | 6 years ago
lidanqing | 0d656996bf | fix some bugs of unzip and reading val list | 6 years ago
chengduo | 1096746cbf | Fuse Adam And SGD ops (#15933) | 6 years ago
Jacek Czaja | 2632327429 | [MKL-DNN] Tensor modifications revert (#16462) | 6 years ago
Zeng Jinle | 4143a1c216 | Merge pull request #16491 from sneaxiy/feature/advance_gc | 6 years ago
chengduo | 2265d091e6 | Fix threaded executor bug (#16508) | 6 years ago
minqiyang | 9e14f260c0 | Fix polynomal decay bug in python2.x | 6 years ago
sneaxiy | 2c836ff914 | check default grad maker | 6 years ago
Zhen Wang | f7f5044b3d | Merge pull request #16489 from wzzju/fix_slim_quant_bugs | 6 years ago
nhzlx | d065b5bf2b | Anakin ssd support | 6 years ago
Zeng Jinle | 69cb9792ea | Merge pull request #16506 from sneaxiy/revert-16424-fix_allocator_bug | 6 years ago
lujun | 1c9aaeebe0 | move imperative to dygraph, test=develop | 6 years ago
lidanqing | b46e467abc | add wget and unzip part and change data_dir | 6 years ago
lujun | d980ba19bc | add some dygraph op, test=develop | 6 years ago
lidanqing | 894aa9b235 | change script file name and data_dir location | 6 years ago
lidanqing | 57f51e5b08 | preprocess with PIL the full val dataset and save binary | 6 years ago
minqiyang | 42507d33c6 | Change atol to default value | 6 years ago
lujun | cc29bec6e6 | Merge pull request #16 from PaddlePaddle/develop | 6 years ago
chengduo | ed61d67c73 | Fix the interface of Pass::Apply (#16484) | 6 years ago
dengkaipeng | 193185b840 | Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into shift | 6 years ago
dengkaipeng | 8a0023892a | fix unittest. test=develop | 6 years ago
minqiyang | 48f3cbdf55 | Polish code | 6 years ago
whs | 59f75ec76e | Make unitest of fsp op faster and more stable. (#16502) | 6 years ago
Zeng Jinle | 5f1c92a81c | Merge pull request #16450 from zhhsplendid/del-redundant-op-var-reg | 6 years ago
Zhen Wang | 46e1bb06c7 | remove no necessary doc changes. test=develop | 6 years ago
whs | ecc3088df8 | Fix saving in quantization strategy. (#16474) | 6 years ago