Commit Graph

196 Commits (c95ed54fe15350fdd7b933d8a0b26c17210c7385)

Author            SHA1        Message                                                                                       Date
Yi Huaijie        e4cd67596f  raise RuntimeError when using full_batch neither under semi_auto_parallel nor auto_parallel   (4 years ago)
Wan Hanyang       0b7570eb53  add model with loss, without loso and o2 test case                                            (4 years ago)
Wan Hanyang       2ceea1e59d  add a self attention test case                                                                (4 years ago)
Su Teng           7b46f46a65  remove unuse test                                                                             (4 years ago)
Yi Huaijie        eb83ea9607  change internal API _get_strategy() to _get_shard_strategy()                                  (4 years ago)
Yi Huaijie        a836d25c64  change API set_strategy() to shard()                                                          (4 years ago)
mindspore-ci-bot  b40677002f  !5714 [refine]change top graph and add cell class                                             (5 years ago)
Wei Luning        e6f82af849  add cell class to c++                                                                         (5 years ago)
lichenever        f2d3fd34ce  rectification_allreduce_fusion_api                                                            (5 years ago)
yao_yf            d4cfe55c04  rename mirror_mean to gradients_mean                                                          (5 years ago)
mindspore-ci-bot  9018737e99  !5696 [Auto parallel] Move 'multi-subgraphs' interface to internal                            (5 years ago)
mindspore-ci-bot  c064c01b6b  !5729 [AutoParallel]Add FuseBatchNormEx op                                                    (5 years ago)
mindspore-ci-bot  7786adc3aa  !5722 fix semi auto parallel parameter of reshape has another user                            (5 years ago)
lichenever        d22f506431  add BatchNormEx op                                                                            (5 years ago)
yao_yf            05c003ae6b  origin/semi_auto_parallel_reshape_parameter_has_another_user                                  (5 years ago)
mindspore-ci-bot  fc79997de5  !5502 Mod SoftmaxCrossEntropyWithlogits                                                       (5 years ago)
Xiaoda Zhang      42f1241270  remove 'multi-subgraphs' to internal                                                          (5 years ago)
wanyiming         0ec70068ae  mod_SoftmaxCrossEntropyWithLogits                                                             (5 years ago)
mindspore-ci-bot  35e6cca1a3  !5634 wrap numpy random seed into an api                                                      (5 years ago)
Yi Huaijie        4a5d115a66  add get_seed() and set_seed()                                                                 (5 years ago)
mindspore-ci-bot  ccc0ea60ee  !5661 fix auto parallel reshape strategy set when it is first operator                        (5 years ago)
yao_yf            755f381406  fix auto parallel reshape strategy set when it is first operator                              (5 years ago)
yao_yf            8f7aa5bd5a  auto parallel context modify                                                                  (5 years ago)
mindspore-ci-bot  be606ba8f5  !5432 Mindspore parallel supports all elementary-wise operators                               (5 years ago)
Yi Huaijie        84948ca730  parallel supports more elementary-wise operators                                              (5 years ago)
mindspore-ci-bot  414184c184  !5367 Check the parameter's split strategies if it has multiple users                         (5 years ago)
yao_yf            07117e4dd4  mv ParallelMode to context                                                                    (5 years ago)
yangzhenzhang     fbda03bbcc  check parameter split                                                                         (5 years ago)
mindspore-ci-bot  66d6320b21  !5224 Add test case about loss scale in parallel mode                                         (5 years ago)
yangzhenzhang     6ae5893681  add test cases                                                                                (5 years ago)
panyifeng         1a54785fe2  remove name arg from gradoperation                                                            (5 years ago)
mindspore-ci-bot  7d4f481884  !5017 remove internal interface in wide&deep                                                  (5 years ago)
mindspore-ci-bot  abe6b82138  !5011 remove global grad ops                                                                  (5 years ago)
yao_yf            a9a8e323b2  remove internal interface in wide&deep                                                        (5 years ago)
mindspore-ci-bot  fc6eee3bda  !5019 raise RuntimeError when set different mode after Initializer created                    (5 years ago)
panyifeng         637e812347  remove global grad ops                                                                        (5 years ago)
Yi Huaijie        394be43492  raise RuntimeError when set different mode after Initializer created                          (5 years ago)
Su Teng           e3ae23c939  add parallel attention test                                                                   (5 years ago)
mindspore-ci-bot  3d06cbf987  !4801 Must set or change parallel mode before any Initializer created                         (5 years ago)
Yi Huaijie        89a4ebf8a1  parallel mode must be set before create an initializer                                        (5 years ago)
mindspore-ci-bot  9ee144ea40  !4744 [AutoParallel]Support bert                                                              (5 years ago)
lichenever        221a801395  auto parallel support bert                                                                    (5 years ago)
yangzhenzhang     cda08f6a52  concat 3 tensors in auto parallel mode                                                        (5 years ago)
mindspore-ci-bot  2ae6365d77  !4650 EmbeddingLookup support auto parallel                                                   (5 years ago)
yangzhenzhang     6f6a8ae9f0  embedding lookup auto parallel                                                                (5 years ago)
Yi Huaijie        0f7ead5f14  parameter slice init test all initializers                                                    (5 years ago)
yao_yf            cbb4363fa7  remove to_full_tensor and load_inputs in exexute stage                                        (5 years ago)
yangzhenzhang     14c77c9f03  update field split                                                                            (5 years ago)
mindspore-ci-bot  2db0290c49  !4356 Add validation for field split                                                          (5 years ago)
yangzhenzhang     4a0e6ff7fc  update field split                                                                            (5 years ago)