@@ -66,7 +66,7 @@ To train the model, run `train.py`. If the `mindrecord_dir` is empty, it will ge
 sh run_distribute_train.sh 8 500 0.2 coco /data/hccl.json
 ```
-The input parameters are device numbers, epoch size, learning rate, dataset mode and [hccl json configuration file](https://www.mindspore.cn/tutorial/en/master/advanced_use/distributed_training.html). **It is better to use absolute path.**
+The input parameters are the number of devices, epoch size, learning rate, dataset mode, and the [hccl json configuration file](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/utils/hccl_tools). **It is better to use absolute paths.**
 You will get the loss value of each step as following:
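For reference, the hccl json (rank table) file passed as the last argument can be produced by the hccl_tools script that the updated link points to. A minimal sketch, assuming the script is run from a checkout of the MindSpore repository on a host with 8 Ascend devices; the exact output filename (shown here as `hccl_8p.json`) may vary by version:

```
# Generate an hccl json (rank table) file covering local devices 0-7.
python model_zoo/utils/hccl_tools/hccl_tools.py --device_num "[0,8)"

# Pass the generated file to the training script, preferably as an absolute path.
sh run_distribute_train.sh 8 500 0.2 coco /absolute/path/to/hccl_8p.json
```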
@@ -71,7 +71,7 @@ sh run_distribute_train.sh dataset/coco2014 yolov3_darknet_noquant_ckpt/0-320_10
 sh run_standalone_train.sh dataset/coco2014 yolov3_darknet_noquant_ckpt/0-320_102400.ckpt
 ```
-> About rank_table.json, you can refer to the [distributed training tutorial](https://www.mindspore.cn/tutorial/en/master/advanced_use/distributed_training.html).
+> About rank_table.json, you can generate it by using the [hccl_tools script](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/utils/hccl_tools).
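For context, the generated rank_table.json follows the rank table v1.0 layout described in the MindSpore distributed training documentation. A hedged sketch of a single-server, two-device file; the `server_id` and `device_ip` values below are placeholders, and field details can differ across Ascend driver versions:

```
{
    "version": "1.0",
    "server_count": "1",
    "server_list": [
        {
            "server_id": "10.0.0.1",
            "device": [
                {"device_id": "0", "device_ip": "192.168.100.101", "rank_id": "0"},
                {"device_id": "1", "device_ip": "192.168.100.102", "rank_id": "1"}
            ],
            "host_nic_ip": "reserve"
        }
    ],
    "status": "completed"
}
```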
@@ -51,7 +51,7 @@ To train the model, run `train.py` with the dataset `image_dir`, `anno_path` and
 sh run_distribute_train.sh 8 150 /data/Mindrecord_train /data /data/train.txt /data/hccl.json
 ```
-The input variables are device numbers, epoch size, mindrecord directory path, dataset directory path, train TXT file path and [hccl json configuration file](https://www.mindspore.cn/tutorial/en/master/advanced_use/distributed_training.html). **It is better to use absolute path.**
+The input variables are the number of devices, epoch size, mindrecord directory path, dataset directory path, train TXT file path, and the [hccl json configuration file](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/utils/hccl_tools). **It is better to use absolute paths.**
 You will get the loss value and time of each step as following:
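Relatedly, the hunk header above mentions running `train.py` directly with the dataset `image_dir` and `anno_path`. A hedged single-device sketch built only from the parameters named in this section; the exact flag names may differ, so check `python train.py --help`:

```
# Single-device training; if the mindrecord directory is empty,
# train.py first generates mindrecord files from image_dir and anno_path.
python train.py \
    --image_dir=/data \
    --anno_path=/data/train.txt \
    --mindrecord_dir=/data/Mindrecord_train
```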