Merge pull request #7307 from drinktee/keyboard

fix ports_num_for_sparse default value in cluster_train doc
武毅 7 years ago committed by GitHub
commit e3d296fd67

@@ -51,7 +51,7 @@ $ stdbuf -oL /usr/bin/nohup paddle pserver --port=7164 --ports_num=1 --ports_num
 - port：**必选，默认7164**，pserver监听的起始端口，根据ports_num决定总端口个数，从起始端口监听多个端口用于通信
 - ports_num：**必选，默认1**，监听的端口个数
-- ports_num_for_sparse：**必选，默认1**，用于稀疏类型参数通信的端口个数
+- ports_num_for_sparse：**必选，默认0**，用于稀疏类型参数通信的端口个数
 - num_gradient_servers：**必选，默认1**，当前训练任务pserver总数

 ### 启动计算节点
@@ -95,7 +95,7 @@ paddle.init(
 - trainer_count：**必选，默认1**，当前训练任务trainer总个数
 - port：**必选，默认7164**，连接到pserver的端口
 - ports_num：**必选，默认1**，连接到pserver的端口个数
-- ports_num_for_sparse：**必选，默认1**，和pserver之间用于稀疏类型参数通信的端口个数
+- ports_num_for_sparse：**必选，默认0**，和pserver之间用于稀疏类型参数通信的端口个数
 - num_gradient_servers：**必选，默认1**，当前训练任务pserver总数
 - trainer_id：**必选，默认0**，每个trainer的唯一ID，从0开始的整数
 - pservers：**必选，默认127.0.0.1**，当前训练任务启动的pserver的IP列表，多个IP使用“,”隔开

@@ -52,7 +52,7 @@ Parameter Description
 - port: **required, default 7164**, port which parameter server will listen on. If ports_num greater than 1, parameter server will listen on multiple ports for more network throughput.
 - ports_num: **required, default 1**, total number of ports will listen on.
-- ports_num_for_sparse: **required, default 1**, number of ports which serves sparse parameter update.
+- ports_num_for_sparse: **required, default 0**, number of ports which serves sparse parameter update.
 - num_gradient_servers: **required, default 1**, total number of gradient servers.

 ### Starting trainer
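
For reference, the corrected default can be exercised with the pserver launch command shown in the hunk context above. A minimal sketch, assuming the `paddle` binary is on PATH; only the flags documented in this diff are passed, everything else about the invocation is illustrative:

import subprocess

# Start one parameter server with the documented flags.
# ports_num_for_sparse=0 (the corrected default) means no extra ports
# are reserved for sparse parameter updates.
pserver = subprocess.Popen([
    "paddle", "pserver",
    "--port=7164",                # required, default 7164
    "--ports_num=1",              # required, default 1
    "--ports_num_for_sparse=0",   # required, default 0 (doc previously said 1)
    "--num_gradient_servers=1",   # required, default 1
])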
@@ -98,7 +98,7 @@ Parameter Description
 - trainer_count: **required, default 1**, total count of trainers in the training job.
 - port: **required, default 7164**, port to connect to parameter server.
 - ports_num: **required, default 1**, number of ports for communication.
-- ports_num_for_sparse: **required, default 1**, number of ports for sparse type caculation.
+- ports_num_for_sparse: **required, default 0**, number of ports for sparse type caculation.
 - num_gradient_servers: **required, default 1**, total number of gradient server.
 - trainer_id: **required, default 0**, ID for every trainer, start from 0.
 - pservers: **required, default 127.0.0.1**, list of IPs of parameter servers, separated by ",".
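
On the trainer side the same default shows up in the `paddle.init` call named in the hunk header. A minimal sketch; the `paddle.v2` import path and the `use_gpu` argument are assumptions not shown in this diff, while the remaining keyword arguments mirror the parameter list above:

import paddle.v2 as paddle  # assumption: v2 API import path, not part of this diff

# Trainer-side initialization mirroring the documented defaults;
# ports_num_for_sparse is now documented as defaulting to 0.
paddle.init(
    use_gpu=False,             # assumption: not among the parameters listed above
    trainer_count=1,           # required, default 1
    port=7164,                 # required, default 7164
    ports_num=1,               # required, default 1
    ports_num_for_sparse=0,    # required, default 0 (doc previously said 1)
    num_gradient_servers=1,    # required, default 1
    trainer_id=0,              # required, default 0
    pservers="127.0.0.1",      # required, default 127.0.0.1, comma-separated IPs
)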
