!8908 modify api example

From: @lijiaqi0612
Reviewed-by: @sanjaychan, @liangchenghui
Signed-off-by: @sanjaychan
pull/8908/MERGE
Committed by mindspore-ci-bot via Gitee
commit 6a2b3a4ee1

@@ -232,6 +232,8 @@ def ms_function(fn=None, obj=None, input_signature=None):
         equal to the case when `fn` is not None.
     Examples:
+        >>> from mindspore.ops import functional as F
+        >>>
         >>> def tensor_add(x, y):
         >>>     z = F.tensor_add(x, y)
         >>>     return z
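For context, a minimal self-contained sketch of how the amended `ms_function` example could be exercised end to end. The `@ms_function` decorator placement and the input shapes are illustrative assumptions, not part of this change:

    >>> import numpy as np
    >>> from mindspore import Tensor, ms_function
    >>> from mindspore.ops import functional as F
    >>>
    >>> @ms_function
    ... def tensor_add(x, y):
    ...     # compiled into a graph on the first call
    ...     z = F.tensor_add(x, y)
    ...     return z
    >>>
    >>> x = Tensor(np.ones([1, 1, 3, 3]).astype(np.float32))
    >>> y = Tensor(np.ones([1, 1, 3, 3]).astype(np.float32))
    >>> out = tensor_add(x, y)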

@@ -58,6 +58,8 @@ def set_seed(seed):
         TypeError: If seed isn't a int.
     Examples:
+        >>> from mindspore.ops import composite as C
+        >>>
         >>> # 1. If global seed is not set, numpy.random and initializer will choose a random seed:
         >>> np_1 = np.random.normal(0, 1, [1]).astype(np.float32) # A1
         >>> np_1 = np.random.normal(0, 1, [1]).astype(np.float32) # A2
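A hedged, runnable sketch of the behaviour the amended `set_seed` example relies on. The seed value, shapes and `C.uniform` bounds are illustrative, not part of this change:

    >>> import numpy as np
    >>> from mindspore import Tensor, set_seed
    >>> from mindspore import dtype as mstype
    >>> from mindspore.common.initializer import initializer
    >>> from mindspore.ops import composite as C
    >>>
    >>> # Once the global seed is fixed, numpy.random, initializer and C.uniform
    >>> # all derive deterministic sub-seeds, so re-running reproduces the values.
    >>> set_seed(1234)
    >>> np_1 = np.random.normal(0, 1, [1]).astype(np.float32)
    >>> init_1 = initializer("uniform", [1, 3], mstype.float32)
    >>> rand_1 = C.uniform((1, 4), Tensor(1.0, mstype.float32), Tensor(2.0, mstype.float32), seed=2)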

@@ -105,10 +105,11 @@ class Model:
         >>>         return out
         >>>
         >>> net = Net()
-        >>> loss = nn.SoftmaxCrossEntropyWithLogits(is_grad=False, sparse=True)
+        >>> loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
         >>> optim = Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
         >>> model = Model(net, loss_fn=loss, optimizer=optim, metrics=None)
-        >>> dataset = get_dataset()
+        >>> # For details about how to build the dataset, please refer to the tutorial document on the official website.
+        >>> dataset = create_custom_dataset()
         >>> model.train(2, dataset)
     """
@@ -514,9 +515,6 @@ class Model:
        When setting pynative mode or CPU, the training process will be performed with dataset not sink.
        Note:
-            If dataset_sink_mode is True, epoch of training should be equal to the count of repeat
-            operation in dataset processing. Otherwise, errors could occur since the amount of data
-            is not equal to the required amount of training.
            If dataset_sink_mode is True, data will be sent to device. If device is Ascend, features
            of data will be transferred one by one. The limitation of data transmission per time is 256M.
            If sink_size > 0, each epoch the dataset can be traversed unlimited times until you get sink_size
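Since the removed note concerned how training epochs interact with dataset sinking, a short hedged illustration of the `sink_size` behaviour described above (assuming an Ascend or GPU target where sink mode is supported; the numbers are arbitrary):

    >>> # With sink mode on, each epoch pulls sink_size steps from the dataset,
    >>> # traversing it as many times as needed to collect them.
    >>> model.train(2, dataset, dataset_sink_mode=True, sink_size=100)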
@@ -541,7 +539,7 @@ class Model:
                If dataset_sink_mode is False, set sink_size as invalid. Default: -1.
        Examples:
-            >>> dataset = get_dataset()
+            >>> dataset = create_custom_dataset()
            >>> net = Net()
            >>> loss = nn.SoftmaxCrossEntropyWithLogits(is_grad=False, sparse=True)
            >>> loss_scale_manager = FixedLossScaleManager()
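For completeness, a hedged sketch of how the surrounding example typically wires the loss scale manager into `Model`; the loss_scale value and drop_overflow_update flag are illustrative, not mandated by this change:

    >>> from mindspore.train.loss_scale_manager import FixedLossScaleManager
    >>>
    >>> loss_scale_manager = FixedLossScaleManager(loss_scale=1024.0, drop_overflow_update=False)
    >>> optim = Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
    >>> model = Model(net, loss_fn=loss, optimizer=optim, metrics=None,
    ...               loss_scale_manager=loss_scale_manager)
    >>> model.train(2, dataset, dataset_sink_mode=True)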
@@ -659,7 +657,7 @@ class Model:
            Dict, which returns the loss value and metrics values for the model in the test mode.
        Examples:
-            >>> dataset = get_dataset()
+            >>> dataset = create_custom_dataset()
            >>> net = Net()
            >>> loss = nn.SoftmaxCrossEntropyWithLogits(is_grad=False, sparse=True)
            >>> model = Model(net, loss_fn=loss, optimizer=None, metrics={'acc'})
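A short hedged follow-up showing how this evaluation example is usually completed; the returned dictionary's keys depend on the configured metrics, `'acc'` here simply following the line above:

    >>> # returns a dict mapping each configured metric name to its value, e.g. {'acc': ...}
    >>> acc = model.eval(dataset, dataset_sink_mode=False)
    >>> print(acc)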
