For more details please check out our [Architecture Guide](https://www.mindspore.cn/doc/note/en/master/design/mindspore/architecture.html).
TensorFlow adopted static computation graphs in its early days, whereas PyTorch chose dynamic computation graphs.
MindSpore takes a different approach: automatic differentiation based on source-code transformation. On the one hand, it supports automatic differentiation of native control flow, so building models is as convenient as in PyTorch. On the other hand, MindSpore can perform static compilation optimization on neural networks to achieve great performance.
The implementation of MindSpore automatic differentiation can be understood as symbolic differentiation of the program itself. Because MindSpore IR is a functional intermediate representation, it corresponds intuitively to composite functions in basic algebra: the derivative of a composite function built from arbitrary basic functions can be derived mechanically. Each primitive operation in MindSpore IR corresponds to a basic function, and these primitives can be composed into more complex flow control.
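The idea above can be sketched in plain Python: give every primitive its own derivative rule, then let the chain rule compose them mechanically. This is a minimal toy illustration of differentiation-by-transformation, not MindSpore's actual implementation; the `PRIMITIVES` table and `value_and_grad` helper are hypothetical names for this sketch.

```python
import math

# Toy sketch (NOT MindSpore's real machinery): each primitive carries its
# own derivative rule, and the chain rule composes them mechanically.
PRIMITIVES = {
    "square": (lambda x: x * x, lambda x: 2 * x),
    "sin":    (math.sin, math.cos),
    "exp":    (math.exp, math.exp),
}

def value_and_grad(ops):
    """Compose primitives left-to-right; return f(x) and f'(x) via the chain rule."""
    def f(x):
        y, dy = x, 1.0
        for name in ops:
            fn, dfn = PRIMITIVES[name]
            dy = dfn(y) * dy   # chain rule: d fn(y)/dx = fn'(y) * dy/dx
            y = fn(y)
        return y, dy
    return f

# Example: d/dx sin(x^2) = cos(x^2) * 2x, evaluated at x = 1.5
f = value_and_grad(["square", "sin"])
y, dy = f(1.5)
```

Because every step is a lookup in a table of known derivative rules, the same mechanism extends to any program built from registered primitives, including ones containing control flow.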
The goal of MindSpore automatic parallelism is to build a training mode that combines data parallelism, model parallelism, and hybrid parallelism. It automatically selects the model-splitting strategy with the lowest cost to achieve automatic distributed parallel training.
At present, MindSpore uses a fine-grained parallel strategy that splits at the operator level: each operator in the computation graph is split across the cluster to run in parallel. The resulting splitting strategy can be very complicated, but as a developer who prefers Pythonic code you don't need to care about the underlying implementation, as long as the top-level API computes efficiently.
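To make the "least-cost strategy" idea concrete, here is a hedged toy cost model for a single matrix-multiply operator. It compares two classic ways of splitting the operator and picks the cheaper one; the cost formulas, the function names, and the two-strategy menu are simplifications invented for this sketch, not MindSpore's real cost model.

```python
# Hypothetical cost model (NOT MindSpore's actual algorithm): for one
# (m x k) @ (k x n) matmul, estimate the per-step communication volume
# of two splitting strategies and choose the cheaper one.

def comm_cost(strategy, m, k, n):
    """Rough per-device communication volume, in matrix elements."""
    if strategy == "data_parallel":
        # Rows of the input are split across devices; the gradient of the
        # full (k x n) weight must be all-reduced each step.
        return k * n
    if strategy == "model_parallel":
        # Columns of the weight are split; each device needs the full
        # (m x k) activations instead.
        return m * k
    raise ValueError(f"unknown strategy: {strategy}")

def choose_strategy(m, k, n):
    return min(("data_parallel", "model_parallel"),
               key=lambda s: comm_cost(s, m, k, n))

# Large batch, small weight -> data parallelism communicates less.
# Small batch, huge weight  -> model parallelism communicates less.
```

A real auto-parallel search enumerates candidate splittings for every operator in the graph and minimizes a global cost, but the per-operator decision has this same shape: estimate, compare, pick the minimum.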
## Installation
```
If you want to learn more about how the MindSpore docker images are built,
please check out [docker](https://gitee.com/mindspore/mindspore/blob/master/docker/README.md) for the details.
## Quickstart
## Contributing
Contributions are welcome. See our [Contributor Wiki](https://gitee.com/mindspore/mindspore/blob/master/CONTRIBUTING.md) for
more details.
## Maintenance phases
Project stable branches will be in one of the following states: