"update img alt"

dongzhihong 8 years ago
parent a7e3325aad
commit bc9e20d9ed

@@ -60,7 +60,7 @@ A backward network is a series of backward operators. The main idea of building
1. Op
when the input forward network is an Op, return its gradient Operator Immediately. If all of its outputs are in no gradient set, then return a special `NOP`.
When the input forward network is an Op, return its gradient Operator immediately. If all of its outputs are in the no-gradient set, then return a special `NOP`.
2. NetOp
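
Rule 1 above boils down to a small predicate over the forward op's outputs. The following is a minimal sketch of that rule; the types and names (`Op`, `CreateGradOp`, `BackwardForOp`) are hypothetical stand-ins, not the framework's actual API.

```cpp
#include <memory>
#include <set>
#include <string>
#include <vector>

// Hypothetical, minimal stand-in for the framework's operator type.
struct Op {
  std::string type;
  std::vector<std::string> outputs;
};

// Stand-in for "create the registered gradient operator of `op`".
std::unique_ptr<Op> CreateGradOp(const Op& op) {
  return std::make_unique<Op>(Op{op.type + "_grad", {}});
}

// Rule 1: if every output of the forward op is in the no-gradient set,
// return a special NOP; otherwise return the op's gradient operator.
std::unique_ptr<Op> BackwardForOp(const Op& op,
                                  const std::set<std::string>& no_grad) {
  bool all_outputs_ignored = true;
  for (const auto& out : op.outputs) {
    if (no_grad.count(out) == 0) {
      all_outputs_ignored = false;
      break;
    }
  }
  if (all_outputs_ignored) {
    return std::make_unique<Op>(Op{"NOP", {}});
  }
  return CreateGradOp(op);
}
```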
@@ -72,12 +72,12 @@ A backward network is a series of backward operators. The main idea of building
4. Sharing Variables
**sharing variables**. As illustrated in the pictures, two operator's `Output` `Gradient` will overwrite their sharing input variable.
**sharing variables**. As illustrated in the pictures, two operators share the same variable name `W@GRAD`, which will overwrite their shared input variable.
<p align="center">
<img src="./images/duplicate_op.png" width="50%" ><br/>
<img src="./images/duplicate_op.png" alt="Sharing variables in operators." width="50%"><br/>
pic 1. Sharing variables in operators.
pic 1.
</p>
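
To make the overwrite hazard in rule 4 concrete, here is a small, self-contained sketch that detects when two backward operators write the same gradient variable. The `GradOp` struct and the names such as `fc1_grad` and `W@GRAD` are illustrative only, not the framework's real data structures.

```cpp
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Hypothetical backward op: just a name and the gradient variables it writes.
struct GradOp {
  std::string name;
  std::vector<std::string> outputs;
};

int main() {
  // Two backward operators that both write W@GRAD because their forward
  // counterparts shared the input variable W.
  std::vector<GradOp> net = {{"fc1_grad", {"W@GRAD", "x1@GRAD"}},
                             {"fc2_grad", {"W@GRAD", "x2@GRAD"}}};

  // Count how many ops write each gradient variable; a count > 1 means a
  // later op would silently overwrite an earlier op's result.
  std::map<std::string, int> writers;
  for (const auto& op : net) {
    for (const auto& out : op.outputs) {
      ++writers[out];
    }
  }
  for (const auto& [var, cnt] : writers) {
    if (cnt > 1) {
      std::cout << var << " is written by " << cnt
                << " operators; naive sharing would overwrite it.\n";
      // A typical resolution (assumed here, not shown in this diff) is to
      // rename each duplicate and insert an add operator that sums the
      // renamed copies back into the shared gradient variable.
    }
  }
  return 0;
}
```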
