From 9d569c5a38582cbf9022578c046f89a88697c493 Mon Sep 17 00:00:00 2001 From: fengjiayi Date: Thu, 3 Aug 2017 17:57:00 -0700 Subject: [PATCH 1/6] Update Backward.md Add the "Backward Operator Registry" section --- paddle/framework/backward.md | 24 ++++++++++++++++++++++-- 1 file changed, 22 insertions(+), 2 deletions(-) diff --git a/paddle/framework/backward.md b/paddle/framework/backward.md index 74c001b06a..61f308b469 100644 --- a/paddle/framework/backward.md +++ b/paddle/framework/backward.md @@ -1,8 +1,28 @@ -## Operator/expression 's Backward +# Operator/expression's Backward -### Motivation +## Motivation In Neural Network, the backpropagation algorithm follows the chain rule, so we need to compound the fundamental gradient operators/expressions together with the chain rule. Every forward network needs a backward network to construct the full computation lineage; the operator/expression's backward feature will generate the backward pass with respect to the forward pass. + +## Backward Operator Registry + +A backward network is built up with several backward operators. Backward operators take the forward operators' inputs, outputs and output gradients, and then calculate their input gradients. In most cases, there is a one-to-one correspondence between forward and backward operators. We use a registry mechanism to save these correspondences, which is quite similar to the operator registry itself. + +For example, we have an `add_two_op`, which is registered by the following code: + +```cpp
+REGISTER_OP(add_two, AddTwoOp, AddTwoOpMaker);
+```
 + +`add_two` is the operator's type. `AddTwoOp` and `AddTwoOpMaker` are the operator class and the operator maker class respectively. + +Assume that we have also got the backward operator of `add_two_op`, which calculates the gradients of `add_two_op`'s inputs. 
Then we register it in the following way: + +```cpp
+REGISTER_GRADIENT_OP(add_two, add_two_grad, AddTwoGradOp);
+```
 + +`add_two_grad` is the type of the backward operator, and `AddTwoGradOp` is its class name. ### Implement : gradient operator registry From 7304006b7121c844d071227a6c2d24245a06e32e Mon Sep 17 00:00:00 2001 From: fengjiayi Date: Tue, 8 Aug 2017 16:38:27 -0700 Subject: [PATCH 2/6] Update backward.md --- paddle/framework/backward.md | 29 ++++++++++++++++++++--------- 1 file changed, 20 insertions(+), 9 deletions(-) diff --git a/paddle/framework/backward.md b/paddle/framework/backward.md index 61f308b469..c717c2f30b 100644 --- a/paddle/framework/backward.md +++ b/paddle/framework/backward.md @@ -24,20 +24,31 @@ REGISTER_GRADIENT_OP(add_two, add_two_grad, AddTwoGradOp); `add_two_grad` is the type of the backward operator, and `AddTwoGradOp` is its class name. -### Implement : gradient operator registry +## Backward Operator Creation -| | forward operator | backward operator | -| ---------------------- | ---------------- | -------------------------------- | -| **Operator::inputs_** | Inputs | Inputs, Outputs, OutputGradients | -| **Operator::outputs_** | Outputs | InputGradients | +### Usage -Inputs/Outputs means the input/output of the operator, InputGradients/OutputGradients is the gradient respect to forward opeartor. Forward operator and Backward operator are isomorphic, save their corresponding needs into member attribute. +Given a certain forward operator, we can get its corresponding backward operator by calling: -We use a global hash map record the gradient operators available, follow the philosophy of minimum core, make operator pluggable unit. Each gradient is an operator and it needs to regist itself. +```cpp
+OperatorBase* bwd_op = BuildGradOp(fwd_op);  // OperatorBase* BuildGradOp(const OperatorBase* fwd_op);
+```
 + +The function `BuildGradOp` will sequentially execute the following processes: + +1. 
Getting the `type_` of the given forward operator, and then creating the corresponding backward operator. + +2. Copying all the attributes of the forward operator except `input_format` and `output_format` (if any), for their elements differ between forward and backward operators. + +3. Copying the forward operator's `inputs_` and `outputs_` to the backward operator's `inputs_`, adding the forward inputs' gradient variables into the backward `outputs_`, and adding the forward outputs' gradient variables into the backward `inputs_`. + +4. Building the backward operator's `input_format`, `output_format` (if necessary) and `in_out_idxs_` according to its `inputs_` and `outputs_` just created. + +## Backward Network Building -A backward network is a series of backward operators. The main idea of building a backward network is creating backward operators in reverse order and putting them together. -grad_op_builder(fengjiayi) -### Implement : Backward network +In our design, the network itself is also a kind of operator, so the operators contained in a big network may themselves be small networks. Given a forward network, it generates the backward network. We only care about the gradients: `OutputGradients` and `InputGradients`. From bb5c656b574b1e518da981d781db0e1e0a0e4d75 Mon Sep 17 00:00:00 2001 From: fengjiayi Date: Sat, 26 Aug 2017 19:15:31 -0700 Subject: [PATCH 3/6] test --- paddle/framework/backward.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/paddle/framework/backward.md b/paddle/framework/backward.md index c717c2f30b..d5dbd57d19 100644 --- a/paddle/framework/backward.md +++ b/paddle/framework/backward.md @@ -6,7 +6,7 @@ In Neural Network, the backpropagation algorithm follows the chain rule, so we n ## Backward Operator Registry -A backward network is built up with several backward operators. Backward operators take the forward operators' inputs, outputs and output gradients, and then calculate their input gradients. 
In most cases, there is a one-to-one correspondence between forward and backward operators. We use a registry mechanism to save these correspondences, which is quite similar to the operator registry itself. +A backward network is built up with several backward operators. Backward operators take the forward operators' inputs, outputs and output gradients and then calculate their input gradients. In most cases, there is a one-to-one correspondence between forward and backward operators. We use a registry mechanism to save these correspondences, which is quite similar to the operator registry itself. From 4590f793f111dd4fc5134ca9bbd0a213b41962b7 Mon Sep 17 00:00:00 2001 From: fengjiayi Date: Sun, 27 Aug 2017 17:37:41 -0700 Subject: [PATCH 4/6] Update backward document --- paddle/framework/backward.md | 24 ++++++++---------------- 1 file changed, 8 insertions(+), 16 deletions(-) diff --git a/paddle/framework/backward.md b/paddle/framework/backward.md index b4205fed2e..133b17c7be 100644 --- a/paddle/framework/backward.md +++ b/paddle/framework/backward.md @@ -2,32 +2,24 @@ ## Motivation -In Neural Network, the backpropagation algorithm follows the chain rule, so we need to compound the fundamental gradient operators/expressions together with the chain rule. Every forward network needs a backward network to construct the full computation lineage; the operator/expression's backward feature will generate the backward pass with respect to the forward pass. - +In Neural Network, the backpropagation algorithm follows the chain rule, so we need to compound the fundamental gradient operators/expressions together with the chain rule. Every forward network needs a backward network to construct the full computation lineage; the operator/expression's backward pass will be generated with respect to the forward pass. + ## Backward Operator Registry -A backward network is built up with several backward operators. 
Backward operators take the forward operators' inputs, outputs and output gradients and then calculate their input gradients. In most cases, there is a one-to-one correspondence between forward and backward operators. We use a registry mechanism to save these correspondences, which is quite similar to the operator registry itself. +A backward network is built up with several backward operators. Backward operators take the forward operators' inputs, outputs and output gradients and then calculate their input gradients. In most cases, there is a one-to-one correspondence between forward and backward operators. We use a registry mechanism to save these correspondences. For example, we have an `add_two_op`, which is registered by the following code: ```cpp
-REGISTER_OP(add_two, AddTwoOp, AddTwoOpMaker);
+REGISTER_OP(add_two, AddTwoOp, AddTwoOpMaker, add_two_grad, AddTwoGradOp);
``` `add_two` is the operator's type. `AddTwoOp` and `AddTwoOpMaker` are the operator class and the operator maker class respectively. -Assume that we have also got the backward operator of `add_two_op`, which calculates the gradients of `add_two_op`'s inputs. Then we register it in the following way: - -```cpp
-REGISTER_GRADIENT_OP(add_two, add_two_grad, AddTwoGradOp);
-```
- `add_two_grad` is the type of the backward operator, and `AddTwoGradOp` is its class name. ## Backward Operator Creation -### Usage - Given a certain forward operator, we can get its corresponding backward operator by calling: ```cpp
OperatorBase* bwd_op = BuildGradOp(fwd_op);  // OperatorBase* BuildGradOp(const OperatorBase* fwd_op);
``` The function `BuildGradOp` will sequentially execute the following processes: -1. Getting the `type_` of the given forward operator, and then creating the corresponding backward operator. +1. Get the `type_` of the given forward operator, and then get the corresponding backward operator's type by looking up the `OpInfoMap`. -2. 
Copying all the attributes of the forward operator except `input_format` and `output_format` (if any), for their elements differ between forward and backward operators. +2. Build two maps named `inputs` and `outputs` to temporarily store the backward operator's inputs and outputs. Copy the forward operator's `inputs_` and `outputs_` to map `inputs`, except those that are not necessary for gradient computing. -3. Copying the forward operator's `inputs_` and `outputs_` to the backward operator's `inputs_`, adding the forward inputs' gradient variables into the backward `outputs_`, and adding the forward outputs' gradient variables into the backward `inputs_`. +3. Add the forward inputs' gradient variables into map `outputs`, and add the forward outputs' gradient variables into map `inputs`. -4. Building the backward operator's `input_format`, `output_format` (if necessary) and `in_out_idxs_` according to its `inputs_` and `outputs_` just created. +4. Build the backward operator with `inputs`, `outputs` and the forward operator's attributes. ## Backward Network Building From 980edfa69a72f57dea689d1d5b1bff6b388e7a71 Mon Sep 17 00:00:00 2001 From: fengjiayi Date: Mon, 28 Aug 2017 11:34:24 -0700 Subject: [PATCH 5/6] Refine backward document --- paddle/framework/backward.md | 11 +++++++++-- 1 file changed, 9 insertions(+), 2 deletions(-) diff --git a/paddle/framework/backward.md b/paddle/framework/backward.md index 133b17c7be..ce324a73f0 100644 --- a/paddle/framework/backward.md +++ b/paddle/framework/backward.md @@ -6,9 +6,16 @@ In Neural Network, the backpropagation algorithm follows the chain rule, so we n ## Backward Operator Registry -A backward network is built up with several backward operators. 
Backward operators take the forward operators' inputs, outputs and output gradients and then calculate their input gradients.

-| | forward operator | backward operator
-| ---------------------- | ---------------- |------------------------- |
-| **Operator::inputs_** | Inputs | Inputs, Outputs, OutputGradients |
-| **Operator::outputs_** | Outputs | InputGradients |

 In most cases, there is a one-to-one correspondence between forward and backward operators. These correspondences are recorded by a global hash map (`OpInfoMap`). To follow the philosophy of minimum core and make operators pluggable, the registry mechanism is introduced.

-For example, we have an `add_two_op`, which is registered by the following code:
+For example, we have an `add_two_op`, and we can register its information and corresponding backward operator by the following macro:

```cpp
REGISTER_OP(add_two, AddTwoOp, AddTwoOpMaker, add_two_grad, AddTwoGradOp);
```

From eaeb69f98f70bbea4fe4aae9f7c7b830f75959c5 Mon Sep 17 00:00:00 2001 From: fengjiayi Date: Mon, 28 Aug 2017 13:47:37 -0700 Subject: [PATCH 6/6] Follow reviewer's comments --- paddle/framework/backward.md | 18 +++++++++--------- 1 file changed, 9 insertions(+), 9 deletions(-) diff --git a/paddle/framework/backward.md b/paddle/framework/backward.md index ce324a73f0..8aa6728a95 100644 --- a/paddle/framework/backward.md +++ b/paddle/framework/backward.md @@ -2,28 +2,28 @@ ## Motivation -In Neural Network, the backpropagation algorithm follows the chain rule, so we need to compound the fundamental gradient operators/expressions together with the chain rule. 
Every forward network needs a backward network to construct the full computation graph; the operator/expression's backward pass will be generated with respect to the forward pass. ## Backward Operator Registry A backward network is built up with several backward operators. Backward operators take the forward operators' inputs, outputs and output gradients and then calculate their input gradients.

--| | forward operator | backward operator
--| ---------------------- | ---------------- |------------------------- |
--| **Operator::inputs_** | Inputs | Inputs, Outputs, OutputGradients |
--| **Operator::outputs_** | Outputs | InputGradients |
+| | forward operator | backward operator
+| ---------------------- | ---------------- |------------------------- |
+| **Operator::inputs_** | Inputs | Inputs, Outputs, OutputGradients |
+| **Operator::outputs_** | Outputs | InputGradients |

 In most cases, there is a one-to-one correspondence between forward and backward operators. These correspondences are recorded by a global hash map (`OpInfoMap`). To follow the philosophy of minimum core and make operators pluggable, the registry mechanism is introduced.

-For example, we have an `add_two_op`, and we can register its information and corresponding backward operator by the following macro:
+For example, we have a `mul_op`, and we can register its information and corresponding backward operator by the following macro:

```cpp
-REGISTER_OP(add_two, AddTwoOp, AddTwoOpMaker, add_two_grad, AddTwoGradOp);
+REGISTER_OP(mul, MulOp, MulOpMaker, mul_grad, MulOpGrad);
```

-`add_two` is the operator's type. `AddTwoOp` and `AddTwoOpMaker` are the operator class and the operator maker class respectively. +`mul` is the operator's type. `MulOp` and `MulOpMaker` are the operator class and the operator maker class respectively.

-`add_two_grad` is the type of the backward operator, and `AddTwoGradOp` is its class name. +`mul_grad` is the type of the backward operator, and `MulOpGrad` is its class name. 
## Backward Operator Creation