map_dump_file_to_code

pull/9580/head
jiangzhenguang 4 years ago
parent 8ad2b47af4
commit 0dfa15a6b3

@@ -46,6 +46,7 @@
#include "utils/config_manager.h"
#include "debug/data_dump/dump_json_parser.h"
#include "debug/tensor_load.h"
#include "debug/anf_ir_utils.h"
#include "backend/optimizer/graph_kernel/basic_ops_fusion.h"
#include "backend/optimizer/graph_kernel/eliminate_redundant_output.h"
#include "backend/optimizer/graph_kernel/tensor_promotion.h"
@@ -1041,6 +1042,7 @@ void AscendSession::DumpAllGraphs(const std::vector<KernelGraphPtr> &all_graphs)
std::string file_name = "graph_build_" + std::to_string(graph->graph_id()) + ".ir";
DumpIR(file_name, graph, true);
DumpIRProto(graph, "vm_build_" + std::to_string(graph->graph_id()));
DumpIR("trace_code_graph", graph, true, kWholeStack);
}
#endif
}

@@ -1,104 +0,0 @@
# Locating the Source Code for an Operator Error
## Purpose and Applicable Scenarios
When an operator error occurs during computation with MindSpore, users want to trace the error message back to the corresponding Python source code. The existing operator compilation failures (PreCompileProcessFailed) and execution failures (run task error) only print the operator name to the terminal and cannot point directly to the line in the user code where the operator is called. This document explains how to use the operator's error information to find the corresponding source code in the ANF graph files.
This guide applies to computation running on **Ascend hardware**; operator fusion scenarios and errors raised by backward operators are not covered.
## Parsing Flow and Helper Tool Usage
1. The main flow for finding the corresponding source line:
① Obtain the full name (full_name) of the failing operator. The full_name consists of the scope and the operator name; the scope starts with Default (forward) or Gradients (backward), and only the Default case is considered here.
② Obtain the failing operator's input, output, and attribute information.
③ Use the full_name and the input/output information to locate the operator, and its code line number, in the [0-10]_py_pre_ad.dat file (preferred) or the [0-10]_validate.dat file.
2. Three steps to use the script when an operator error occurs:
① In the training script, set context.set_context(mode=context.GRAPH_MODE, save_graphs=True) to save the graph files (a minimal sketch is given after the command block below).
② When running the code, redirect the log output to a file, e.g. python xxx.py &> log_name &.
③ Use the helper script to parse the failing operator's full name and input/output information, and to try to find the corresponding code line.
&nbsp; Script name: **find_error_operator_in_code.py**
&nbsp; Usage:
```
python3 find_error_operator_in_code.py
--log_path [the path of the log, default is the current path] (optional)
--log_name [the file name of the log] (required)
```
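A minimal sketch of step ① (the network definition and training loop are placeholders; only the `save_graphs=True` setting is required by this guide, and `device_target` is shown for the Ascend setup described above):
```python
# Sketch: enable graph file saving in the training script (step ①).
from mindspore import context

context.set_context(mode=context.GRAPH_MODE, save_graphs=True, device_target="Ascend")

# ... build the network and run training as usual ...
# Step ②: launch the script and redirect its output to a file:
#   python xxx.py &> log_name &
```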
3. Parsing results
There are typically three cases when parsing the file:
① Full match (the operator name and the shapes and dtypes of both inputs and outputs all match):
```
[INFO] Detect "task exception error". 【detected error type】
[INFO] Find operation 1 times! 【number of matches found in the Python source】
[INFO] In file test_mul.py(29)/ return self.mul(x, y) 【matched file name, line number, and the code on that line】
[INFO] Exception operator is "Default/Mul". 【the failing operator】
[INFO] Have 2 input in operator: 【the failing operator has 2 inputs】
input 1/1th: dtype is float16, shape is [526338, 21]. 【dtype and shape of the first input; 1/1th means the 1st value of the first input】
input 1/2th: dtype is float16, shape is [526338, 1]. 【dtype and shape of the second input】
[INFO] Have 1 output in operator: 【the failing operator has 1 output】
output 1/1th: dtype is float16, shape is [526338, 21]. 【dtype and shape of the first output; 1/1th means the 1st value of the first output】
```
② Partial match (the operator name matches, but only the input shapes/dtypes or only the output shapes/dtypes match):
```
[INFO] Detect "compile error". 【detected error type】
[INFO] Find operation 1 times! 【number of matches found in the Python source】
[INFO] In file split_ops.py(17)/ return self.net(input) 【matched file name, line number, and the code on that line】
[INFO] Exception operator is "Default/Split". 【the failing operator】
[INFO] Have 1 input in operator: 【the failing operator has 1 input】
input 1/1th: dtype is float32, shape is [32, 192, 56, 56]. 【dtype and shape of the first input; 1/1th means the 1st value of the first input】
[WARNING] Cannot match output information! Please check whether the operator's output is: 【warning: the outputs did not fully match】
[INFO] Have 1 output in operator: 【the failing operator has 1 output】
output 1/1th: dtype is float32, shape is [32, 6, 56, 56]. 【dtype and shape of the first output; 1/1th means the 1st value of the first output】
output 2/1th: dtype is float32, shape is [32, 6, 56, 56]. 【dtype and shape of the first output; 2/1th means the 2nd value of the first output】
output 3/1th: dtype is float32, shape is [32, 6, 56, 56]. 【dtype and shape of the first output; 3/1th means the 3rd value of the first output】
```
③ No match (the operator name is not matched, or the name matches but the inputs/outputs do not):
```
[INFO] Detect "task exception error". 【detected error type】
[WARNING] Cannot find operation! Need to find in the script based on the following information: 【warning: the operator was not matched in the source code】
[INFO] Exception operator full name is "Default/test". 【the failing operator】
[INFO] Have 2 input in operator: 【the failing operator has 2 inputs】
input 1/1th: dtype is float16, shape is [526338, 21]. 【dtype and shape of the first input; 1/1th means the 1st value of the first input】
input 1/2th: dtype is float16, shape is [526338, 1]. 【dtype and shape of the second input】
[INFO] Have 1 output in operator: 【the failing operator has 1 output】
output 1/1th: dtype is float16, shape is [526338, 21]. 【dtype and shape of the first output; 1/1th means the 1st value of the first output】
[WARNING] Do you want to research in source code? set source code path to research or press enter to research in current path, input n/no to exit.
Input: 【when no code is matched, the tool asks whether to search the source: enter n/no to skip, enter a code path to search that path, or press Enter to search the current path】
```
4. Manual code lookup
There are special cases that the tool cannot resolve. For example, when the user calls an nn.Cell layer provided by the framework, the reported line points into the framework code. In that case, use the full_name and the input/output information reported by the tool to locate the corresponding code in your own source.
As an example of manually locating an operator by full_name and shape, suppose the full_name is Default/network/network/aspp/aspp_pooling/ResizeNearestNeighbor and the input has shape [8, 256, 1, 1] with dtype float32.
Its scope is Default/network/network/aspp/aspp_pooling and the operator name is ResizeNearestNeighbor. Note that Default and network in the scope are filled in automatically: Default marks the forward graph and network is the network name.
Looking at the user code below, first analyze the scope Default/network/network/aspp/aspp_pooling. From network/aspp, the definition and call sites are at lines 26 and 31; from network/aspp/aspp_pooling, the definition and call sites are at lines 4 and 8; the operator name ResizeNearestNeighbor then leads to the definition and call sites at lines 16 and 19. If the same operator name appears more than once under the same scope, use the input shapes and dtypes to tell them apart (a small sketch of splitting the full_name follows the code below).
```
1 class ASPP(nn.Cell):
2 def __init__(self):
3 super(ASPP, self).__init__()
4 self.aspp_pooling = ASPPPooling()
5 self.drop = nn.Dropout(0.3)
6
7 def construct(self, x):
8 x5 = self.aspp_pooling(x)
9 x = self.drop(x)
10 return x
11
12 class ASPPPooling(nn.Cell):
13 def __init__(self):
14 super(ASPPPooling, self).__init__()
15 self.shape = P.Shape()
16 self.resizenearestneighbor = P.ResizeNearestNeighbor((size[2], size[3]), True)
17 def construct(self, x):
18 size = self.shape(x)
19 out = self.resizenearestneighbor(x)
20 return out
21
22 # main structure
23 class DeepLabV3(nn.Cell):
24 def __init__(self, phase='train', num_classes=21, output_stride=16, freeze_bn=False):
25 super(DeepLabV3, self).__init__()
26 self.aspp = ASPP()
27 self.shape = P.Shape()
28
29 def construct(self, x):
30 size = self.shape(x)
31 out = self.aspp(x)
32 return out
```
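The scope/operator-name split used above can also be done mechanically; a minimal sketch with the full_name from this example:
```python
# Sketch: split a full_name into its scope and operator name.
full_name = "Default/network/network/aspp/aspp_pooling/ResizeNearestNeighbor"
scope, op_name = full_name.rsplit("/", 1)
print(scope)    # Default/network/network/aspp/aspp_pooling
print(op_name)  # ResizeNearestNeighbor
```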

@@ -0,0 +1,96 @@
# Mapping Dump Data Files to the Script Source Code
## Purpose and Applicable Scenarios
When debugging a computation with MindSpore and a precision issue is suspected, dump files can be produced for comparison. In that situation, users want to know which Python source code each data file in the dump directory corresponds to.
This document explains how to use this tool to map a data file back to the Python source code.
This guide applies to computation running on **Ascend hardware**.
## Helper Tool Usage
1. Three steps to use the script:
① In the training script, set context.set_context(mode=context.GRAPH_MODE, save_graphs=True) to save the graph files.
② Enable the dump data feature; see <https://www.mindspore.cn/tutorial/training/zh-CN/r1.0/advanced_use/custom_debugging_info.html>.
③ Take the op_num from the dump data file name and parse it with the helper script (a sketch of extracting it follows the command block below). For example, for the data file Default--network-TrainOneStepCell--network-WithLossCell--_backbone-
&nbsp; &nbsp; ResNet--layer2-SequentialCell--0-ResidualBlock--conv2-Conv2d--Cast-op954_input_0_shape_128_128_3_3_kNumberTypeFloat32_DefaultFormat.bin,
&nbsp; &nbsp; the segment Cast-op954 shows that the operator's op_num is op954.
Script name: **map_file_to_code.py**; &nbsp; Usage:
```
python3 map_file_to_code.py
--graph_path(-p) [the graph path, default is the current path] (optional)
--dump_op(-o) [dump operator id, case insensitive, such as 'op954'] (required)
For example:
python3 map_file_to_code.py -p graph_path -o op954
```
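For step ③, a minimal sketch of reading the operator name and op_num out of a dump data file name (the file name is the example from step ③; the string handling is only illustrative):
```python
# Sketch: derive the operator name and op_num from a dump data file name.
# The segment right before "_input_"/"_output_" has the form "<OpName>-op<NUM>".
dump_file = ("Default--network-TrainOneStepCell--network-WithLossCell--_backbone-"
             "ResNet--layer2-SequentialCell--0-ResidualBlock--conv2-Conv2d--"
             "Cast-op954_input_0_shape_128_128_3_3_kNumberTypeFloat32_DefaultFormat.bin")

op_field = dump_file.split("_input_")[0].split("_output_")[0]  # "...Cast-op954"
op_name, op_num = op_field.rsplit("--", 1)[-1].split("-op")    # "Cast", "954"
print(op_name, "op" + op_num)                                  # Cast op954
```
The op_num printed here ("op954") is what is passed to map_file_to_code.py via --dump_op.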
2. Parsing results
There are typically two cases when parsing the file:
① Matched: the call stack is printed, and the user looks up their own code in it:
```
[INFO] Start to map the dump file to source code.
[INFO] Find operation 'Cast'.
In file /data1/jzg/mindspore/mindspore/nn/layer/conv.py(253)/
output = self.conv2d(x, self.weight)
In file /data1/jzg/dump_to_code/resnet/scripts/train/src/resnet.py(166)/
out = self.conv2(out)
In file /data1/jzg/mindspore/mindspore/nn/layer/container.py(173)/
for cell in self.cell_list:
In file /data1/jzg/dump_to_code/resnet/scripts/train/src/resnet.py(323)/ # user code line
c3 = self.layer2(c2)
In file /data1/jzg/mindspore/mindspore/train/amp.py(101)/
out = self._backbone(data)
In file /data1/jzg/mindspore/mindspore/nn/wrap/cell_wrapper.py(247)/
loss = self.network(*inputs)
In file /data1/jzg/mindspore/mindspore/train/dataset_helper.py(87)/
return self.network(*outputs)
```
② Not matched: the call stack for the node cannot be found in the graph:
```
[INFO] Start to map the dump file to source code.
[WARNING] Cannot find cast's source code in ir file. # no information was found for the cast operator
```
3. Manual code lookup
There are special cases where users need to search by themselves. Replacing '--' with '/' in the dump data file name gives the operator's full_name (a sketch of this is given at the end of this section); the numbers after shape in the input and output file names are the operator's input and output shapes. Use the operator's full_name and input/output information to locate the corresponding code in your source.
As an example of manually locating an operator by full_name and shape, suppose the full_name is Default/network/network/aspp/aspp_pooling/ResizeNearestNeighbor and the input has shape [8, 256, 1, 1] with dtype float32.
Its scope is Default/network/network/aspp/aspp_pooling and the operator name is ResizeNearestNeighbor. Note that Default and network in the scope are filled in automatically: Default marks the forward graph and network is the network name.
Looking at the user code below, first analyze the scope Default/network/network/aspp/aspp_pooling. From network/aspp, the definition and call sites are at lines 26 and 31; from network/aspp/aspp_pooling, the definition and call sites are at lines 4 and 8; the operator name ResizeNearestNeighbor then leads to the definition and call sites at lines 16 and 19. If the same operator name appears more than once under the same scope, use the input shapes to tell them apart.
```
1 class ASPP(nn.Cell):
2 def __init__(self):
3 super(ASPP, self).__init__()
4 self.aspp_pooling = ASPPPooling()
5 self.drop = nn.Dropout(0.3)
6
7 def construct(self, x):
8 x5 = self.aspp_pooling(x)
9 x = self.drop(x)
10 return x
11
12 class ASPPPooling(nn.Cell):
13 def __init__(self):
14 super(ASPPPooling, self).__init__()
15 self.shape = P.Shape()
16 self.resizenearestneighbor = P.ResizeNearestNeighbor((size[2], size[3]), True)
17 def construct(self, x):
18 size = self.shape(x)
19 out = self.resizenearestneighbor(x)
20 return out
21
22 # main structure
23 class DeepLabV3(nn.Cell):
24 def __init__(self, phase='train', num_classes=21, output_stride=16, freeze_bn=False):
25 super(DeepLabV3, self).__init__()
26 self.aspp = ASPP()
27 self.shape = P.Shape()
28
29 def construct(self, x):
30 size = self.shape(x)
31 out = self.aspp(x)
32 return out
```
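As noted at the start of this section, the full_name can be recovered from a dump data file name by replacing '--' with '/'; a minimal sketch using the file name from step ③:
```python
# Sketch: recover the operator full_name from a dump data file name.
dump_file = ("Default--network-TrainOneStepCell--network-WithLossCell--_backbone-"
             "ResNet--layer2-SequentialCell--0-ResidualBlock--conv2-Conv2d--"
             "Cast-op954_input_0_shape_128_128_3_3_kNumberTypeFloat32_DefaultFormat.bin")

full_name = dump_file.split("_input_")[0].split("_output_")[0]  # drop the shape/dtype suffix
full_name = full_name.replace("--", "/")
print(full_name)
# Default/network-TrainOneStepCell/network-WithLossCell/_backbone-ResNet/layer2-SequentialCell/0-ResidualBlock/conv2-Conv2d/Cast-op954
```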

@@ -0,0 +1,156 @@
# Copyright 2020 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""map_file_to_code"""
import os
import argparse
class ParseIrInfo:
"""
Parse and return the operation info from ir file.
"""
def __init__(self, ir_file):
self.no_in_file_operation = []
self.ir_file_path = self.ir_path_parse(ir_file)
self.operation_info_dict = self.ir_info_parse()
def __len__(self):
return len(self.operation_info_dict)
def ir_path_parse(self, ir_file):
"""
parse the map file path.
"""
if ir_file == "":
print("[WARNING] No graph_path parameter, use current path as graph path.")
ir_file = os.path.abspath(os.path.dirname(__file__))
map_ir_file = ""
file_size = 0
map_ir_filename = "trace_code_graph"
for filename in os.listdir(os.path.join(ir_file)):
if map_ir_filename not in filename:
continue
tmp_file = os.path.join(ir_file, filename)
tmp_file_size = os.path.getsize(tmp_file)
if tmp_file_size >= file_size:
file_size = tmp_file_size
map_ir_file = tmp_file
if map_ir_file == "":
exit("[ERROR] Please set \"save_graphs=True\" in context to save {} file!".format(map_ir_filename))
return map_ir_file
def ir_info_parse(self):
"""
parse the ir file and save code line corresponding to the operator
"""
all_op_info_dict = {} # recode all operation info
single_op_info_dict = {} # recode single operation info
op_start_char_flag = False # Start operator fragment
op_end_char_flag = False # End of operator fragment
op_start_info_num = 0 # Accumulate the num to recode operation
operation_line = 0 # The line number of the operator
op_start_line_num = 0 # The line number of starting operator information
op_start_info_flag = False # Start operator information
with open(self.ir_file_path, 'r+') as file:
txt_context_list = file.readlines()
for line_num, txt_context in enumerate(txt_context_list):
txt_context = txt_context.strip()
# Start operator fragment
if txt_context.endswith(") {"):
op_start_char_flag = True
op_end_char_flag = False
# End of operator fragment
if txt_context == "}":
op_end_char_flag = True
# Determine whether it is operator information
if txt_context.startswith("%") and ") = " in txt_context and txt_context[1].isdigit():
op_start_info_flag = True
op_start_line_num = line_num
op_start_info_num += 1
single_op_info_dict = {"in_file": []}
# Judge and start to recode operation info
if op_start_char_flag and not op_end_char_flag and op_start_info_flag and line_num != op_start_line_num:
if "-op" in txt_context and txt_context.split("-op")[-1].split(")")[0].isdigit():
single_op_info_dict["origin_op_name"] = txt_context.split("-op")[0].split("/")[-1]
single_op_info_dict["op_name"] = txt_context.split("-op")[0].split("/")[-1].lower()
single_op_info_dict["op_num"] = "op" + txt_context.split("-op")[-1].split(")")[0]
operation_line = line_num
if "In file" in txt_context:
in_file_info = txt_context.split("#")[-1].strip().rstrip("/")
single_op_info_dict["in_file"].append(in_file_info)
if line_num - operation_line == 1 and "In file" not in txt_context and "op_num" in single_op_info_dict:
self.no_in_file_operation.append(single_op_info_dict["op_num"])
op_start_info_flag = False
all_op_info_dict[op_start_info_num] = single_op_info_dict
return all_op_info_dict
class MapOperationToLine:
"""
to show operation info
"""
def __init__(self, dump_op, ir_info_dict):
self.dump_op = dump_op
self.ir_info_dict = ir_info_dict
def show_operator_info(self):
"""
find operator
"""
origin_dump_op_name = self.dump_op.split("-")[0]
dump_op_name = origin_dump_op_name.lower()
dump_op_num = self.dump_op.split("-")[-1]
for _, op_info in self.ir_info_dict.items():
if op_info["op_num"] == dump_op_num and op_info["in_file"] is not None:
if dump_op_name in (dump_op_num, op_info["op_name"]):
if not op_info["in_file"]:
print("[WARNING] Cannot find {}'s source code in ir file.".format(op_info["origin_op_name"]))
return False
print("[INFO] Find operation '{}'.".format(op_info["origin_op_name"]))
for line in op_info["in_file"]:
print(" {}".format(line.split(" ")[0]))
print(" {}".format(line.split(" ")[-1]))
return True
print("[WARNING] Cannot find operation {}'s in ir file.".format(origin_dump_op_name))
return False
def start_find(dump_op, map_code_file):
"""
start find error operation in code.
"""
print("[INFO] Start to map the dump file to source code.")
ir_op_info_dict = ParseIrInfo(map_code_file).operation_info_dict
MapOperationToLine(dump_op, ir_op_info_dict).show_operator_info()
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Find the dump operator in the user code')
parser.add_argument('--graph_path', '-p', type=str, default="", help='The path of the saved graph files (optional)')
parser.add_argument('--dump_op', '-o', type=str.lower, default="", required=True,
help="Dump operator id, case insensitive, such as 'op3352'.")
args_opt = parser.parse_args()
start_find(args_opt.dump_op, args_opt.graph_path)