Paddle/paddle/fluid/inference/api
| Name | Last commit | Age |
| --- | --- | --- |
| demo_ci | support C++ inference shared library on windows (#24672) | 5 years ago |
| details | support C++ inference shared library on windows (#24672) | 5 years ago |
| CMakeLists.txt | [Inference] [unittest] Inference unit tests rely on dynamic libraries (#24743) | 5 years ago |
| README.md | Combine Inference Analysis with IR (#13914) | 6 years ago |
| analysis_config.cc | update the analysis predictor for multi-stream support, test=develop (#24046) | 5 years ago |
| analysis_predictor.cc | Replace all errors thrown by LOG(FATAL) with PADDLE_THROW (#24759) | 5 years ago |
| analysis_predictor.h | Add some inference API comments for AnalysisPredictor (#23242) | 5 years ago |
| analysis_predictor_tester.cc | INT8 Fully-connected (#17641) | 5 years ago |
| api.cc | support C++ inference shared library on windows (#24672) | 5 years ago |
| api_impl.cc | Add macro BOOST_GET to enrich the error information of boost :: get (#24175) | 5 years ago |
| api_impl.h | make clone thread safe (#15363) | 6 years ago |
| api_impl_tester.cc | Replace all errors thrown by LOG(FATAL) with PADDLE_THROW (#24759) | 5 years ago |
| api_tester.cc | add version support (#15469) | 6 years ago |
| helper.cc | rollback paddle_inference_helper.h to helper.h | 6 years ago |
| helper.h | add check for assigned data, test=develop (#22960) | 5 years ago |
| high_level_api.md | remove anakin from code, test=develop (#22420) | 5 years ago |
| high_level_api_cn.md | remove anakin from code, test=develop (#22420) | 5 years ago |
| mkldnn_quantizer.cc | Add macro BOOST_GET to enrich the error information of boost :: get (#24175) | 5 years ago |
| mkldnn_quantizer.h | INT8 Fully-connected (#17641) | 5 years ago |
| mkldnn_quantizer_config.cc | Add support for INT8 matmul in C-API quantization (#23463) | 5 years ago |
| paddle_analysis_config.h | support C++ inference shared library on windows (#24672) | 5 years ago |
| paddle_api.h | support C++ inference shared library on windows (#24672) | 5 years ago |
| paddle_infer_declare.h | test=develop, fix the bug of tensorrt package can't compile on windows (#24860) | 5 years ago |
| paddle_inference_api.h | remove anakin from code, test=develop (#22420) | 5 years ago |
| paddle_mkldnn_quantizer_config.h | support C++ inference shared library on windows (#24672) | 5 years ago |
| paddle_pass_builder.cc | [oneDNN] Fix to inplace pass (#24442) | 5 years ago |
| paddle_pass_builder.h | support C++ inference shared library on windows (#24672) | 5 years ago |

README.md

Embed Paddle Inference in Your Application

Paddle Inference offers APIs in both C and C++.

You can deploy a model trained by Paddle by following the two steps below:

  1. Optimize the native model;
  2. Write some code for deployment (minimal sketches are given below).
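
For step 1, the sketch below shows one way to do this; it is a minimal example rather than a complete recipe. An AnalysisConfig points at the native model, and the graph optimization passes run when the predictor is created. The ./model_dir path is a placeholder for the directory produced when the model was saved for inference.

```c++
#include "paddle_inference_api.h"

int main() {
  // Point the config at the saved (native) model. SetModel(prog_file,
  // params_file) can be used instead when the parameters sit in one file.
  paddle::AnalysisConfig config;
  config.SetModel("./model_dir");   // placeholder path
  config.DisableGpu();              // CPU inference; EnableUseGpu(...) for GPU
  config.SwitchIrOptim(true);       // apply the graph optimization passes

  // The optimized predictor is created here; deployment code (step 2)
  // then feeds inputs to it, as sketched in the last section below.
  auto predictor = paddle::CreatePaddlePredictor(config);
  return predictor != nullptr ? 0 : 1;
}
```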

The APIs

All released APIs are declared in the paddle_inference_api.h header file. Stable APIs live in the paddle namespace, while unstable APIs are kept in the paddle::contrib namespace.
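
For illustration, a few fully qualified names are shown below; AnalysisConfig, PaddlePredictor, and PaddleTensor are used here as examples of the stable API exposed by that header.

```c++
#include <vector>
#include "paddle_inference_api.h"

// Stable API: everything used here comes from namespace paddle.
paddle::AnalysisConfig config;               // predictor configuration
std::vector<paddle::PaddleTensor> outputs;   // input/output tensor container
// Experimental interfaces, when present, live under paddle::contrib instead.
```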

Write some code

Read paddle_inference_api.h for more information.
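
As a starting point, below is a minimal sketch that runs one forward pass with a predictor created as in the configuration snippet above. The tensor name "x" and the 1x3x224x224 shape are placeholders; use the names and shapes your own model was exported with.

```c++
#include <vector>
#include "paddle_inference_api.h"

// Feed one input through the predictor and return the first output.
// The name "x" and the shape below are placeholders for your model's inputs.
std::vector<float> RunOnce(paddle::PaddlePredictor* predictor,
                           std::vector<float>* input_data) {
  paddle::PaddleTensor input;
  input.name = "x";
  input.shape = {1, 3, 224, 224};
  input.dtype = paddle::PaddleDType::FLOAT32;
  // PaddleBuf wraps the caller's buffer without copying it.
  input.data = paddle::PaddleBuf(input_data->data(),
                                 input_data->size() * sizeof(float));

  std::vector<paddle::PaddleTensor> outputs;
  predictor->Run({input}, &outputs);

  // Copy the first output tensor back into a std::vector<float>.
  const auto& out = outputs.front();
  const float* out_ptr = static_cast<const float*>(out.data.data());
  return std::vector<float>(out_ptr,
                            out_ptr + out.data.length() / sizeof(float));
}
```

With the unique_ptr returned by CreatePaddlePredictor, this would be called as RunOnce(predictor.get(), &my_input). The demo_ci directory alongside this README contains fuller, buildable examples.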