TensorRT plugin C++ API (NvInferPlugin.h): https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/_nv_infer_plugin_8h.html
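The header above exposes TensorRT's built-in plugin layers (SSD PriorBox, DetectionOutput, etc.), which the ports below rely on. A minimal sketch of registering them before deserializing an engine, assuming TensorRT 4.x/5.x-era APIs; the `Logger` class name is just a placeholder:

```cpp
// Minimal sketch: register TensorRT's built-in plugins so a serialized
// engine that uses them (e.g. an SSD engine) can be deserialized.
// Assumes TensorRT 4.x/5.x-style headers; `Logger` is a hypothetical name.
#include <NvInfer.h>
#include <NvInferPlugin.h>
#include <iostream>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
};

int main() {
    Logger logger;
    // Registers all built-in plugin creators under the default namespace.
    if (!initLibNvInferPlugins(&logger, "")) {
        std::cerr << "failed to initialize TensorRT plugins" << std::endl;
        return 1;
    }
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    // runtime->deserializeCudaEngine(blob, size, nullptr) would follow here.
    runtime->destroy();
    return 0;
}
```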
- SSD-VGG16
- TensorRT 3.0 : https://github.com/chenzhi1992/TensorRT-SSD
- SSD-MobileNet
- TensorRT 4.0 (4 classes) : https://github.com/Ghustwb/MobileNet-SSD-TensorRT
- SSDLite-MobileNet V2
- TensorFlow-to-Caffe conversion : https://github.com/chuanqi305/MobileNetv2-SSDLite
- YOLOv2
- YOLOv3
- TinyYOLO
- OpenPose
- Openpose plus : https://github.com/tensorlayer/openpose-plus
- Openpose-trt-opt : https://github.com/haanjack/openpose-trt-optimize
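Most of the SSD links above take a Caffe model and rebuild it as a TensorRT engine. A minimal sketch of that flow, assuming TensorRT 3.x/4.x-era builder and Caffe parser APIs; the file names and the `detection_out` blob name are placeholders, and the SSD-specific plugin layers are only noted in a comment:

```cpp
// Sketch of the common Caffe-to-TensorRT path used by the SSD ports above.
// Assumes TensorRT 3.x/4.x-era APIs; file names and the "detection_out"
// output blob name are assumptions, not taken from the linked repos.
#include <NvInfer.h>
#include <NvCaffeParser.h>

nvinfer1::ICudaEngine* buildSsdEngine(nvinfer1::ILogger& logger) {
    nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(logger);
    nvinfer1::INetworkDefinition* network = builder->createNetwork();

    auto* parser = nvcaffeparser1::createCaffeParser();
    // Note: SSD-specific layers (PriorBox, DetectionOutput) normally need a
    // plugin factory registered via parser->setPluginFactory(); omitted here.
    const auto* blobs = parser->parse("deploy.prototxt", "model.caffemodel",
                                      *network, nvinfer1::DataType::kFLOAT);

    // The Caffe SSD head usually exposes its result as "detection_out".
    network->markOutput(*blobs->find("detection_out"));

    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(1 << 28);  // 256 MiB of scratch space
    nvinfer1::ICudaEngine* engine = builder->buildCudaEngine(*network);

    network->destroy();
    parser->destroy();
    builder->destroy();
    return engine;
}
```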
- slide: https://www.slideshare.net/deview/232-dl-inference-optimization-using-tensor-rt-1-119162975
- slide: https://imsc.uni-graz.at/haasegu/Lectures/HPC-II/SS15/Siegmann/nvvp_memcheck.pdf
- nvidia slide: https://www.olcf.ornl.gov/wp-content/uploads/2018/02/SummitDev_NVIDIA-Profilers.pdf
- tutorial: https://docs.nvidia.com/cuda/profiler-users-guide/index.html#nvprof-command-line-options-print
- nvidia blog: https://devblogs.nvidia.com/cuda-pro-tip-nvprof-your-handy-universal-gpu-profiler/
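The profiler guide and blog post above describe scoping nvprof's capture to a region of interest with the CUDA profiler API. A minimal sketch of that pattern; the workload here is stand-in host/device copies rather than a real inference loop:

```cpp
// Sketch of narrowing nvprof's capture window. Build against the CUDA
// runtime and run with:
//   nvprof --profile-from-start off ./app
// so only the region between cudaProfilerStart/Stop is recorded.
#include <cuda_runtime.h>
#include <cuda_profiler_api.h>
#include <vector>

int main() {
    const size_t n = 1 << 20;
    std::vector<float> host(n, 1.0f);
    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));

    // Warm-up transfer, intentionally outside the profiled region.
    cudaMemcpy(dev, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    cudaProfilerStart();                       // begin capture
    for (int i = 0; i < 10; ++i) {
        // Stand-in workload; a real app would enqueue TensorRT inference here.
        cudaMemcpy(dev, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(host.data(), dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    }
    cudaProfilerStop();                        // end capture

    cudaFree(dev);
    return 0;
}
```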