
Commit

fix the link to VPU extensibility article (openvinotoolkit#11956)
xu-yuan1 authored Jun 23, 2022
1 parent 5008ee8 commit 079d1d6
Showing 2 changed files with 2 additions and 2 deletions.
docs/Extensibility_UG/Intro.md (1 addition, 1 deletion)
@@ -9,7 +9,7 @@
openvino_docs_Extensibility_UG_add_openvino_ops
openvino_docs_Extensibility_UG_Frontend_Extensions
openvino_docs_Extensibility_UG_GPU
- openvino_docs_IE_DG_Extensibility_DG_VPU_Kernel
+ openvino_docs_Extensibility_UG_VPU_Kernel
openvino_docs_MO_DG_prepare_model_customize_model_optimizer_Customize_Model_Optimizer

@endsphinxdirective
docs/Extensibility_UG/VPU_Extensibility.md (1 addition, 1 deletion)
@@ -1,4 +1,4 @@
- # How to Implement Custom Layers for VPU (Intel® Neural Compute Stick 2) {#openvino_docs_IE_DG_Extensibility_DG_VPU_Kernel}
+ # How to Implement Custom Layers for VPU (Intel® Neural Compute Stick 2) {#openvino_docs_Extensibility_UG_VPU_Kernel}

To enable operations not supported by OpenVINO™ out of the box, you need a custom extension for Model Optimizer, a custom nGraph operation set, and a custom kernel for the device you will target. This page describes custom kernel support for one VPU, the Intel® Neural Compute Stick 2 device, which uses the MYRIAD device plugin.
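
The paragraph above names three pieces: a Model Optimizer extension, a custom nGraph operation set, and a device kernel. Below is a minimal sketch of the operation-set piece only, assuming the OpenVINO 2022.1 C++ API (ov::op::Op, ov::Core::add_extension); the CustomRelu class and the model file name are hypothetical, and the MYRIAD OpenCL kernel plus the Model Optimizer extension that the re-linked article covers are not shown.

```cpp
#include <memory>
#include <openvino/op/op.hpp>
#include <openvino/openvino.hpp>

// Hypothetical custom operation; its execution on MYRIAD would come from the
// custom OpenCL kernel described in the re-linked article, which is not shown here.
class CustomRelu : public ov::op::Op {
public:
    OPENVINO_OP("CustomRelu");

    CustomRelu() = default;
    explicit CustomRelu(const ov::Output<ov::Node>& arg) : Op({arg}) {
        constructor_validate_and_infer_types();
    }

    // The output keeps the input's element type and shape.
    void validate_and_infer_types() override {
        set_output_type(0, get_input_element_type(0), get_input_partial_shape(0));
    }

    std::shared_ptr<ov::Node> clone_with_new_inputs(const ov::OutputVector& new_args) const override {
        return std::make_shared<CustomRelu>(new_args.at(0));
    }

    bool visit_attributes(ov::AttributeVisitor&) override { return true; }
};

int main() {
    ov::Core core;
    // Register the operation so models that contain it can be read;
    // the kernel binding for the MYRIAD plugin is configured separately.
    core.add_extension<CustomRelu>();
    auto model = core.read_model("model_with_custom_relu.xml");  // hypothetical file
    return 0;
}
```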

