MegCC


What is MegCC

MegCC is a deep learning model compiler with the following features:

  • Extremely Light Binary Size: compiling MobileNet V1 alone yields a runtime binary of only 81KB after stripping
  • High Performance: every operator is carefully hand-optimized for Arm and runs faster than MegEngine
  • Portable: very easy to run on Android, TEE, and bare metal
  • Low Memory Usage and Fast Boot: a global static memory planning algorithm and static binding at compile time

The MegCC compiler is built on the MLIR infrastructure, and almost all of the code it generates is hand-optimized. MegCC supports compiling models with both static and dynamic shapes, and, to achieve the minimum binary size, it can also generate the necessary CV operators in C.

When compiling a model, MegCC generates the kernels used by the model together with any CV kernels the user requires; at the same time it performs static memory planning and model optimization, and dumps the result into the final tinynn model.

The MegCC runtime loads the tinynn model and calls the generated kernels to run inference. The runtime binary needed to run MobileNet V1 (fp32) inference is only 81KB.
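
To make that flow concrete, below is a minimal inference sketch in C. It assumes the runtime exposes the MegEngine-Lite-style C API referred to in the MegCC documentation (LITE_make_default_network, LITE_load_model_from_path, LITE_get_io_tensor, LITE_forward, ...); the header path, the I/O tensor names ("data", "output"), and the model file name are placeholders, so check the headers and the compiled model shipped with your MegCC release.

```c
/* Minimal sketch: load a compiled tinynn model and run one inference.
 * Assumption: the MegCC runtime provides a MegEngine-Lite-style C API;
 * exact header names and tensor names depend on your build and model. */
#include <stdio.h>
#include "lite-c/network_c.h" /* assumed header name */

int main(void) {
    LiteNetwork net;
    /* Create a network and load the tinynn model produced by the compiler. */
    LITE_make_default_network(&net);
    LITE_load_model_from_path(net, "mobilenetv1.tiny");

    /* Get the input tensor (name depends on the compiled model) and fill it. */
    LiteTensor input;
    LITE_get_io_tensor(net, "data", LITE_INPUT, &input);
    void* in_ptr = NULL;
    LITE_get_tensor_memory(input, &in_ptr);
    /* ... copy preprocessed input data into in_ptr ... */

    /* Run the generated kernels and wait for completion. */
    LITE_forward(net);
    LITE_wait(net);

    /* Read back the output tensor. */
    LiteTensor output;
    LITE_get_io_tensor(net, "output", LITE_OUTPUT, &output);
    void* out_ptr = NULL;
    LITE_get_tensor_memory(output, &out_ptr);
    printf("first score: %f\n", ((float*)out_ptr)[0]);

    LITE_destroy_network(net);
    return 0;
}
```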

MegCC currently supports the Arm64/ArmV7/X86/BareMetal backends; for details on the supported operators, see the operator lists.

MegCC Structure

MegCC architecture diagram (megcc_struct)

Documentation

Get MegCC

How to use MegCC

  • Read 如何使用 (the Chinese guide) to learn how to compile your models and deploy them; an English how-to doc is also available.

  • The MegCC runtime is easy to run on a standard OS, and even without an OS (see the example).

Thanks a lot, please enjoy it