IREE (Intermediate Representation Execution Environment, pronounced as "eerie") is an MLIR-based end-to-end compiler that lowers Machine Learning (ML) models to a unified IR optimized for real-time inference on mobile/edge devices across heterogeneous hardware accelerators. IREE also provides flexible deployment solutions for its compiled ML models.
See our website for project details, user guides, and instructions on building from source.
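To give a flavor of that compile-and-run flow, the sketch below compiles a small MLIR function with IREE's Python bindings and invokes it through the IREE runtime on the CPU. Treat it as a minimal sketch under stated assumptions rather than canonical usage: the package, backend, and driver names shown here (`iree-compiler`, `iree-runtime`, `llvm-cpu`, `local-task`) and some API details vary between releases, so the user guides on the website are the authoritative reference.

```python
# A minimal sketch, not canonical IREE usage: compile a small MLIR module with
# the IREE Python compiler API and invoke it through the IREE runtime.
# Assumes the Python packages commonly published as `iree-compiler` and
# `iree-runtime`; the backend ("llvm-cpu") and driver ("local-task") names,
# and some API details, may differ between IREE releases.
import numpy as np
import iree.compiler as ireec
import iree.runtime as ireert

MLIR_MODULE = """
func.func @simple_mul(%a: tensor<4xf32>, %b: tensor<4xf32>) -> tensor<4xf32> {
  %0 = arith.mulf %a, %b : tensor<4xf32>
  return %0 : tensor<4xf32>
}
"""

# Lower the MLIR module to an IREE VM FlatBuffer targeting the CPU backend.
vmfb = ireec.compile_str(MLIR_MODULE, target_backends=["llvm-cpu"])

# Load the compiled module into a runtime context and call the exported function.
config = ireert.Config("local-task")
ctx = ireert.SystemContext(config=config)
vm_module = ireert.VmModule.copy_buffer(ctx.instance, vmfb)
ctx.add_vm_module(vm_module)

a = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)
b = np.array([5.0, 6.0, 7.0, 8.0], dtype=np.float32)
result = ctx.modules.module.simple_mul(a, b)
print(np.asarray(result))  # expected: [ 5. 12. 21. 32.]
```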
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome feedback through any of the communication channels listed below:
- GitHub issues: Feature requests, bugs, and other work tracking
- IREE Discord server: Daily development discussions with the core team and collaborators
- iree-discuss email list: Announcements, general and low-priority discussion
- MLIR topic within LLVM Discourse: IREE is enabled by and heavily relies on MLIR, and is sometimes referred to in MLIR discussions. Useful if you are also interested in the evolution of MLIR.
Past presentations and talks:
- 2020-08-20: IREE CodeGen: MLIR Open Design Meeting Presentation (recording and slides)
- 2020-03-18: Interactive HAL IR Walkthrough (recording)
- 2020-01-31: End-to-end MLIR Workflow in IREE: MLIR Open Design Meeting Presentation (recording and slides)
IREE is licensed under the terms of the Apache 2.0 License. See LICENSE for more information.