Angel

A Flexible and Powerful Parameter Server for large-scale machine learning


Angel is a high-performance distributed machine learning platform built on the Parameter Server philosophy. Tuned for performance on big data at Tencent, it offers broad applicability and stability, and shows growing advantages when handling higher-dimensional models. Angel is jointly developed by Tencent and Peking University, balancing the high availability demanded by industry with innovation from academia. It is written in Java and Scala and runs on Yarn and Kubernetes. Through the PS Service abstraction, it provides two modules, Spark on Angel and PyTorch on Angel, which combine the power of Spark/PyTorch with the Parameter Server for distributed training. Support for graph computing and deep learning frameworks is under development and will be released in the future.
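The Parameter Server pattern underlying Angel can be sketched in a few lines: workers pull the current model from a server, compute gradients locally on their data shard, and push updates back, which the server applies. The sketch below is a minimal single-process illustration of that pattern, not Angel's actual API; the names (ToyParameterServer, create, pull, push) and the fixed-learning-rate SGD update are invented for this example.

```java
import java.util.HashMap;
import java.util.Map;

// Toy in-memory parameter server (illustrative only, not Angel's API).
// Workers pull a copy of the model, compute gradients, and push them back;
// the server applies each gradient with a fixed learning rate.
class ToyParameterServer {
    private final Map<String, double[]> params = new HashMap<>();
    private final double learningRate;

    ToyParameterServer(double learningRate) {
        this.learningRate = learningRate;
    }

    // Register a model vector on the server, initialized to zeros.
    void create(String name, int dim) {
        params.put(name, new double[dim]);
    }

    // Worker-side: fetch a snapshot of the current parameters.
    double[] pull(String name) {
        return params.get(name).clone();
    }

    // Worker-side: send a gradient; the server applies an SGD step.
    // synchronized so concurrent workers do not interleave updates.
    synchronized void push(String name, double[] gradient) {
        double[] w = params.get(name);
        for (int i = 0; i < w.length; i++) {
            w[i] -= learningRate * gradient[i];
        }
    }
}

public class PsDemo {
    public static void main(String[] args) {
        ToyParameterServer ps = new ToyParameterServer(0.1);
        ps.create("w", 3);
        // Two "workers" push gradients; updates accumulate on the server.
        ps.push("w", new double[]{1.0, 2.0, 3.0});
        ps.push("w", new double[]{1.0, 0.0, -1.0});
        double[] w = ps.pull("w");
        System.out.printf("w = [%.2f, %.2f, %.2f]%n", w[0], w[1], w[2]);
        // prints: w = [-0.20, -0.20, -0.20]
    }
}
```

In a real deployment the server state is partitioned across many PS nodes and updates arrive asynchronously over the network, which is exactly the machinery that Spark on Angel and PyTorch on Angel expose through the PS Service abstraction.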

We welcome everyone interested in machine learning to contribute code, open issues, or submit pull requests. Please refer to the Angel Contribution Guide for more details.

Quick Start

Deployment

Algorithm

Development

Community

FAQ

Papers

  1. Lele Yu, Bin Cui, Ce Zhang, Yingxia Shao. LDA*: A Robust and Large-scale Topic Modeling System. VLDB, 2017.
  2. Jiawei Jiang, Bin Cui, Ce Zhang, Lele Yu. Heterogeneity-aware Distributed Parameter Servers. SIGMOD, 2017.
  3. Jie Jiang, Lele Yu, Jiawei Jiang, Yuhong Liu and Bin Cui. Angel: a new large-scale machine learning system. National Science Review (NSR), 2017.
  4. Jie Jiang, Jiawei Jiang, Bin Cui and Ce Zhang. TencentBoost: A Gradient Boosting Tree System with Parameter Server. ICDE, 2017.
  5. Jiawei Jiang, Bin Cui, Ce Zhang and Fangcheng Fu. DimBoost: Boosting Gradient Boosting Decision Tree to Higher Dimensions. SIGMOD, 2018.

