DMTK

Distributed Machine Learning Toolkit: https://www.dmtk.io. Please open issues in the corresponding project below. For technical support, email [email protected].

DMTK includes the following projects:

  • DMTK framework (Multiverso): The parameter server framework for distributed machine learning (see the sketch after this list).
  • LightLDA: Scalable, fast and lightweight system for large-scale topic modeling.
  • LightGBM: A fast, distributed, high-performance gradient boosting framework (GBDT, GBRT, GBM or MART) based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
  • Distributed word embedding: Distributed algorithm for word embedding implemented on Multiverso.
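
To make the parameter server idea behind Multiverso concrete, here is a minimal single-process sketch in plain Python. It is illustrative only and does not use the Multiverso API: the server class, method names, and the toy objective are all invented for this example.

```python
# Minimal parameter-server sketch (illustrative only; not the Multiverso API).
# A server holds the model; workers pull parameters, compute an update
# locally, and push a delta back.
import numpy as np

class ParameterServer:
    def __init__(self, size):
        self.params = np.zeros(size)     # the shared model

    def pull(self):
        return self.params.copy()        # worker gets a (possibly stale) copy

    def push(self, delta):
        self.params += delta             # server applies the worker's update

def worker_step(server, local_grad, lr=0.1):
    w = server.pull()                    # read current parameters
    server.push(-lr * local_grad(w))     # send back the update

# Toy usage: two "workers" minimizing ||w - 1||^2 over a 3-dim model.
ps = ParameterServer(3)
grad = lambda w: 2 * (w - 1.0)
for _ in range(50):
    worker_step(ps, grad)
    worker_step(ps, grad)
print(ps.params)                         # approaches [1, 1, 1]
```

In a real deployment the server runs as separate processes, workers run on different machines, and a pull may return stale parameters; managing that asynchrony at scale is what a framework like Multiverso provides.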

Updates

2017-02-04

  • A tutorial on the latest updates in Distributed Machine Learning was presented at AAAI 2017. You can download the slides here.

2016-11-21

  • Multiverso is now officially used in Microsoft CNTK to power its ASGD parallel training; a toy sketch of the idea follows.
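
The sketch below shows the asynchrony in ASGD: several workers read the shared weights, compute gradients on their own data shards, and apply updates without waiting for one another, so a gradient may have been computed against slightly stale weights. This is plain Python with threads for illustration, not CNTK or Multiverso code.

```python
# Toy asynchronous SGD (ASGD) on a linear regression problem.
# All names here are illustrative; no CNTK/Multiverso APIs are used.
import threading
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])
X = rng.normal(size=(1000, 2))
y = X @ true_w + 0.01 * rng.normal(size=1000)

w = np.zeros(2)      # shared parameters, updated without a barrier
lr = 0.05

def worker(shard):
    global w
    Xs, ys = X[shard], y[shard]
    for _ in range(200):
        grad = 2 * Xs.T @ (Xs @ w - ys) / len(ys)   # w may be stale here
        w = w - lr * grad                           # asynchronous update

shards = np.array_split(np.arange(1000), 4)
threads = [threading.Thread(target=worker, args=(s,)) for s in shards]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("estimated w:", w)   # close to [2, -3] despite stale reads
```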

2016-10-17

  • LightGBM has been released: a fast, distributed, high-performance gradient boosting (GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
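
Below is a minimal training example using LightGBM's Python package; the parameter values are illustrative defaults rather than tuned recommendations.

```python
# Minimal LightGBM binary-classification example on synthetic data.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

train_set = lgb.Dataset(X, label=y)
params = {
    "objective": "binary",   # binary classification
    "num_leaves": 31,        # controls tree complexity
    "learning_rate": 0.1,
}
booster = lgb.train(params, train_set, num_boost_round=50)
preds = booster.predict(X)   # predicted probabilities
print("train accuracy:", ((preds > 0.5) == y).mean())
```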

2016-09-12

  • A talk on the latest updates of DMTK was presented at GTC China. We also described the latest research work from our team, including LightRNN (to appear in NIPS 2016) and DC-ASGD.

2016-07-05

  • Multiverso has been upgraded with a new API; see the Overview.
  • Deep learning framework (Torch/Theano) support has been added.
  • Python/Lua bindings are now supported, so you can use Multiverso from Python or Lua (sketched below).
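
A hedged sketch of the Python binding follows. The function and handler names (`init`, `ArrayTableHandler`, `barrier`, `shutdown`) are based on the Multiverso binding's documentation as best recalled and may differ from the current repository; treat them as assumptions and consult the multiverso repo for the actual API.

```python
# Hedged sketch of the Multiverso Python binding; names are assumptions
# recalled from the binding's docs and may not match the current API.
import multiverso as mv

mv.init()                          # start the Multiverso environment
tbh = mv.ArrayTableHandler(10)     # a shared array of 10 values
mv.barrier()                       # wait until all workers are ready

tbh.add([1.0] * 10)                # push an additive update to the server
print(tbh.get())                   # pull the current shared values

mv.shutdown()                      # tear down the environment
```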

Microsoft Open Source Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.