Timed dataflow: Reducing communication overhead for distributed machine learning systems
Many distributed machine learning (ML) systems exhibit high communication overhead when dealing with big data sets. Our investigations showed that popular distributed ML systems could spend about an order of magnitude more time on network communication than computation to train ML models containing...
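The abstract's claim that communication can cost roughly an order of magnitude more time than computation is easy to sanity-check with a back-of-the-envelope estimate. The sketch below is illustrative only and not from the paper; the model size, bandwidth, and throughput figures are all assumed values chosen for the example.

```python
# Illustrative estimate (not from the paper) of why network communication
# can dominate computation in data-parallel training with full-model sync.
# All numeric inputs below are assumptions for the sketch.

def sync_times(model_params: float, bandwidth_gbps: float,
               flops_per_param: float, device_gflops: float):
    """Return (communication_s, computation_s) for one iteration in which
    each worker pushes gradients and pulls the updated model."""
    bytes_per_param = 4  # fp32 parameters
    # push gradients + pull parameters = two full-model transfers
    comm_s = 2 * model_params * bytes_per_param * 8 / (bandwidth_gbps * 1e9)
    comp_s = model_params * flops_per_param / (device_gflops * 1e9)
    return comm_s, comp_s

# Assumed: 100M-parameter model, 10 Gbps network, ~500 FLOPs of work per
# parameter per iteration, one device sustaining 1 TFLOPS.
comm, comp = sync_times(100e6, 10.0, 500, 1000)
print(f"comm {comm:.2f}s vs comp {comp:.2f}s, ratio {comm / comp:.0f}x")
# With these assumed numbers the ratio comes out around 13x, i.e. roughly
# the order-of-magnitude gap the abstract describes.
```

Under these assumptions communication takes 0.64 s per iteration against 0.05 s of computation, which is why reducing synchronization traffic (the paper's focus) pays off so directly.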
Main Authors: SUN, Peng; WEN, Yonggang; TA, Nguyen Binh Duong; YAN, Shengen
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2016
Online Access: https://ink.library.smu.edu.sg/sis_research/4834
Institution: Singapore Management University
Similar Items
- Towards distributed machine learning in shared clusters: A dynamically-partitioned approach
  by: SUN, Peng, et al. Published: (2017)
- GraphMP: an efficient semi-external-memory big graph processing system on a single machine
  by: SUN, Peng, et al. Published: (2017)
- DATAFLOW MODELING STRATEGIES: DATAFLOW MODELLING FOR PANELING OF COMPLEX SURFACE
  by: ZHANG QIAOJIE. Published: (2010)
- TOWARD GENERAL-PURPOSE DYNAMIC DATAFLOW PROCESSING
  by: LEE JINHO. Published: (2024)
- Interactivity-constrained server provisioning in large-scale distributed virtual environments
  by: TA, Nguyen Binh Duong, et al. Published: (2011)