DEVELOPMENT OF DEGRADED-NODE DETECTION DUE TO BANDWIDTH REDUCTION DURING THE STARTUP PHASE OF HADOOP MAPREDUCE
Hadoop is a framework for storing and processing large amounts of data in a distributed manner. To tolerate node failures, Hadoop provides a feature called speculative execution. However, a study entitled Limplock states that speculative execution cannot detect nodes that experience interf...
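For context on the feature discussed in the abstract, speculative execution in Hadoop MapReduce is toggled through job configuration. The fragment below is a minimal mapred-site.xml sketch using the Hadoop 2.x property names; it simply enables speculative re-execution of slow map and reduce tasks and is illustrative only, not taken from the thesis itself.

```xml
<configuration>
  <!-- Launch a backup attempt for map tasks that run noticeably slower than their peers -->
  <property>
    <name>mapreduce.map.speculative</name>
    <value>true</value>
  </property>
  <!-- Same backup-attempt behavior for reduce tasks -->
  <property>
    <name>mapreduce.reduce.speculative</name>
    <value>true</value>
  </property>
</configuration>
```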
Main Author: | Raudi Avinanto, Praditya (NIM: 13514087) |
---|---|
Format: | Final Project |
Language: | Indonesian |
Online Access: | https://digilib.itb.ac.id/gdl/view/29958 |
Institution: | Institut Teknologi Bandung |
Similar Items
- Hadoop debug optimization
  by: Siow, Llukelly Jian Yang.
  Published: (2012)
- Hadoop on data analytics
  by: Gee, Denny Jee King.
  Published: (2012)
- Evaluation of Hadoop with Hottub
  by: Karim, Kristianto (NIM: 13514075)
- A New Node Centroid Algorithm for Bandwidth Minimization
  by: Rodrigues, Brian, et al.
  Published: (2003)
- Twitter data processing using hadoop
  by: Khuc, Anh Tuan.
  Published: (2011)