Tuesday, July 21, 2015

HADOOP ERRORS

1. There are 0 datanode(s) running and no node(s) are excluded in this operation


This usually means no DataNode has registered with the NameNode, most often because of a clusterID mismatch left behind by an earlier NameNode format.

solution> (a concrete command sketch follows below)
     stop-all.sh
     rm -rf the DataNode data directory (the path set for dfs.datanode.data.dir in hdfs-site.xml)
     reformat HDFS: hdfs namenode -format  (warning: this erases everything stored in HDFS)
     start-all.sh
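
A minimal shell sketch of the same steps, assuming a Hadoop 2.x layout under $HADOOP_HOME and an example DataNode directory of /usr/local/hadoop/hdfs/datanode (check dfs.datanode.data.dir in your own hdfs-site.xml before deleting anything):

     # stop all daemons
     stop-all.sh

     # show the DataNode data directory configured in hdfs-site.xml
     grep -A1 'dfs.datanode.data.dir' $HADOOP_HOME/etc/hadoop/hdfs-site.xml

     # remove the stale DataNode storage (assumed example path; use the one printed above)
     rm -rf /usr/local/hadoop/hdfs/datanode/*

     # reformat the NameNode -- WARNING: this erases all data currently stored in HDFS
     hdfs namenode -format

     # restart the daemons
     start-all.sh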


2. OutOfMemoryError: GC overhead limit exceeded


solution> (a usage sketch follows below)
     open hadoop-env.sh and edit the line below, making -Xmx as large as the machine's RAM allows:
     export HADOOP_CLIENT_OPTS="-Xmx100000m"
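
A minimal sketch of applying this change, assuming hadoop-env.sh lives under $HADOOP_HOME/etc/hadoop and using an example heap of 4g (pick a value that fits your machine's RAM):

     # append the client heap setting to hadoop-env.sh (example value: 4 GB)
     echo 'export HADOOP_CLIENT_OPTS="-Xmx4g"' >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh

     # the setting is picked up the next time a hadoop client command starts a JVM
     hadoop fs -ls /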
 
