
Summary of Common Hadoop Problems

1. Running the hdfs dfs -copyFromLocal command fails with an error

19/01/02 11:01:32 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1702)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1432)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
19/01/02 11:01:32 INFO hdfs.DFSClient: Abandoning BP-719105237-127.0.0.1-1525595716995:blk_1073741854_1030
19/01/02 11:01:32 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[127.0.0.1:50010,DS-ababb49a-42c6-452b-9992-e0dc201a08b5,DISK]
19/01/02 11:01:32 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tempdata/README.md._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1).  There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1628)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3121)

Solution: stop the firewall:

systemctl stop firewalld.service

To keep the firewall disabled across reboots:

systemctl disable firewalld.service

You can also permanently disable SELinux: edit the /etc/selinux/config file with vim /etc/selinux/config
and set "SELINUX=disabled". After that, re-running the upload succeeds.
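
If you would rather not turn the firewall off entirely, opening just the HDFS ports is an alternative. The port numbers below are assumptions based on the Hadoop 2.x defaults (50010 for the DataNode is visible in the log above); check your hdfs-site.xml and core-site.xml before copying them.

firewall-cmd --permanent --add-port=8020/tcp     # NameNode RPC (assumed default)
firewall-cmd --permanent --add-port=50010/tcp    # DataNode data transfer (port seen in the log above)
firewall-cmd --permanent --add-port=50070/tcp    # NameNode web UI (Hadoop 2.x default)
firewall-cmd --reload

After either fix, you can verify that a live DataNode is registered and then retry the upload:

hdfs dfsadmin -report            # should report 1 live datanode
hdfs dfs -copyFromLocal README.md /tempdata/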

2. Handling the "Unable to load native-hadoop library for your platform" warning

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Add the following lines at the top of hadoop-env.sh in $HADOOP_HOME/etc/hadoop/ (that is, Hadoop's configuration directory):

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native/"

Here /usr/local/hadoop is my Hadoop installation directory; set it to match your own installation path.
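
To confirm the native libraries are actually picked up after the change, Hadoop ships a checknative command; the exact list of libraries it prints varies by build, what matters is that the hadoop entry reports true.

hadoop checknative -a    # reports whether each native library (hadoop, zlib, snappy, ...) was loaded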