First, to describe my setup: I have one physical machine acting as the master node (referred to as master below), plus two additional servers, referred to as node1 and node2. Docker containers slave1-slave10 run on node1, and slave11-slave20 run on node2.
The master can ssh into each container. On master and both nodes, iptables is configured with NAT, and the FORWARD chain policy is set to ACCEPT (so that the docker containers can communicate across hosts).
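To be concrete, the forwarding setup on each host is roughly the following (written from memory as a sketch; the container subnet is inferred from the log below, and the outgoing interface name eth0 is an assumption):

# allow forwarded traffic between the containers and the LAN
iptables -P FORWARD ACCEPT
# NAT the container subnet out through the host NIC (eth0 assumed)
iptables -t nat -A POSTROUTING -s 192.168.123.0/24 -o eth0 -j MASQUERADE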
After starting 10 of the nodes, uploading files to hdfs fails with this bug:
18/07/31 15:42:20 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.123.1:8032
18/07/31 15:42:21 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.123.1:8032
18/07/31 15:42:22 INFO mapred.FileInputFormat: Total input paths to process : 1
18/07/31 15:42:25 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.24:50010
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
18/07/31 15:42:25 INFO hdfs.DFSClient: Abandoning BP-557839422-192.168.123.1-1533022646989:blk_1073741831_1007
18/07/31 15:42:25 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.123.24:50010,DS-7062ca95-5971-4c80-87f7-5ea1a2f9f448,DISK]
18/07/31 15:42:30 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.19:50010
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
18/07/31 15:42:30 INFO hdfs.DFSClient: Abandoning BP-557839422-192.168.123.1-1533022646989:blk_1073741832_1008
18/07/31 15:42:30 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.123.19:50010,DS-9f25c91c-4b25-4dc3-9581-581ba2d4d79c,DISK]
18/07/31 15:42:41 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.22:50010
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
18/07/31 15:42:41 INFO hdfs.DFSClient: Abandoning BP-557839422-192.168.123.1-1533022646989:blk_1073741833_1009
18/07/31 15:42:41 INFO hdfs.DFSClient: Excluding datanode DatanodeInfoWithStorage[192.168.123.22:50010,DS-45f819cc-a3b5-44a9-8a98-75f9442d5dd4,DISK]
18/07/31 15:42:45 INFO hdfs.DFSClient: Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 192.168.123.17:50010
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:142)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1482)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
at org.apache.hadoop.hdfs.DFSOutp
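As far as I understand, "firstBadLink" means that while setting up the write pipeline, the connection to the listed datanode's data-transfer port (50010) failed, so the client abandons the block and excludes that datanode. A quick way to test that connectivity directly is something like the following (container names follow my naming above; that 192.168.123.24 sits on node2 is my assumption, and nc must be installed in the containers):

# probe a suspected node2 datanode from a container on node1
docker exec slave1 nc -zv -w 3 192.168.123.24 50010
# and the reverse direction, node2 container to a node1 datanode
docker exec slave11 nc -zv -w 3 192.168.123.17 50010

If these only fail when the two containers are on different physical hosts, the problem would be cross-host container connectivity rather than hdfs itself.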
When I searched online, most answers say the cause is a firewall that has not been turned off. But on my master and both nodes, iptables and ufw are both disabled, and the docker containers run no firewall at all; I even went as far as installing a firewall inside the containers just so I could turn it off.
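For completeness, this is roughly what I ran on master, node1 and node2 to rule the firewall out:

ufw disable                 # ufw off
iptables -P INPUT ACCEPT    # open default policies
iptables -P FORWARD ACCEPT
iptables -L -n              # verify: no DROP/REJECT rules remain

One thing I am not sure about is whether the chains that docker itself creates (DOCKER, DOCKER-ISOLATION-*) still matter here even with everything above set to ACCEPT.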
Thank you ~