1. In my Hadoop cluster, the datanodes use private network IPs (172.16.1.142 to 172.16.1.148), and only the 142 node has a public IP.
2. When reading data from HDFS remotely through Java, the datanode addresses returned to the Java program are the intranet IPs, so the Java client cannot connect. What should I do?
3. I have already set configuration.set("dfs.client.use.datanode.hostname", "true"), but the error persists. The error message is as follows:
2018-08-17 14:22:38,670 INFO [org.apache.hadoop.hdfs.DFSClient] - Exception in createBlockOutputStream
java.io.IOException: Got error, status message , ack with firstBadLink as 172.16.1.148:50010
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:140)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1363)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1266)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
2018-08-17 14:22:38,673 INFO [org.apache.hadoop.hdfs.DFSClient] - Abandoning BP-1760557445-172.16.1.142-1534486349045:blk_1073741829_1005
2018-08-17 14:22:38,701 INFO [org.apache.hadoop.hdfs.DFSClient] - Excluding datanode DatanodeInfoWithStorage[172.16.1.148:50010,DS-dd301dfb-ae4e-4adc-8152-e11bbf0880e0,DISK]
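For reference, setting dfs.client.use.datanode.hostname on the client only helps if the datanodes also register with the namenode by hostname, and those hostnames resolve to addresses the client can reach. A minimal sketch of the server-side configuration (the comments are my assumptions about this cluster, not taken from the question):

```xml
<!-- hdfs-site.xml on the namenode and every datanode; restart HDFS after editing -->
<property>
  <!-- Datanodes register with the namenode by hostname instead of internal IP -->
  <name>dfs.datanode.use.datanode.hostname</name>
  <value>true</value>
</property>
<property>
  <!-- Clients connect to datanodes by hostname -->
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```

On the client machine, keep configuration.set("dfs.client.use.datanode.hostname", "true") and map each datanode hostname to a reachable address (for example via /etc/hosts entries pointing at the public IP). Also note that the "firstBadLink as 172.16.1.148:50010" error indicates the client cannot reach the data-transfer port 50010 on that datanode; hostname resolution alone will not fix a blocked or unrouted port.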