Hadoop: ERROR datanode.DataNode: java.io.IOException: Incompatible namespaceIDs in /app/hadoop/tmp/dfs/data: namenode namespaceID = 1580775695; datanode namespaceID = 1494801914

Can anyone explain what causes this error and how to fix it?
Aug 5, 2019 in Big Data Hadoop by Sudarshan

1 answer to this question.


Hi,

The NameNode generates a new namespaceID every time you format HDFS, but the DataNode keeps the old one in its VERSION file, so the two no longer match. Open the DataNode's VERSION file:

bin# vi /app/hadoop/tmp/dfs/data/current/VERSION

Manually change namespaceID=1494801914 to the NameNode's current value, 1580775695, and save the file:

namespaceID=1580775695
storageID=DS-469635027-127.0.0.1-50010-1376974011263
cTime=1377582559943
storageType=DATA_NODE
layoutVersion=-32
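The edit above can also be scripted. This is a minimal sketch, assuming a single-node setup with the data directory from the error message; the `fix_namespace_id` helper name is mine, not part of Hadoop. Stop the DataNode before editing the file and restart it afterwards.

```shell
# fix_namespace_id: rewrite the namespaceID line in a DataNode VERSION file.
# Arguments: path to the VERSION file, and the NameNode's current namespaceID.
fix_namespace_id() {
  local version_file="$1" new_id="$2"
  # Replace whatever namespaceID the DataNode currently has with the new one.
  sed -i "s/^namespaceID=.*/namespaceID=${new_id}/" "$version_file"
}

# Typical invocation (DataNode stopped, paths as in the error above):
# fix_namespace_id /app/hadoop/tmp/dfs/data/current/VERSION 1580775695
```

After restarting the DataNode, it should register with the NameNode without the incompatible-namespaceIDs exception.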
answered Aug 5, 2019 by Gitika
