How to run Hadoop 2.1.0 on a Windows system

0 votes

I am new to Hadoop and have run into problems trying to run it on my Windows 7 machine. In particular I am interested in running Hadoop 2.1.0, as its release notes mention that running on Windows is supported. I know that I can try to run 1.x versions on Windows with Cygwin, or even use a prepared VM from, for example, Cloudera, but for several reasons these options are less convenient for me.

Having examined a tarball from http://apache-mirror.rbc.ru/pub/apache/hadoop/common/hadoop-2.1.0-beta/ I found that there really are some *.cmd scripts that can be run without Cygwin. Everything worked fine when I formatted the HDFS partition, but when I tried to run the hdfs namenode daemon I faced two errors. The first, non-fatal, was that winutils.exe could not be found (it really wasn't present in the downloaded tarball). I found the sources of this component in the Apache Hadoop source tree and compiled it with the Microsoft SDK and MSBuild. Thanks to the detailed error message it was clear where to put the executable to satisfy Hadoop. But the second error, which is fatal, doesn't contain enough information for me to solve it:

13/09/05 10:20:09 FATAL namenode.NameNode: Exception in namenode join
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:423)
    at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:952)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
...
13/09/05 10:20:09 INFO util.ExitUtil: Exiting with status 1

Looks like something else needs to be compiled. I'm going to try to build Hadoop from source with Maven, but isn't there a simpler way? Isn't there some option I don't know of that can disable native code and make that tarball usable on Windows?

Thank you.

UPDATED. Yes, indeed. The "homebrew" package contained some extra files, most importantly winutils.exe and hadoop.dll. With these files, the name node and data node started successfully. I think the question can be closed. I didn't delete it in case someone faces the same difficulty.
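For anyone who hits the same UnsatisfiedLinkError, here is a minimal sketch of what I ended up doing, assuming the unpacked distribution lives in D:\hadoop-2.1.0-beta (that path and the window titles are just placeholders, adjust to your own layout):

    rem placeholder path - point HADOOP_HOME at your unpacked distribution
    set HADOOP_HOME=D:\hadoop-2.1.0-beta
    set PATH=%HADOOP_HOME%\bin;%PATH%

    rem winutils.exe and hadoop.dll must sit next to the other files in bin
    copy winutils.exe %HADOOP_HOME%\bin\
    copy hadoop.dll   %HADOOP_HOME%\bin\

    rem format HDFS once, then start each daemon in its own window
    %HADOOP_HOME%\bin\hdfs namenode -format
    start "namenode" %HADOOP_HOME%\bin\hdfs namenode
    start "datanode" %HADOOP_HOME%\bin\hdfs datanode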

UPDATED 2. To build the "homebrew" package I did the following:

  1. Got the sources and unpacked them.
  2. Read BUILDING.txt carefully.
  3. Installed the dependencies:
    3a) Windows SDK 7.1
    3b) Maven (I used 3.0.5)
    3c) JDK (I used 1.7.25)
    3d) ProtocolBuffer (I used 2.5.0 - http://protobuf.googlecode.com/files/protoc-2.5.0-win32.zip). It is enough just to put the compiler (protoc.exe) into one of the PATH folders.
    3e) A set of UNIX command-line tools (I installed Cygwin)
  4. Started the Windows SDK command prompt: Start | All programs | Microsoft Windows SDK v7.1 | ... Command Prompt (I modified this shortcut, adding the /release option to the command line to build release versions of the native code). All the following steps are done from inside this SDK command prompt window.
  5. Set up the environment:

    set JAVA_HOME={path_to_JDK_root}

It seems that JAVA_HOME MUST NOT contain spaces!

set PATH={path_to_maven_bin};%PATH%  
set Platform=x64  
set PATH={path_to_cygwin_bin};%PATH%  
set PATH={path_to_protoc.exe};%PATH%  
  6. Changed dir to the sources root folder (BUILDING.txt warns that there are some limitations on the path length, so the sources root should have a short name - I used D:\hds).
  7. Ran the build:

    mvn package -Pdist -DskipTests

You can try without 'skipTests', but on my machine some tests failed and the build was terminated. It may be connected to the symbolic link issues mentioned in BUILDING.txt.

  8. Picked up the result in hadoop-dist\target\hadoop-2.1.0-beta (the Windows executables and DLLs are in the 'bin' folder).
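To tie steps 4-8 together, this is roughly what the whole session looks like when typed into the SDK command prompt (treat it as a sketch; the {...} values are the same placeholders as above):

    rem run inside the Windows SDK 7.1 command prompt (started with /release)
    set JAVA_HOME={path_to_JDK_root}
    set PATH={path_to_maven_bin};%PATH%
    set Platform=x64
    set PATH={path_to_cygwin_bin};%PATH%
    set PATH={path_to_protoc.exe};%PATH%

    rem keep the source path short to avoid path-length problems
    cd /d D:\hds

    rem build the distribution (drop -DskipTests if the tests pass for you)
    mvn package -Pdist -DskipTests

    rem the result appears in hadoop-dist\target\hadoop-2.1.0-beta, with winutils.exe and hadoop.dll in bin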

Sep 11, 2019 in Big Data Hadoop by nitinrawat895

1 answer to this question.

0 votes

I had the same problem, but with the more recent Hadoop v2.2.0. Here are the steps for solving it:

  1. Build winutils.exe from the sources. The project directory is:

    hadoop-2.2.0-src\hadoop-common-project\hadoop-common\src\main\winutils

    My OS: Windows 7. Tool for building: MS Visual Studio Express 2013 for Windows Desktop (it's free and can be downloaded from http://www.microsoft.com/visualstudio/). Open the Studio, then File -> Open -> winutils.sln. Right-click on the solution on the right side -> Build. You get winutils.exe - put it into Hadoop's bin.

  2. Next, we need to build hadoop.dll. Open

    hadoop-2.2.0-src\hadoop-common-project\hadoop-common\src\main\native\native.sln

    in MS VS and right-click on the solution -> Build. (I also used https://github.com/jerishsd/hadoop-experiments/tree/master/sources along the way.) You also need to copy

    hadoop-2.2.0-src\hadoop-common-project\hadoop-common\target\winutils\Debug\libwinutils.lib

    (the result of step #1) into

    hadoop-2.2.0-src\hadoop-common-project\hadoop-common\target\bin

    And finally, the build operation produces hadoop.dll! Put it into Hadoop's bin as well and run the name node. (A command-line sketch of the same steps follows below.)
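If you prefer the command line to the IDE, the same two builds can also be driven with MSBuild from a Visual Studio command prompt. This is only a sketch: the Debug/x64 configuration names and the output locations are assumptions inferred from the paths above, so check what the solutions actually define:

    rem placeholder: cd into the hadoop-common module of the unpacked sources
    cd /d hadoop-2.2.0-src\hadoop-common-project\hadoop-common

    rem step 1: build winutils.exe
    msbuild src\main\winutils\winutils.sln /p:Configuration=Debug /p:Platform=x64

    rem step 2: copy libwinutils.lib to where the native build expects it, then build hadoop.dll
    copy target\winutils\Debug\libwinutils.lib target\bin\
    msbuild src\main\native\native.sln /p:Configuration=Debug /p:Platform=x64

    rem put both artifacts into Hadoop's bin ({HADOOP_HOME} is a placeholder)
    copy target\winutils\Debug\winutils.exe {HADOOP_HOME}\bin\
    copy target\bin\hadoop.dll {HADOOP_HOME}\bin\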

Hope these steps help.

answered Sep 11, 2019 by ravikiran
