There is no JobTracker in the Hadoop 2.2.0 YARN framework. ...READ MORE
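In YARN the ResourceManager and per-application ApplicationMasters take over the JobTracker's duties, so job monitoring moves to the yarn CLI. A minimal sketch of the equivalent commands:

    yarn application -list          # running applications, roughly the old "hadoop job -list"
    yarn application -status <id>   # status of a single application
    mapred job -list                # MapReduce-specific view on Hadoop 2.x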
LocalFS means it may be your LinuxFS ...READ MORE
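To illustrate the distinction, the same fs shell can address either file system through the URI scheme; the hdfs:// authority below is a placeholder for your own NameNode address:

    hadoop fs -ls file:///tmp                # local (Linux) file system
    hadoop fs -ls hdfs://namenode:8020/tmp   # HDFS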
As per Cloudera, if you install Hadoop ...READ MORE
You can perform an installation or upgrade ...READ MORE
Yes. A comprehensive set of APIs for ...READ MORE
Distributed Cache is an important feature provided ...READ MORE
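As a hedged sketch, a job whose driver goes through ToolRunner/GenericOptionsParser can ship a side file to every task with the generic -files option; the jar, class, and paths below are hypothetical:

    hadoop jar wordcount.jar WordCount -files /local/lookup.txt /input /output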
The official location for Hadoop is the ...READ MORE
import commands
hdir_list = commands.getoutput('hadoop fs -ls hdfs: ...READ MORE
#!/usr/bin/python
from subprocess import Popen, PIPE
cat = Popen(["hadoop", ...READ MORE
Try this: { Configuration config ...READ MORE
First make sure you have ant installed ...READ MORE
If by "using", you mean distributing it, ...READ MORE
You can use a combination of the cat and put commands. Something ...READ MORE
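For example, "-" makes -put read from standard input, so several local files can be streamed into HDFS as one file; the file names here are made up:

    cat part1.txt part2.txt | hadoop fs -put - /user/data/merged.txt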
Make the following changes to the hadoop-env.sh ...READ MORE
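A typical change of this kind is setting JAVA_HOME in hadoop-env.sh; the OpenJDK path below is an assumption you would replace with your own:

    # etc/hadoop/hadoop-env.sh (location varies by distribution)
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64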
In the command, try mentioning the driver ...READ MORE
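If the jar's manifest does not name a main class, the driver class is passed explicitly after the jar; the names and paths below are placeholders:

    hadoop jar myapp.jar com.example.MyDriver /input /output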
You don't have to specify the file name ...READ MORE
The entries in your .bashrc file look ...READ MORE
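For reference, a commonly used shape for those entries, assuming Hadoop is unpacked under /usr/local/hadoop:

    export HADOOP_HOME=/usr/local/hadoop
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin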
Try with the below commands: hadoop fs -copyFromLocal <localsrc> ...READ MORE
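Filled in with placeholder paths, the two directions look like this:

    hadoop fs -copyFromLocal /local/dir/file.txt /user/hdfs/dir/
    hadoop fs -copyToLocal /user/hdfs/dir/file.txt /local/dir/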
The main difference between -cat and -text ...READ MORE
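The difference shows up on compressed files: -cat prints raw bytes, while -text detects the codec (and sequence-file format) and prints decoded text. With a hypothetical gzipped log:

    hadoop fs -cat /logs/app.log.gz    # raw compressed bytes
    hadoop fs -text /logs/app.log.gz   # decompressed, readable output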
Seems like the output file already exists. ...READ MORE
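MapReduce refuses to overwrite an existing output directory, so the usual fix is to remove it (or choose a new path) before rerunning; the path below is a placeholder:

    hadoop fs -rm -r /user/me/output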
Try the following command and see if ...READ MORE
Seems like it is running on default ...READ MORE
One of the big features of Hadoop/map-reduce ...READ MORE
You can use HDFS API like the ...READ MORE
Check that bin/start-all.sh doesn't override JAVA_HOME; put an echo ...READ MORE
SELECT a_id, b, c, count(*) as sumrequests FROM ...READ MORE
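Assuming a hypothetical requests table holding those columns, the full query run from the shell would look something like:

    hive -e "SELECT a_id, b, c, count(*) AS sumrequests
             FROM requests
             GROUP BY a_id, b, c;"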
Make a dummy table that has at least one row. INSERT ...READ MORE
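This trick predates Hive's INSERT ... VALUES support: literal values are selected from the one-row dummy table. A sketch with hypothetical table names:

    hive -e "INSERT INTO TABLE target
             SELECT 1, 'some value' FROM dummy;"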
I suggest you include the below property files ...READ MORE
First, check the permissions of the HDFS directory ...READ MORE
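To inspect and, if needed, relax those permissions (the path and ownership below are placeholders, and chown typically requires the HDFS superuser):

    hadoop fs -ls /user/hive/warehouse
    hadoop fs -chmod -R 755 /user/hive/warehouse
    hadoop fs -chown -R someuser:somegroup /user/hive/warehouse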
If the downtime is not an issue, ...READ MORE
No, the data-locality concept applies to mappers only. Reducers ...READ MORE
I was able to fix the issue. ...READ MORE
It looks like the path is not ...READ MORE
You can do it using regexp_replace. This is ...READ MORE
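regexp_replace(input, pattern, replacement) works on any string column or literal; a quick self-contained check from the shell:

    hive -e "SELECT regexp_replace('aaa-bbb-ccc', '-', '_');"   # aaa_bbb_ccc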
I think you are using a new version ...READ MORE
Yes, you can use the hadoop fsck command to do ...READ MORE
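For instance, to report files, blocks, and block locations under a (placeholder) directory:

    hdfs fsck /user/data -files -blocks -locations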
Try this:
FileSystem fs = FileSystem.get(getConf());
fs.delete(new ...READ MORE
To get all the columns of a ...READ MORE
You can use the DESCRIBE command to ...READ MORE
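With a hypothetical table name, both the short and the detailed forms:

    hive -e "DESCRIBE my_table;"            # columns and types
    hive -e "DESCRIBE FORMATTED my_table;"  # plus storage, location, owner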
I doubt if there is something which ...READ MORE
The problem was solved by zipimport. I then zipped chardet to ...READ MORE
Is it common to see this sentence: ...READ MORE
The use of dfs in your command is deprecated. ...READ MORE
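The warning goes away when the hdfs entry point is used instead:

    hadoop dfs -ls /   # deprecated form that triggers the warning
    hdfs dfs -ls /     # current equivalent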
Name nodes: hdfs getconf -namenodes
Secondary name nodes: hdfs getconf ...READ MORE
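The same tool can also pull individual configuration keys; dfs.replication below is just one example key:

    hdfs getconf -namenodes
    hdfs getconf -secondaryNameNodes
    hdfs getconf -confKey dfs.replication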
Run the command as sudo or add the ...READ MORE
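Running the command as the hdfs superuser is the quick version of that; the user and path names are placeholders:

    sudo -u hdfs hadoop fs -mkdir /user/newuser
    sudo -u hdfs hadoop fs -chown newuser:newuser /user/newuser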
Hey. You can use the following commands ...READ MORE
The most common reason for this is ...READ MORE
hadoop fs -cat /example2/doc1 | wc -l READ MORE
This is because you don't have enough ...READ MORE
Seems like the Hadoop path is missing from java.library.path. ...READ MORE
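A common fix is to point java.library.path at the native libraries before starting Hadoop, assuming they live under $HADOOP_HOME/lib/native:

    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"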