Big Data Hadoop questions
import commands hdir_list = commands.getoutput('hadoop fs -ls hdfs: …
#!/usr/bin/python from subprocess import Popen, PIPE cat = Popen(["hadoop", …
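The two snippets above shell out to the `hadoop` CLI from Python. Note that the `commands` module is Python 2 only; a minimal Python 3 sketch of the same pattern using `subprocess` is below. The helper name is hypothetical, and a local `printf` stands in for `hadoop fs -cat`, which is assumed to be on `PATH` on a real cluster:

```python
from subprocess import Popen, PIPE

def stream_command(cmd):
    """Run a command and yield its stdout lines one at a time."""
    proc = Popen(cmd, stdout=PIPE, text=True)
    for line in proc.stdout:
        yield line.rstrip("\n")
    proc.wait()

# Demonstrated with a local command; on a cluster you would pass
# ["hadoop", "fs", "-cat", "/some/hdfs/path"] instead.
lines = list(stream_command(["printf", "line1\nline2\n"]))
print(lines)
```

Iterating over `proc.stdout` streams the output line by line, which avoids buffering a large HDFS file in memory the way `getoutput` would.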
Try this: { Configuration config …
First, make sure you have Ant installed …
If by "using", you mean distributing it, …
You can use a combination of the cat and put commands. Something …
Make the following changes to the hadoop-env.sh …
In the command, try mentioning the driver …
You don't have to specify the file name …
The entries in your .bashrc file look …
Try the following commands: hadoop fs -copyFromLocal <localsrc> …
The main difference between -cat and -text …
Seems like the output file already exists. …
Try the following command and see if …
Seems like it is running on the default …
One of the big features of Hadoop/MapReduce …
You can use the HDFS API like the …
Check if bin/start-all.sh doesn't override JAVA_HOME; put echo …
SELECT a_id, b, c, COUNT(*) AS sumrequests FROM …
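The GROUP BY pattern in the answer above can be checked locally. Here is a sketch against SQLite; the table name, columns, and sample rows are invented for illustration and are not from the original question:

```python
import sqlite3

# Toy table standing in for the one in the question (names are invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE requests (a_id INTEGER, b TEXT, c TEXT)")
conn.executemany(
    "INSERT INTO requests VALUES (?, ?, ?)",
    [(1, "x", "y"), (1, "x", "y"), (2, "x", "z")],
)

# Same shape as the quoted answer: group on the non-aggregated
# columns and count the rows in each group.
rows = conn.execute(
    "SELECT a_id, b, c, COUNT(*) AS sumrequests "
    "FROM requests GROUP BY a_id, b, c ORDER BY a_id"
).fetchall()
print(rows)
```

Every column in the SELECT list that is not inside an aggregate must appear in the GROUP BY clause, which is why all of a_id, b, and c are listed there.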
Make a dummy table which has at least one row. INSERT …
I suggest you include the below property files …
First, check the permissions of the HDFS directory …
If the downtime is not an issue, …
No, the Data-Locality concept applies to MAPPERS only. Reducers …
It looks like the path is not …
You can do it using regexp_replace. This is …
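Hive's regexp_replace(column, pattern, replacement) behaves much like Python's re.sub, so the substitution idea can be tried out locally. The pattern and sample string below are invented for illustration:

```python
import re

# Hive equivalent (illustrative): SELECT regexp_replace(name, '[^a-zA-Z0-9]', '') FROM t;
# The same substitution with re.sub: strip every non-alphanumeric character.
cleaned = re.sub(r"[^a-zA-Z0-9]", "", "ab-c d_1!")
print(cleaned)
```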
I think you are using a new version …
Yes, you can use the hadoop fsck command to do …
Try this: FileSystem fs = FileSystem.get(getConf()); fs.delete(new …
You can use the DESCRIBE command to …
I doubt if there is something which …
The problem was solved by zipimport. I then zipped chardet to …
The use of dfs in your command is deprecated. …
Name nodes: hdfs getconf -namenodes; secondary name nodes: hdfs getconf …
Run the command as sudo or add the …
Hey, you can use the following commands …
The most common reason for this is …
hadoop fs -cat /example2/doc1 | wc -l
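The pipeline above counts the lines of an HDFS file by piping its contents into wc -l. The same count can be reproduced locally from Python; here a local cat over a temporary file stands in for hadoop fs -cat, and the file contents are made up:

```python
import os
import subprocess
import tempfile

# Create a small local file standing in for the HDFS file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("one\ntwo\nthree\n")
    path = f.name

# Equivalent of: hadoop fs -cat <path> | wc -l
# (on a cluster, swap ["cat", path] for ["hadoop", "fs", "-cat", path])
out = subprocess.run(["cat", path], capture_output=True, text=True).stdout
line_count = out.count("\n")
os.unlink(path)
print(line_count)
```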
This is because you don't have enough …
Seems like the Hadoop path is missing in java.library.path. …
You can try something like this: …
Configured is a default implementation of the Configurable interface - …
Apache Pig provides the CSVExcelStorage class for loading …
Use org.apache.hive.jdbc.HiveDriver as your driver …
You can use the hadoop fs -ls command to …
Here's the solution that worked for me: <dependency> …
You can use this: df.write .option("header", "true") …
You can see the free available space …
The SerDe interface allows you to instruct …
Write Ahead Log (WAL) is a file …