First, enable the detection of slow nodes. ...READ MORE
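If this refers to speculative execution, which re-launches tasks from nodes that appear slow, a hedged sketch of turning it on for a single job (assuming the driver class uses ToolRunner so -D generic options are honoured; jar, class, and paths are placeholders):
# Enable speculative execution for the map and reduce tasks of this job only
hadoop jar wordcount.jar WordCount \
  -D mapreduce.map.speculative=true \
  -D mapreduce.reduce.speculative=true \
  input_dir output_dir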
The hadoop command is not present in ...READ MORE
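The answer is truncated, but the usual cause of a missing hadoop command is that the Hadoop bin directory is not on PATH. A minimal sketch, assuming the installation lives under /usr/local/hadoop (adjust to your environment):
# Point HADOOP_HOME at the install and expose its bin/sbin directories
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
# Verify the command now resolves
hadoop version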
Hello all, I need assistance with one issue. ...READ MORE
I use this code to get the ...READ MORE
I'm using the command java -classpath ${HADOOP_CLASSPATH} -d '/home/shravanth/Desktop/wordcount/classes' ...READ MORE
This error occurs when Hadoop tries ...READ MORE
The NameNode is the primary node in ...READ MORE
Can someone help me with this please, ...READ MORE
Hello @Hemanth, the error String or binary data ...READ MORE
You can use a checksum to compare ...READ MORE
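For illustration, a quick way to compare two HDFS files (paths are placeholders):
# Identical checksum output implies identical content;
# note the checksum also depends on block size and bytes-per-checksum settings
hdfs dfs -checksum /user/hadoop/file_a.txt
hdfs dfs -checksum /user/hadoop/file_b.txt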
This is my input file output_pig_group_education_comma/input_load.txt and its ...READ MORE
The reason why you get this error ...READ MORE
The reason for this error is because ...READ MORE
Make the following changes to your configuration ...READ MORE
Hello all, I am new to Hadoop. I ...READ MORE
1. In order to merge two or ...READ MORE
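As a sketch of one common approach (directory and file names are placeholders), getmerge concatenates every file in an HDFS directory into a single local file, which can then be pushed back to HDFS:
# Merge all part files from the output directory into one local file
hdfs dfs -getmerge /user/hadoop/output /tmp/merged.txt
# Optionally upload the merged copy back to HDFS
hdfs dfs -put /tmp/merged.txt /user/hadoop/merged.txt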
Hi, To start with Hadoop I would suggest ...READ MORE
Please help with the below error on macOS: WARN ...READ MORE
Hey, Yes, Hive supports LIKE operator, but it ...READ MORE
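For example (table and column names are placeholders), LIKE handles simple wildcards, and RLIKE accepts a regular expression when more than one pattern is needed:
# Rows whose name contains "edu"
hive -e "SELECT * FROM employees WHERE name LIKE '%edu%';"
# Rows whose name matches either pattern
hive -e "SELECT * FROM employees WHERE name RLIKE 'edu|data';"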
Hey! The error seems like the problem is ...READ MORE
Hi, Looking into your error I can say ...READ MORE
First of all, let's understand what is ...READ MORE
Hey, We use store command to store the ...READ MORE
Hey Rahul, You have mentioned that you have no ...READ MORE
HDFS is a block structured file system ...READ MORE
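To see this block structure for yourself (the file path is a placeholder):
# Show how the file is split into blocks and which DataNodes hold each replica
hdfs fsck /user/hadoop/data.txt -files -blocks -locations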
Let's start from scratch. Hadoop basically consists of three ...READ MORE
Hi, You can download all the versions you ...READ MORE
Hi. Here's what you need. Follow the below ...READ MORE
Download Sqoop from this link: http://www.apache.org/dyn/closer.lua/sqoop/1.4.7 This link is ...READ MORE
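A hedged sketch of the download-and-unpack step, assuming the 1.4.7 binary build for Hadoop 2.6 (the exact archive name can differ by mirror and build):
# Fetch the release and extract it to /usr/local
wget https://archive.apache.org/dist/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
tar -xzf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /usr/local/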
$ sqoop job --delete jobname But be careful ...READ MORE
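For context, it helps to list and inspect saved jobs before deleting one (the job name is a placeholder):
# Show all saved Sqoop jobs
sqoop job --list
# Inspect the definition of one job
sqoop job --show jobname
# Delete it once you are sure
sqoop job --delete jobname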
You can use the import-all-tables option and along with ...READ MORE
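A minimal sketch, with connection details and directories as placeholders:
# Import every table of the database into one HDFS warehouse directory
sqoop import-all-tables \
  --connect jdbc:mysql://localhost/retail_db \
  --username retail_user \
  --password-file /user/hadoop/.mysql.pass \
  --warehouse-dir /user/hadoop/retail \
  -m 4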
Hi @lucky! The latest version of Sqoop available ...READ MORE
Try using Spark API to append the ...READ MORE
The Cloudera Connect Partner Program, more than ...READ MORE
Try these steps (make necessary changes): First upload ...READ MORE
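For the upload step, a hedged sketch (local and HDFS paths are placeholders):
# Create the target directory and copy the local file into HDFS
hdfs dfs -mkdir -p /user/hadoop/input
hdfs dfs -put /home/hadoop/data.txt /user/hadoop/input/
# Confirm the file landed where the job expects it
hdfs dfs -ls /user/hadoop/input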
Please refer to the below commands: student = ...READ MORE
The process to perform incremental data load ...READ MORE
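Assuming the incremental load is done with Sqoop, a sketch of append-mode import (connection details, table, and column are placeholders):
# Pull only rows whose order_id is greater than the recorded last value
sqoop import \
  --connect jdbc:mysql://localhost/retail_db \
  --username retail_user -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --incremental append \
  --check-column order_id \
  --last-value 0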
I executed the same code and it ...READ MORE
First list the running services: chkconfig --list And then ...READ MORE
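For example, on a SysV-style system (the service name is a placeholder):
# Show every service and its runlevel states
chkconfig --list
# Stop a service from starting at boot
chkconfig httpd off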
Follow these steps: Stop namenode Delete the datanode directory ...READ MORE
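A hedged sketch of those steps on a single-node setup (the DataNode directory is a placeholder; check dfs.datanode.data.dir in hdfs-site.xml for the real path):
# Stop HDFS daemons
stop-dfs.sh
# Clear the old DataNode storage
rm -rf /usr/local/hadoop/hdfs/datanode/*
# Reformat the NameNode and bring HDFS back up
hdfs namenode -format
start-dfs.sh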
import commands hdir_list = commands.getoutput('hadoop fs -ls hdfs: ...READ MORE
#!/usr/bin/python from subprocess import Popen, PIPE cat = Popen(["hadoop", ...READ MORE
SELECT a_id, b, c, count(*) as sumrequests FROM ...READ MORE
The use of dfs in your command is "Deprecated". ...READ MORE
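For example, assuming the original command was of the hadoop dfs form, the non-deprecated equivalent is:
# hdfs dfs replaces the deprecated hadoop dfs for file system operations
hdfs dfs -ls /user/hadoop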
You can try something like this: ...READ MORE
You cannot use column aliases in ...READ MORE
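As an illustration of the usual workaround (assuming Hive; table and column names are placeholders), either repeat the expression in the WHERE clause or wrap the query so the alias comes from a subquery:
# The alias "total" is only visible outside the inner query
hive -e "SELECT id, total FROM (SELECT id, price * quantity AS total FROM orders) t WHERE t.total > 100;"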
You can use a Writable, something like ...READ MORE
It is now possible to insert like ...READ MORE
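If this is about Hive, row-level INSERT ... VALUES is available from version 0.14 onward; a sketch with a placeholder table:
hive -e "INSERT INTO TABLE students VALUES (1, 'Asha', 'Hadoop');"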
The default scheduler in Hadoop is JobQueueTaskScheduler, which is ...READ MORE