The Secondary namenode is mainly used as a ...READ MORE
Hi @judy, It is possible to simplify Hadoop deployments ...READ MORE
Maybe you don't have the install point ...READ MORE
Hey, This Hadoop fs command appends single sources ...READ MORE
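For reference, the same append can also be done programmatically through the FileSystem API. This is only a minimal sketch; the target path is hypothetical, the file must already exist, and the cluster must permit appends.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsAppendExample {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            // Hypothetical target path; programmatic counterpart of "hadoop fs -appendToFile".
            try (FSDataOutputStream out = fs.append(new Path("/user/data/target.txt"))) {
                out.writeBytes("appended line\n");
            }
        }
    }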
Hey, A Hive subquery is a select expression enclosed ...READ MORE
Hi, You need to create your own custom ...READ MORE
Suppose you need to load this in ...READ MORE
Yes, you can. In fact many people ...READ MORE
Name nodes: hdfs getconf -namenodes
Secondary name nodes: hdfs getconf ...READ MORE
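The same addresses can also be read from the client configuration in Java. A minimal sketch, assuming core-site.xml and hdfs-site.xml are on the classpath; the property names are the standard HDFS keys.

    import org.apache.hadoop.conf.Configuration;

    public class NameNodeConfig {
        public static void main(String[] args) {
            Configuration conf = new Configuration(); // picks up core-site.xml / hdfs-site.xml
            System.out.println("NameNode (fs.defaultFS): " + conf.get("fs.defaultFS"));
            System.out.println("Secondary NameNode: " + conf.get("dfs.namenode.secondary.http-address"));
        }
    }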
The issue that you might be getting ...READ MORE
First you have to have the file ...READ MORE
In Hadoop v2.7.1, if you open the ...READ MORE
Try this instead: select from_unixtime(unix_timestamp()); If you have an ...READ MORE
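If you need the same query from application code, here is a minimal sketch over Hive JDBC. The HiveServer2 endpoint, user name, and database are placeholders, not taken from the answer.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveCurrentTime {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // Hypothetical HiveServer2 endpoint; adjust host, port and credentials for your cluster.
            try (Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "hive", "");
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery("select from_unixtime(unix_timestamp())")) {
                if (rs.next()) {
                    System.out.println("Current time from Hive: " + rs.getString(1));
                }
            }
        }
    }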
COUNT is part of Pig: LOGS= LOAD 'log'; LOGS_GROUP= ...READ MORE
Multiple files are not stored in a ...READ MORE
Hey, The suspend option suspends a workflow job in RUNNING status. After ...READ MORE
This command should be executed in MySQL ...READ MORE
Hey, Data in Hive tables reside on HDFS, ...READ MORE
Hi@akhtar, Before starting the HBase shell, make sure you ...READ MORE
To find this file, your HADOOP_CONF_DIR env ...READ MORE
Hey, Yes, HBase is known to be a ...READ MORE
Hi@akhtar, Here you have to add the below ...READ MORE
Hi. You can connect Sqoop to MySql ...READ MORE
Hey, As the error suggested that you have ...READ MORE
job.setOutputValueClass will set the types expected as ...READ MORE
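A minimal driver sketch showing where setOutputValueClass fits. The Text/IntWritable types are illustrative only, not prescribed by the answer.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;

    public class DriverSketch {
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            // Types of the final (reducer) output key/value pairs.
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            // If the mapper emits different types, declare them separately.
            job.setMapOutputKeyClass(Text.class);
            job.setMapOutputValueClass(IntWritable.class);
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }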
Hey, Creating children is similar to creating new ...READ MORE
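Assuming the excerpt refers to ZooKeeper znodes, here is a minimal Java sketch of creating a child node; the ensemble address and paths are hypothetical.

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class CreateChildZnode {
        public static void main(String[] args) throws Exception {
            // Hypothetical ensemble address; 3000 ms session timeout, no-op watcher.
            ZooKeeper zk = new ZooKeeper("localhost:2181", 3000, event -> { });
            // Creating a child is the same create() call, just with the parent in the path.
            zk.create("/parent/child1", "data".getBytes(),
                      ZooDefs.Ids.OPEN_ACL_UNSAFE, CreateMode.PERSISTENT);
            zk.close();
        }
    }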
Hello @Hemanth, the error String or binary data ...READ MORE
Hi, Yes, you can do it by using ...READ MORE
I would recommend you to use FileSystem.rename(). ...READ MORE
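A minimal sketch of that call, with hypothetical paths; rename() also works as a move within HDFS.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsRenameExample {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            // Hypothetical source/destination paths.
            boolean ok = fs.rename(new Path("/user/data/old_name.txt"),
                                   new Path("/user/data/new_name.txt"));
            System.out.println("Renamed: " + ok);
        }
    }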
The NameNode and the JobTracker are single ...READ MORE
Seems like the Hadoop path is missing from java.library.path. ...READ MORE
Hi@sivachandran, You can monitor the metrics of your ...READ MORE
Hi@akhtar, This type of error indicates a write ...READ MORE
You can use the following method: copy to ...READ MORE
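One common reading of this truncated answer is the copy-to-local, edit, copy-back pattern. A hedged Java sketch with hypothetical paths:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CopyToLocalExample {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            // Pull the file down to the local filesystem first...
            fs.copyToLocalFile(new Path("/user/data/input.txt"), new Path("/tmp/input.txt"));
            // ...and, after editing locally, push it back (delSrc=false, overwrite=true).
            fs.copyFromLocalFile(false, true, new Path("/tmp/input.txt"), new Path("/user/data/input.txt"));
        }
    }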
import commands
hdir_list = commands.getoutput('hadoop fs -ls hdfs: ...READ MORE
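The excerpt shells out via Python 2's deprecated commands module; the same directory listing can be done directly against the FileSystem API. A minimal Java sketch with a hypothetical directory:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHdfsDirectory {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            // Prints one entry per file, much like "hadoop fs -ls".
            for (FileStatus status : fs.listStatus(new Path("/user/data"))) {
                System.out.println(status.getPath() + "\t" + status.getLen());
            }
        }
    }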
Please run below mentioned command. It will ...READ MORE
Definition: HUE is an open-source SQL Cloud Editor, ...READ MORE
The problem is with the dependencies. The ...READ MORE
You can use the hdfs command: hdfs fs ...READ MORE
Hi@Som, This error is not related to Prometheus ...READ MORE
Outer Bag: An outer bag is nothing but ...READ MORE
Try this:
val text = sc.wholeTextFiles("student/*")
text.collect() ...READ MORE
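The excerpt is Scala; an equivalent Java sketch is below. The "student/*" path comes from the excerpt, while the local master is an assumption for testing only.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class WholeTextFilesExample {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("wholeTextFiles").setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // Each element is a (filePath, fileContent) pair, one per file under the directory.
            JavaPairRDD<String, String> files = sc.wholeTextFiles("student/*");
            files.collect().forEach(t -> System.out.println(t._1()));
            sc.close();
        }
    }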
Hi, I can see the error arose because the count ...READ MORE
We have to use Sqoop-HCatalog Integration here. ...READ MORE
You can solve this by adding the below ...READ MORE
Hi. This is the code I used ...READ MORE
Well, it's so easy. Just enter the below ...READ MORE
Hi@akhtar, To configure the Hadoop Client, you need ...READ MORE
I think you have upgraded CDH. This ...READ MORE
Spark is capable of performing in three ...READ MORE