Spark has much lower per job and ...READ MORE
Follow these steps: STEP 1: stop Hadoop ...READ MORE
I have set up a multi-node Hadoop ...READ MORE
Hey, The key difference between both the components ...READ MORE
Refer to this example: Step 1: Check table test1 ...READ MORE
This error is thrown when the parameters ...READ MORE
--Create Hive external Table for existing data CREATE ...READ MORE
Try using the below build.sbt, code.scala and command ...READ MORE
This particular exception is related to Hive logs. ...READ MORE
Your doubt is quite an interesting one. Yes, ...READ MORE
Refer to the below when you are ...READ MORE
Your data node is not running that ...READ MORE
Suppose you want to kill the jobs ...READ MORE
The tables look fine. Follow these steps ...READ MORE
-put and -copyFromLocal are almost the same command ...READ MORE
Each reducer uses an OutputFormat to write ...READ MORE
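As a rough illustration of what a reducer-side OutputFormat does, here is a minimal Python sketch. The tab-separated, one-record-per-line layout mirrors Hadoop's default TextOutputFormat; the function name and sample records are made up for illustration.

```python
def write_text_output(records, path):
    """Write (key, value) pairs one per line, tab-separated,
    mimicking the layout of Hadoop's default TextOutputFormat."""
    with open(path, "w") as out:
        for key, value in records:
            out.write(f"{key}\t{value}\n")

# Each reducer would produce one such file, e.g. part-r-00000.
write_text_output([("apple", 3), ("pear", 1)], "part-r-00000")
```

In a real job, the OutputFormat also decides the output file naming and the committer behavior; this sketch only shows the record serialization.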
InputSplits are created by logical division of ...READ MORE
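The logical division described above can be sketched in Python: the file bytes are never physically cut, only byte ranges are computed. This is a simplified sketch (fixed-size splits, ignoring Hadoop's split-slop factor and record boundary handling).

```python
def compute_splits(file_size, split_size):
    """Return (offset, length) pairs describing logical InputSplits.
    The file itself is never cut; these are just byte ranges that
    individual mappers will read."""
    splits = []
    offset = 0
    while offset < file_size:
        length = min(split_size, file_size - offset)
        splits.append((offset, length))
        offset += length
    return splits

# A 300 MB file with a 128 MB split size yields three logical splits.
print(compute_splits(300, 128))  # [(0, 128), (128, 128), (256, 44)]
```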
Hive is a data warehouse infrastructure tool ...READ MORE
Below is an example query which you ...READ MORE
Both codes contain different API of Map ...READ MORE
Hi, We can use a normal insert query ...READ MORE
It's because that is the syntax. This ...READ MORE
A value with a wrong datatype causes ...READ MORE
You can use the SUBSTR() in hive ...READ MORE
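Hive's SUBSTR() is 1-based and accepts a negative start that counts back from the end of the string. The helper below is a Python sketch mimicking that behavior for illustration (edge cases such as start = 0 are ignored here).

```python
def hive_substr(s, start, length=None):
    """Rough mimic of Hive's 1-based SUBSTR(str, start[, length]);
    a negative start counts back from the end of the string."""
    idx = start - 1 if start > 0 else len(s) + start
    end = len(s) if length is None else idx + length
    return s[max(idx, 0):end]

print(hive_substr("hadoop", 2, 3))  # "ado"
print(hive_substr("hadoop", -3))    # "oop"
```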
How to exclude tables in sqoop if ...READ MORE
For avro you can follow the format ...READ MORE
You can use this: import org.apache.spark.sql.functions.struct val df = ...READ MORE
After creating the tables a1 and b1 ...READ MORE
You have to do some configurations as ...READ MORE
Each file Schema = 150bytes Block schema ...READ MORE
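The 150-byte figure in the snippet above is the commonly cited rule of thumb for NameNode heap: roughly 150 bytes per file object and per block object. A small Python sketch of that back-of-the-envelope estimate (the function name is made up; real usage also depends on directory objects and JVM overhead):

```python
# Rule of thumb: the NameNode keeps roughly 150 bytes of heap
# per file object and per block object it tracks.
BYTES_PER_OBJECT = 150

def namenode_memory(num_files, blocks_per_file):
    """Estimate NameNode heap used by file + block metadata.
    Replication adds block locations, not extra block objects,
    so it is not counted in this sketch."""
    return (num_files + num_files * blocks_per_file) * BYTES_PER_OBJECT

# One million single-block files -> about 300 MB of NameNode heap.
print(namenode_memory(1_000_000, 1))  # 300000000
```

This is why many small files strain the NameNode far more than a few large ones of the same total size.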
Try this instead: select from_unixtime(unix_timestamp()); If you have an ...READ MORE
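For reference, Hive's from_unixtime() formats an epoch-seconds value using the default pattern 'yyyy-MM-dd HH:mm:ss'. The Python sketch below mimics that output shape (local timezone, like Hive's default session behavior):

```python
import time

def from_unixtime(ts):
    """Mimic Hive's from_unixtime(bigint): format an epoch-seconds
    value as 'yyyy-MM-dd HH:mm:ss', Hive's default pattern."""
    return time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(ts))

print(from_unixtime(0))            # epoch start in the local timezone
print(from_unixtime(time.time()))  # current timestamp, as in the query above
```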
FileInputFormat : Base class for all file-based InputFormats Other ...READ MORE
Here's a list of Input Formats: CombineFileInputFormat CombineS ...READ MORE
The command you are typing is incorrect. ...READ MORE
You can do it using the following ...READ MORE
It is straightforward and you can achieve ...READ MORE
It is very straightforward; no need ...READ MORE
With MR2, we should now set conf.set("mapreduce.map.output.compress", true) conf.set("mapreduce.output.fileoutputformat.compress", ...READ MORE
I would prefer you to download million songs ...READ MORE
I need to write some MapReduce pattern ...READ MORE
To get full query running for the ...READ MORE
Enter the below command in the terminal ...READ MORE
Unfortunately, this can't be achieved with open ...READ MORE
Please make sure you connect to spark2-shell ...READ MORE
First you have to have the file ...READ MORE
You need to solve the issue which ...READ MORE
For the above requirement, the memory consumption ...READ MORE
You can use the following code: A = ...READ MORE
You can use the following sample code for ...READ MORE
Hi, You can do one thing, first delete all ...READ MORE