Your question is quite an interesting one. Yes, ...
Hey, to create a sequential znode, add the -s flag as shown ...
Yes, there is a trash policy in ...
You cannot use column aliases in ...
Hi @akhtar, you need to set JAVA_HOME and HADOOP_HOME ...
I will redirect you to a link ...
Run: sudo dpkg -l | grep hadoop to ...
--fields-terminated-by <char> ...
Hi @fwood, according to your configuration, you didn't set ...
hadoop fs -stat is a Hadoop command used ...
Follow these steps: A. Create the database: create database retail123; B. ...
The distributed copy command, distcp, is a ...
Hi @akhtar, you need to uninstall SSH and reinstall ...
First, copy the data into HDFS. Then create ...
Actually, dfs.data.dir and dfs.name.dir have to point ...
Basically, when we say "namespace" we mean ...
Hi @akhtar, you can rename a table in ...
The first way is to use start-all.sh & ...
When you try to read a Parquet ...
A record is a duplicate if there are ...
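The snippet above defines a duplicate in terms of matching column values. A minimal sketch of that idea in plain Python — the function name, record layout, and key columns are all made up for illustration:

```python
# Hypothetical sketch: a record counts as a duplicate when all of its
# chosen key columns match a record already seen.
def drop_duplicates(records, key_columns):
    """Keep the first record for each combination of key-column values."""
    seen = set()
    unique = []
    for record in records:
        key = tuple(record[col] for col in key_columns)
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

rows = [
    {"id": 1, "name": "a"},
    {"id": 1, "name": "a"},  # duplicate of the first row
    {"id": 2, "name": "b"},
]
print(drop_duplicates(rows, ["id", "name"]))  # two unique rows remain
```

In Hive the same effect is usually achieved with GROUP BY over the key columns or with ROW_NUMBER() in a window, rather than in application code.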
Hive is a high-level language to analyze ...
Sequence files are binary files containing serialized ...
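To make the "binary files containing serialized records" idea concrete, here is a deliberately simplified sketch of length-prefixed key/value records. This is NOT the real Hadoop SequenceFile layout (which adds a header, compression metadata, and sync markers); it only illustrates the general shape:

```python
import struct
import io

# Simplified illustration: binary, length-prefixed, serialized key/value
# records. Not the actual Hadoop SequenceFile on-disk format.
def write_records(buf, pairs):
    for key, value in pairs:
        k, v = key.encode(), value.encode()
        buf.write(struct.pack(">II", len(k), len(v)))  # key and value lengths
        buf.write(k)
        buf.write(v)

def read_records(buf):
    pairs = []
    while True:
        header = buf.read(8)
        if not header:
            break
        klen, vlen = struct.unpack(">II", header)
        pairs.append((buf.read(klen).decode(), buf.read(vlen).decode()))
    return pairs

buf = io.BytesIO()
write_records(buf, [("k1", "v1"), ("k2", "v2")])
buf.seek(0)
print(read_records(buf))  # round-trips the two key/value pairs
```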
You can use these commands. For the namenode: ./hadoop-daemon.sh start ...
In HDFS, 8020 is the IPC port; you ...
Hey, basically, with the following query we can ...
I will drop the answer in the ...
Try this: SELECT ID1, Sub FROM tableName LATERAL VIEW ...
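Hive's LATERAL VIEW explode() turns one row holding an array column into one output row per array element. A plain-Python analogue of that reshaping (the table and column names here are invented, loosely following the snippet's ID1/Sub naming):

```python
# Each input row has an array column; "exploding" it yields one flat row
# per array element, as LATERAL VIEW explode() does in HiveQL.
rows = [
    {"ID1": 1, "subs": ["math", "physics"]},
    {"ID1": 2, "subs": ["chemistry"]},
]

exploded = [
    {"ID1": row["ID1"], "Sub": sub}
    for row in rows
    for sub in row["subs"]
]
print(exploded)  # three flat rows from two nested ones
```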
You can try this: CREATE TABLE temp ...
I use this code to get the ...
hadoop.tmp.dir (a base for other temporary directories) is ...
You can't directly create a Parquet table. ...
Hi @akhtar, this error occurs when the Hive shell is started before ...
Hey, you can try something like this: df.write.partitionBy('year', ...
The simplest way to check the Hadoop version ...
#!/usr/bin/python
from subprocess import Popen, PIPE
cat = Popen(["hadoop", ...
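The truncated snippet above pipes a hadoop command through subprocess.Popen. The same pattern is shown below with a local echo standing in for the cluster command, so it runs anywhere; on a real setup you would substitute something like ["hadoop", "fs", "-cat", "/path/in/hdfs"] (that path is a placeholder):

```python
from subprocess import Popen, PIPE

# Run a command and read its stdout line by line; `echo` stands in for
# `hadoop fs -cat` so this sketch works without a cluster.
proc = Popen(["echo", "line1"], stdout=PIPE)
output, _ = proc.communicate()
for line in output.decode().splitlines():
    print(line)
```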
"hdfs dfs -pwd" does not exist because ...READ MORE
Hey, Removes a specified znode and recursively all ...READ MORE
Changing location requires 2 steps: 1.) Change location ...READ MORE
If you have one avro file and ...READ MORE
Hi@akhtar, Here you have add the following properties ...READ MORE
In Hadoop, Speculative Execution is a process ...READ MORE
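A toy model of the speculative-execution idea: when one attempt of a task might straggle, launch a duplicate attempt and take whichever finishes first (Hadoop then kills the slower attempt; this sketch simply ignores it). The task and its timings are invented for illustration:

```python
import concurrent.futures
import time

def task(delay):
    # Stand-in for a map/reduce task; `delay` models how slow the host is.
    time.sleep(delay)
    return f"done after {delay}s"

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    original = pool.submit(task, 0.5)     # "straggling" attempt
    speculative = pool.submit(task, 0.1)  # duplicate attempt on another slot
    done, _ = concurrent.futures.wait(
        [original, speculative],
        return_when=concurrent.futures.FIRST_COMPLETED,
    )
    winner = done.pop().result()
print(winner)
```

The job makes progress at the speed of the faster attempt, which is exactly the point of speculating on slow tasks.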
When you executed your code earlier, you ...
You have to do some configuration as ...
Hi @Raj, I think you need to grant permission ...
Hey, yes, Hive now supports IN and EXISTS, ...
Hey, first, you need to have a running ...
The behaviour that you are seeing is ...
Hi @Arun, by default the parameter "dfs.datanode.failed.volumes.tolerated" is set ...
The use of dfs in your command is deprecated. ...
Make a dummy table which has at least one row. INSERT ...
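The dummy-table trick exists because older Hive versions had no INSERT ... VALUES: constants were inserted via INSERT ... SELECT <literals> FROM dummy, where dummy holds exactly one row. A sketch of the same trick with SQLite standing in for Hive (table and column names invented):

```python
import sqlite3

# One-row dummy table lets `INSERT ... SELECT <constants> FROM dummy`
# emit exactly one row of literals, mimicking the old Hive workaround.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dummy (x INTEGER)")
conn.execute("INSERT INTO dummy VALUES (1)")  # exactly one row
conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")
conn.execute("INSERT INTO target SELECT 42, 'hello' FROM dummy")
print(conn.execute("SELECT * FROM target").fetchall())
```

Modern Hive (0.14+) accepts INSERT INTO target VALUES (...) directly, so the dummy table is only needed on old versions.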