Hi@akhtar, You tried to read a file from your ...
Sqoop allows you to import the file ...
Your doubt is quite an interesting one. Yes, ...
Yes, there is a trash policy in ...
You cannot use column aliases in ...
Hi@fwood, According to your configuration, you didn't set ...
hadoop fs -stat is a Hadoop command used ...
Hey, To create a Sequential znode, add the -s flag as shown ...
Do a: sudo dpkg -l | grep hadoop to ...
I will redirect you to a link ...
--fields-terminated-by <char>
Basically when we say Namespace we mean ...
You can use these commands. For namenode: ./hadoop-daemon.sh start ...
Follow these steps: A. Create Database ------------------ create database retail123; B. ...
Hi@akhtar, You need to uninstall SSH and reinstall ...
Actually dfs.data.dir and dfs.name.dir have to point ...
First way is to use start-all.sh & ...
Hive is a high-level language to analyze ...
When you try to read a parquet ...
The distributed copy command, distcp, is a ...
First, copy data into HDFS. Then create ...
A record is duplicate if there are ...
In HDFS 8020 is IPC port, you ...
Sequence files are binary files containing serialized ...
Hey, Basically, with the following query, we can ...
I use this code to get the ...
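The teaser above defines a duplicate as a record whose fields all match another record's. A minimal Python sketch of that check, using hypothetical sample data:

```python
# Deduplicate records: a record is a duplicate when every one of its
# fields matches an earlier record's fields (sample data is hypothetical).
records = [
    ("alice", "2020-01-01", 10),
    ("bob", "2020-01-02", 20),
    ("alice", "2020-01-01", 10),  # duplicate of the first record
]

seen = set()
unique = []
for rec in records:
    if rec not in seen:  # all fields are compared at once via the tuple
        seen.add(rec)
        unique.append(rec)

print(len(unique))  # 2
```

The same all-fields comparison is what a GROUP BY over every column (or a DISTINCT) performs in SQL-on-Hadoop engines.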
Hi@akhtar, This error occurs when the hive-shell is started before ...
Try this SELECT ID1, Sub FROM tableName lateral view ...
Hi, Hadoop and Hive have their individual commands. ...
I will drop the answer in the ...
hadoop.tmp.dir (A base for other temporary directories) is ...
You can try this: CREATE TABLE temp ...
You can't directly create a parquet table. ...
The simplest way to check Hadoop version ...
Hi@Raj, I think you need to provide permission ...
"hdfs dfs -pwd" does not exist because ...
Changing location requires 2 steps: 1.) Change location ...
Hey, you can try something like this: df.write.partitionBy('year', ...
Hey, Removes a specified znode and recursively all ...
The use of dfs in your command is "Deprecated". ...
Hi@akhtar, Here you have to add the following properties ...
#!/usr/bin/python from subprocess import Popen, PIPE cat = Popen(["hadoop", ...
You have to do some configurations as ...
Hey, Yes, now Hive supports IN or EXIST, ...
When you executed your code earlier, you ...
In Hadoop, Speculative Execution is a process ...
If you have one avro file and ...
Hey, First, you need to have a running ...
Hi@Arun, By default, the parameter "dfs.datanode.failed.volumes.tolerated" is set ...
Make a dummy table which has at least one row. INSERT ...
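The truncated snippet above uses subprocess.Popen to stream a command's stdout from Python. A runnable sketch of that pattern follows; "echo" stands in for the real ["hadoop", "fs", "-cat", "/path/to/file"] invocation, which would need a live cluster:

```python
#!/usr/bin/python
# Pattern from the snippet above: stream a subprocess's stdout line by
# line. "echo" is a local stand-in for "hadoop fs -cat <path>", which
# requires a running Hadoop cluster.
from subprocess import Popen, PIPE

cat = Popen(["echo", "line1\nline2"], stdout=PIPE)
lines = [line.decode().rstrip("\n") for line in cat.stdout]
cat.wait()
print(lines)
```

Iterating over cat.stdout avoids loading the whole file into memory, which matters when the file in HDFS is large.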
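The one-row dummy-table trick in the teaser above lets you INSERT constant values by selecting them FROM a single-row table (useful in engines, such as older Hive, that lack INSERT ... VALUES). An illustration using sqlite3; the table and column names are hypothetical:

```python
# One-row "dummy" table trick: SELECT constant values FROM a table that
# holds exactly one row, so the INSERT emits exactly one row of literals.
# Table/column names are hypothetical; the teaser applies this to Hive.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE dual (x INTEGER)")
con.execute("INSERT INTO dual VALUES (1)")  # the single row
con.execute("CREATE TABLE target (name TEXT, qty INTEGER)")
# One constant row is selected FROM dual and lands in target:
con.execute("INSERT INTO target SELECT 'apple', 3 FROM dual")
rows = con.execute("SELECT * FROM target").fetchall()
print(rows)  # [('apple', 3)]
```

If the dummy table held N rows, the same INSERT would emit the constants N times, which is why the trick requires exactly one row.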