Suppose we load data into a Hive table using the steps below:
1.) create a plain Hive table;
2.) load data into the Hive table;
3.) create a Hive table stored as Parquet;
4.) load the Hive Parquet table from the first Hive table;
Will the file be a normal .dat file for the plain Hive table and a Parquet file for the Hive Parquet table, and will the Parquet file be unreadable with the hdfs dfs -cat command?
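The four steps above can be sketched in HiveQL as follows; the table names, columns, and file path are hypothetical placeholders, not taken from the question:

```sql
-- 1.) Create a plain Hive table (stored as text by default; columns are assumptions)
CREATE TABLE employee_txt (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- 2.) Load a local delimited file into it (path is hypothetical)
LOAD DATA LOCAL INPATH '/tmp/employee.dat' INTO TABLE employee_txt;

-- 3.) Create a Hive table stored as Parquet
CREATE TABLE employee_parquet (id INT, name STRING)
STORED AS PARQUET;

-- 4.) Load the Parquet table from the text table
INSERT INTO TABLE employee_parquet SELECT * FROM employee_txt;
```

With this setup, the file backing employee_txt is plain delimited text, while the file backing employee_parquet is in the binary Parquet format.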
Yes. When you try to read a Parquet file using the cat command, the output appears as unreadable binary data, because Parquet is a binary columnar format rather than plain text. Therefore, the Parquet file will not be in a human-readable format.
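To confirm this on a cluster, you can compare the raw output of hdfs dfs -cat with a Parquet-aware tool such as parquet-tools; the warehouse path and file name below are hypothetical, and parquet-tools must be available on the node:

```shell
# Raw cat prints unreadable binary; note the PAR1 magic bytes
# at the start and end of every valid Parquet file.
hdfs dfs -cat /user/hive/warehouse/employee_parquet/000000_0

# parquet-tools decodes the same file into readable rows.
hadoop jar parquet-tools.jar cat /user/hive/warehouse/employee_parquet/000000_0
```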