Converting CSV files to Parquet

Hi Team,
Could you please help me with the following scenario? I have an incremental table stored in CSV format. How can I convert it to Parquet, so that the final output is in Parquet file format? Please help me with an example.
Jul 30 in Big Data Hadoop by Shri

1 answer to this question.


-- Create a Hive external table over the existing CSV data

CREATE EXTERNAL TABLE logs_csv
(
date_time string,
category string,
pdp_ip string,
pdp_port string,
dns_ip string,
cust_browsed_ip string,
country string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 'hdfs://xxxx-xxxxxx/ftplogs';

-- MSCK REPAIR is only needed if the table is partitioned; it registers
-- partition directories already present under the table location.
MSCK REPAIR TABLE logs_csv;

-- Now create an external table stored in Parquet format

CREATE EXTERNAL TABLE logs_parquet (
date_time string,
category string,
pdp_ip string,
pdp_port string,
dns_ip string,
cust_browsed_ip string,
country string)
STORED AS PARQUET
LOCATION 'hdfs://xxxx-xxxxx/logsparquet';

-- Convert and export. This step can run for a long time, depending on
-- your data size and cluster size.

INSERT OVERWRITE TABLE logs_parquet
SELECT date_time, category, pdp_ip, pdp_port, dns_ip, cust_browsed_ip, country
FROM logs_csv;
answered Jul 30 by Yogi
