Converting CSV files to Parquet

Hi Team,
Could you please help me solve the scenario below? I have an incremental table stored in CSV format. How can I convert it to Parquet? The final output should be in Parquet file format. Please help me with an example.
Jul 30, 2019 in Big Data Hadoop by Shri

1 answer to this question.


-- Create a Hive external table over the existing CSV data

CREATE EXTERNAL TABLE logs_csv
(
date_time string,
category string,
pdp_ip string,
pdp_port string,
dns_ip string,
cust_browsed_ip string,
country string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 'hdfs://xxxx-xxxxxx/ftplogs';

-- MSCK REPAIR is only needed if logs_csv is partitioned; it registers the
-- existing partition directories with the metastore.
msck repair table logs_csv;
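
-- As a quick sanity check before converting, you can confirm Hive parses the CSV
-- files correctly (a minimal check, assuming the logs_csv table defined above):

SELECT * FROM logs_csv LIMIT 10;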

-- Now create an external table stored in Parquet format

CREATE EXTERNAL TABLE logs_parquet (
date_time string,
category string,
pdp_ip string,
pdp_port string,
dns_ip string,
cust_browsed_ip string,
country string)
STORED AS PARQUET
LOCATION 'hdfs://xxxx-xxxxx/logsparquet';
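
-- Optionally, if you want the Parquet output compressed, you can set the codec on
-- the target table. This is a sketch; the parquet.compression table property is
-- supported on recent Hive versions, so verify it on yours before relying on it.

ALTER TABLE logs_parquet SET TBLPROPERTIES ('parquet.compression'='SNAPPY');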

-- Time to convert and export. This step can run for a long time, depending on your data size and cluster size.

INSERT OVERWRITE TABLE logs_parquet
SELECT date_time, category, pdp_ip, pdp_port, dns_ip, cust_browsed_ip, country
FROM logs_csv;
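
-- Since the source table is incremental, INSERT OVERWRITE rewrites the whole Parquet
-- table on every run. If you only want to append newly arrived rows on later runs,
-- INSERT INTO works instead. A sketch, assuming new rows can be identified by a
-- date_time cutoff you supply (the '2019-07-30' value here is hypothetical):

INSERT INTO TABLE logs_parquet
SELECT date_time, category, pdp_ip, pdp_port, dns_ip, cust_browsed_ip, country
FROM logs_csv
WHERE date_time > '2019-07-30';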
answered Jul 30, 2019 by Yogi
