How to import data in sqoop as a Parquet file?

Hi. I am importing some data with the sqoop import command. For my requirement, I need the imported data to be stored as a Parquet file. How can I do that?
May 15 in Big Data Hadoop by Janan

1 answer to this question.

Sqoop lets you import data in several different file formats. To import the data as a Parquet file, use the --as-parquetfile switch along with your sqoop import command.

$ sqoop import <options> --as-parquetfile
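
For example, a full import from a relational database might look like the sketch below. The JDBC URL, credentials, table name, and target directory are placeholders, so replace them with your own values:

# Import the emp table from MySQL into HDFS, storing the output as Parquet
$ sqoop import \
    --connect jdbc:mysql://localhost/employees \
    --username sqoop_user \
    -P \
    --table emp \
    --target-dir /user/hadoop/emp_parquet \
    --as-parquetfile \
    -m 1

Here -P prompts for the database password instead of putting it on the command line, and -m 1 runs a single mapper, which is enough for a small table. With --as-parquetfile, the part files written to the target directory are Parquet files instead of plain text.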

And just so you know, you can also import into other file formats, as listed below:

File Type       Switch
Avro Data       --as-avrodatafile
Sequence File   --as-sequencefile
Text File       --as-textfile
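
The switch is the only thing that changes. For instance, the same sketch as above stored as Avro data files (same placeholder values) would be:

# Same import, but the output is written as Avro data files
$ sqoop import \
    --connect jdbc:mysql://localhost/employees \
    --username sqoop_user \
    -P \
    --table emp \
    --target-dir /user/hadoop/emp_avro \
    --as-avrodatafile \
    -m 1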

answered May 15 by Nanda
