How to extract record from one RDD using another RDD

+1 vote

I have an input file whose data is as below:

1,Prashant,IT,Developer,27,8000
2,Saurav,Training,Trainer,38,18000
6,Vibhor,Training,Manager,40,20000
3,Vineet,Finance,Accountant,31,16000
5,Atul,IT,Manager,39,25000
7,Raman,Sales,Manager,43,23000
8,Rakesh,Finance,Manager,45,26000
4,Kartik,Sales,Salesman,29,11000

I need another RDD containing only the records that include the word "Manager".

Can anyone please help me on this?

Aug 22 in Apache Spark by anonymous
75 views

1 answer to this question.

+1 vote

Hey, you can use a "contains" filter to extract those lines. Something like this:

val newRDD = oldRDD.filter(line => line.contains("Manager"))
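A fuller sketch of the same idea, runnable in spark-shell (where the SparkContext `sc` is already provided); the file name "employees.csv" is a placeholder for your input path:

```scala
// Load the input file into an RDD of lines.
// "employees.csv" is a placeholder; substitute your actual path.
val oldRDD = sc.textFile("employees.csv")

// Keep only the lines that contain the word "Manager".
val newRDD = oldRDD.filter(line => line.contains("Manager"))

// Inspect the result on the driver (fine for small data).
newRDD.collect().foreach(println)
```

Note that this matches "Manager" anywhere in the line, so a record would also match if, say, an employee's name contained that string; to match only the designation column, split each line on "," and compare the fourth field instead.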
answered Aug 23 by Karan
