Filtering a row in Spark DataFrame based on matching values from a list

0 votes


I am filtering the Spark DataFrame using filter:

var notFollowingList = List(9.8, 7, 6, 3, 1)
df.filter(col("uid").isin(notFollowingList))

But I get an error saying:

Unsupported literal type class scala.collection.immutable.$colon$colon

Can anyone help me resolve this error?

Jun 6, 2018 in Apache Spark by code799
91,813 views

3 answers to this question.

+1 vote

Use the function as follows, expanding the list into varargs with :_* since isin expects individual values rather than a single List:

var notFollowingList=List(9.8,7,6,3,1)
df.filter(col("uid").isin(notFollowingList:_*))
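If you want to run this end to end, here is a minimal self-contained sketch; the sample data is made up for illustration, and only the uid column and the list come from the question:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("isinExample").master("local[*]").getOrCreate()
import spark.implicits._

// Hypothetical sample data; only rows with uid in the list survive the filter
val df = Seq(9.8, 7.0, 5.0, 3.0, 2.0).toDF("uid")

val notFollowingList = List(9.8, 7, 6, 3, 1)

// :_* expands the List into varargs, which is what isin expects
df.filter(col("uid").isin(notFollowingList: _*)).show()  // keeps 9.8, 7.0 and 3.0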
answered Jun 6, 2018 by Shubham
• 13,490 points
0 votes

You need isInCollection, which accepts a collection directly, so no varargs expansion is required.
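A minimal sketch, assuming Spark 2.4 or later, where Column.isInCollection was introduced:

import org.apache.spark.sql.functions.col

val notFollowingList = List(9.8, 7, 6, 3, 1)
// isInCollection takes the Iterable itself, no :_* required
df.filter(col("uid").isInCollection(notFollowingList))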

answered Mar 30, 2020 by anonymous
0 votes

How can I return only the rows of a Spark DataFrame where the values for a column are within a specified list? Here's my Python pandas way of doing this operation:

df_start = df[df['name'].isin(['App Opened', 'App Launched'])].copy()

I saw this SO Scala implementation and tried several permutations, but couldn't get it to work.
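For completeness, the Scala equivalent of that pandas filter uses the same :_* pattern as the accepted answer above; df and the name column are taken from the pandas snippet, so treat this as a sketch:

import org.apache.spark.sql.functions.col

// Keep only the rows whose name is in the list, mirroring pandas isin
val filtered = df.filter(col("name").isin(List("App Opened", "App Launched"): _*))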

answered Dec 14, 2020 by Gitika
• 65,910 points

Related Questions In Apache Spark

+1 vote
1 answer

getting null values in spark dataframe while reading data from hbase

Can you share the screenshots for the ...READ MORE

answered Jul 31, 2018 in Apache Spark by kurt_cobain
• 9,390 points
2,066 views
+1 vote
8 answers

How to replace null values in Spark DataFrame?

Hi, In Spark, fill() function of DataFrameNaFunctions class is used to replace ...READ MORE

answered Dec 15, 2020 in Apache Spark by MD
• 95,440 points
74,007 views
+1 vote
1 answer

How to read a data from text file in Spark?

Hey, You can try this: from pyspark import SparkContext SparkContext.stop(sc) sc ...READ MORE

answered Aug 6, 2019 in Apache Spark by Gitika
• 65,910 points
4,673 views
0 votes
3 answers

How to connect Spark to a remote Hive server?

JDBC is not required here. Create a hive ...READ MORE

answered Mar 8, 2019 in Big Data Hadoop by Vijay Dixon
• 190 points
12,074 views
0 votes
3 answers

How to transpose Spark DataFrame?

Please check the below mentioned links for ...READ MORE

answered Jan 1, 2019 in Apache Spark by anonymous
19,002 views
0 votes
1 answer

Different Spark Ecosystem

Spark has various components: Spark SQL (Shark)- for ...READ MORE

answered Jun 4, 2018 in Apache Spark by kurt_cobain
• 9,390 points
681 views
0 votes
2 answers

In a Spark DataFrame how can I flatten the struct?

// Collect data from input avro file ...READ MORE

answered Jul 4, 2019 in Apache Spark by Dhara dhruve
5,659 views
+1 vote
1 answer

How can I write a text file in HDFS not from an RDD, in Spark program?

Yes, you can go ahead and write ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,490 points
7,886 views