Is there any way to uncache an RDD?

0 votes
I used cache() to cache the data in memory, but I realized that to see the performance without cached data, I need to uncache it so the data is removed from memory:

rdd.cache()
// doing some computation
...
rdd.uncache()
but I got an error saying:

value uncache is not a member of org.apache.spark.rdd.RDD[(Int, Array[Float])]

I don't know how to uncache it, then!
May 30, 2018 in Apache Spark by kurt_cobain

1 answer to this question.

0 votes

An RDD can be uncached using unpersist().

So, use rdd.unpersist()
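
For reference, here is a minimal sketch of the full cache/uncache cycle (assuming a SparkContext named sc is already in scope; the sample data is made up for illustration):

val rdd = sc.parallelize(Seq((1, Array(1.0f, 2.0f)), (2, Array(3.0f))))

rdd.cache()       // marks the RDD for caching; blocks are materialized on the first action
rdd.count()       // first action populates the cache
rdd.count()       // served from the cached blocks

rdd.unpersist()   // removes the cached blocks from memory (and disk, if spilled)
rdd.count()       // recomputed from the lineage, so you can time the uncached run

If you want to be sure the blocks are actually gone before timing the next run, unpersist takes a blocking flag: rdd.unpersist(blocking = true) waits until removal completes.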

answered May 30, 2018 by nitinrawat895
