What allows Spark Streaming to provide fault tolerance for network sources of data?

0 votes
6) What allows Spark Streaming to provide fault tolerance for network sources of data?

a. Re-computation of RDD lineage from the cache

b. Replication of streamed data to multiple worker nodes

c. Re-computation of RDD lineage from persisted RDDs

d. RDD is not lineage from RDD
Nov 23, 2020 in Apache Spark by ritu
• 960 points
2,104 views

1 answer to this question.

0 votes

Hi @ritu,

Fault tolerance is the property that enables a system to continue operating properly when some of its components fail. For network sources such as socket or Kafka receivers, Spark Streaming achieves this by replicating the received data across multiple worker nodes before it is processed, so a copy is still available if the receiving node fails. Option B is the correct answer.
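As a minimal sketch (not part of the original answer), the Scala snippet below shows where that replication is configured for a receiver-based stream: the storage level passed to socketTextStream ends in "_2", which tells Spark to keep two copies of each received block on different worker nodes. The host and port are placeholder values.

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ReplicatedSocketStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ReplicatedSocketStream")
    val ssc = new StreamingContext(conf, Seconds(5))

    // The "_2" storage level replicates each received block to a second
    // worker node, which is what lets processing continue after a node failure.
    val lines = ssc.socketTextStream(
      "localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER_2)

    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Note that MEMORY_AND_DISK_SER_2 is already the default storage level for receiver-based input streams, so passing it explicitly only makes the replication visible in the code; the point is that received data is duplicated across workers before any processing happens.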

answered Dec 1, 2020 by MD
• 95,440 points

Related Questions In Apache Spark

0 votes
0 answers

What allows Spark to periodically persist data about an application such that it can recover from failures?

What allows spark to periodically persist data ...READ MORE

Nov 26, 2020 in Apache Spark by ritu
• 960 points

closed Nov 26, 2020 by MD 2,503 views
0 votes
1 answer

What are the levels of parallelism in Spark Streaming?

In order to reduce the processing ...READ MORE

answered Jul 27, 2018 in Apache Spark by zombie
• 3,790 points
4,470 views
0 votes
1 answer

4) Spark Streaming converts streaming data into DStreams. Which one of the given statements about DStreams is true?

Hi @ritu, Spark DStream (Discretized Stream) is the basic ...READ MORE

answered Nov 23, 2020 in Apache Spark by MD
• 95,440 points
2,363 views
0 votes
1 answer

When running Spark on YARN, do I need to install Spark on all nodes of the YARN cluster?

No, it is not necessary to install ...READ MORE

answered Jun 14, 2018 in Apache Spark by nitinrawat895
• 11,380 points
5,722 views
+1 vote
8 answers

How to print the contents of an RDD in Apache Spark?

Save it to a text file: line.saveAsTextFile("alicia.txt") Print contains ...READ MORE

answered Dec 10, 2018 in Apache Spark by Akshay
60,765 views
0 votes
1 answer

16) What allows Spark to periodically persist data about an application such that it can recover from failures?

Hi @Edureka, Checkpointing is a process of truncating RDD ...READ MORE

answered Nov 26, 2020 in Apache Spark by MD
• 95,440 points
1,653 views