Apache Spark questions
Hi@Prasant, If Spark Streaming does not support tuples, ...READ MORE
Option D) runtime error READ MORE
What allows spark to periodically persist data ...READ MORE
Option D: String class READ MORE
Hi@ritu, AWS has lots of services. For spark ...READ MORE
Hi@ritu, To start your python spark shell, you ...READ MORE
Option a) List(5,100,10) The take method returns the first n elements in an ...READ MORE
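The behaviour described above can be mimicked outside Spark with plain list slicing; this is a minimal sketch of the `take(n)` semantics (first n elements, in order), not the RDD API itself:

```python
# Sketch: RDD.take(n) returns the first n elements of the dataset,
# analogous to slicing a plain Python list.
data = [5, 100, 10, 25, 70]

def take(elements, n):
    """Return the first n elements, mirroring RDD.take(n) semantics."""
    return elements[:n]

print(take(data, 3))  # → [5, 100, 10]
```

So for a collection starting with 5, 100, 10, `take(3)` yields `List(5,100,10)`, matching option a).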
error: expected class or object definition sc.parallelize(Array(1L,("SFO")),(2L,("ORD")),(3L,("DFW")))) ^ one error ...READ MORE
Hi@ritu, You can create a data frame from ...READ MORE
Hi@ritu, Spark's internal scheduler may truncate the lineage of the RDD graph if ...READ MORE
14) The number of stages in a job ...READ MORE
What is the output of the following ...READ MORE
Hi@Edureka, Checkpointing is a process of truncating RDD ...READ MORE
A Dataframe can be created from an ...READ MORE
Hi@Edureka, Spark's internal scheduler may truncate the lineage of the RDD graph ...READ MORE
Option c) Run time error - A READ MORE
error: expected class or object definition sc.parallelize (Array(1L, ...READ MORE
Hey, @Ritu, I am getting error in your ...READ MORE
After executing your code, there is an ...READ MORE
Hi@ritu, The most appropriate step according to me ...READ MORE
Option c) Mapr Jobs that are submitted READ MORE
What does the below code print? val AgeDs ...READ MORE
Option d) Run time error. READ MORE
Hi @Ritu If you want to see the ...READ MORE
17) From the given choices, identify the value ...READ MORE
Hi@ritu, You need to learn the Architecture of ...READ MORE
Hey, @Ritu, According to the question, the answer ...READ MORE
Option d) Runtime error READ MORE
Hi, @Ritu, According to the official documentation of Spark 1.2, ...READ MORE
Hi@ritu, Fault tolerance is the property that enables ...READ MORE
Hi, @Ritu, option b for you, as Hash Partitioning ...READ MORE
Hi@ritu, Spark DStream (Discretized Stream) is the basic ...READ MORE
Hi@ritu, I think the problem can be solved ...READ MORE
Hi, @Ritu, List(5,100,10) is printed. The take method returns the first n elements in ...READ MORE
Hi, @Ritu, When creating a pair RDD from ...READ MORE
Hi@Ruben, I think you can add an escape ...READ MORE
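Assuming Ruben's problem is a delimiter character appearing inside a field (the usual reason to add an escape character), the effect can be sketched with Python's stdlib `csv` module; the column names and delimiter here are illustrative only:

```python
import csv
import io

# Sketch (assumption: a comma appears inside a field). Writing with an
# escape character keeps the embedded delimiter from splitting the field.
buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", escapechar="\\", quoting=csv.QUOTE_NONE)
writer.writerow(["a,b", "c"])  # the comma inside "a,b" gets escaped

print(buf.getvalue().strip())  # a\,b,c
```

A reader configured with the same `escapechar` recovers the original two fields intact.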
Hi@akhtar, When we try to retrieve the data ...READ MORE
Hi@akhtar, Since Avro library is external to Spark, ...READ MORE
Hi, I am able to understand your requirement. ...READ MORE
Hi@dani, You can find the euclidean distance using ...READ MORE
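The quantity itself is easy to compute without any Spark dependency; this is a minimal pure-Python sketch (the points are made up for illustration), using `math.dist` from Python 3.8+:

```python
import math

# Euclidean distance between two points (3-4-5 right triangle),
# computed with the stdlib; no Spark required.
p = (0.0, 0.0)
q = (3.0, 4.0)

d = math.dist(p, q)
print(d)  # → 5.0
```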
Hi@dani, As you said you are a beginner ...READ MORE
Hi@Shllpa, In general, we get the 401 status code ...READ MORE
Hi@khyati, You are getting this type of output ...READ MORE
Hi@akhtar, You can write the spark dataframe in ...READ MORE
Hi@Srinath, It seems you didn't set Hadoop for ...READ MORE
Hi@Manas, You can read your dataset from CSV ...READ MORE
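Outside Spark, the same idea — loading a CSV with a header row into records — can be sketched with the stdlib `csv` module (the data below is made up); in PySpark the analogous call is `spark.read.csv(path, header=True)`:

```python
import csv
import io

# Stdlib sketch: parse a CSV with a header row into dict records.
raw = "name,age\nalice,30\nbob,25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

print(rows[0]["name"], rows[1]["age"])  # alice 25
```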
Hi@Neha, You can find all the job status ...READ MORE
Hi@Ganendra, I am not sure what the issue is, ...READ MORE
Hi@Ganendra, As you said you launched a multinode cluster, ...READ MORE
package com.dataguise.test; import java.io.IOException; import java.util.concurrent.CountDownLatch; import java.util.concurrent.TimeUnit; import org.apache.spark.SparkContext; import org.apache.spark.SparkJobInfo; import ...READ MORE