To enable monitoring interrupted tasks, run the following ...READ MORE
spark-submit \ --class org.apache.spark.examples.SparkPi \ --deploy-mode client \ --master spark://$SPARK_MASTER_IP:$SPARK_MASTER_PORT ...READ MORE
Preparing for an interview? We have something ...READ MORE
The default interval time is 1800 seconds ...READ MORE
Hi, Traits are basically Scala's workaround for the ...READ MORE
Hi, No. An RDD is made up of ...READ MORE
option d, Runtime error READ MORE
Hi, You can use two level loops using the ...READ MORE
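The answer above is truncated, but two-level loops in Scala are commonly written as a single `for` comprehension with multiple generators; a minimal sketch (the ranges here are illustrative assumptions):

```scala
// Two generators in one for-comprehension behave like nested loops:
// the second generator iterates fully for each value of the first.
val pairs = for {
  i <- 1 to 2
  j <- 1 to 2
} yield (i, j)
// pairs: Vector((1,1), (1,2), (2,1), (2,2))
```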
You can use dynamic configuration setting to ...READ MORE
Hi@akhtar There are lots of online courses available ...READ MORE
You can dynamically set a password to ...READ MORE
Hey, Used with a loop, yield produces a value for ...READ MORE
Hi, Yield keyword can be used either before ...READ MORE
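Both truncated answers above describe `yield` with a loop; a minimal sketch of how it produces one value per iteration (the concrete range and expression are assumptions):

```scala
// yield collects one value per iteration into a new collection.
val doubled = for (i <- 1 to 3) yield i * 2
// doubled: Vector(2, 4, 6)
```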
Yes, it is possible to run Spark ...READ MORE
Hi, @Ritu, List(5,100,10) is printed. The take method returns the first n elements in ...READ MORE
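The snippet above is cut off, but the behavior of `take` it describes can be sketched as follows; the source list is an assumption chosen so the result matches the `List(5,100,10)` mentioned:

```scala
// take returns the first n elements of a collection.
val xs = List(5, 100, 10, 25)
val firstThree = xs.take(3)
// firstThree: List(5, 100, 10)
```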
Hi@akhtar, There is no concept of indexing in ...READ MORE
Hey, To format a string, use the .format ...READ MORE
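A minimal sketch of the `.format` call the truncated answer above refers to (the placeholder values are illustrative assumptions):

```scala
// .format substitutes values into printf-style placeholders.
val greeting = "Hello, %s! You have %d new messages.".format("Ritu", 3)
// greeting: "Hello, Ritu! You have 3 new messages."
```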
Hi, You can use this command to start ...READ MORE
Hi, Spark’s RDDs are by default recomputed each ...READ MORE
Hey, Scala allows the definition of a higher-order function. These ...READ MORE
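The truncated answer above mentions higher-order functions; a minimal sketch of one, assuming a hypothetical `applyTwice` as the example:

```scala
// A higher-order function takes another function as a parameter.
def applyTwice(f: Int => Int, x: Int): Int = f(f(x))

val result = applyTwice(_ + 3, 10)  // (10 + 3) + 3 = 16
```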
Hey, Recursion is when a function makes a ...READ MORE
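The answer above is cut off; a minimal sketch of recursion in Scala, using the standard factorial example as an assumed illustration:

```scala
// A recursive function calls itself until it reaches a base case.
def factorial(n: Int): Int =
  if (n <= 1) 1 else n * factorial(n - 1)
// factorial(5) == 120
```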
You can set the maximum number of ...READ MORE
To get command prompt for Scala open ...READ MORE
You have to use "===" instead of ...READ MORE
Hi, The transformations are the functions that are ...READ MORE
Hey, The range() method will give us integers ...READ MORE
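A minimal sketch of the range behavior described in the truncated answer above, using `List.range` (the bounds are illustrative assumptions):

```scala
// List.range gives integers from start (inclusive) to end (exclusive).
val nums = List.range(1, 5)
// nums: List(1, 2, 3, 4)
```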
Hey, A BitSet is a set of non-negative ...READ MORE
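The truncated answer above defines a `BitSet` as a set of non-negative integers; a minimal sketch with assumed example values:

```scala
import scala.collection.immutable.BitSet

// A BitSet stores non-negative integers compactly as bits.
// Element order and duplicates don't matter; it is a set.
val bits = BitSet(3, 1, 2)
val more = bits + 4
// more: BitSet(1, 2, 3, 4)
```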
Hi, We can declare Scala arrays in two ...READ MORE
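The truncated answer above mentions two ways to declare Scala arrays; a minimal sketch of both (the element values are assumptions):

```scala
// Way 1: allocate with a fixed size, then assign elements.
val a = new Array[Int](3)
a(0) = 10; a(1) = 20; a(2) = 30

// Way 2: initialize directly with values.
val b = Array(10, 20, 30)
```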
Hi, Scala uses immutability by default in most ...READ MORE
There is a property of Spark which ...READ MORE
Spark dashboard by default runs on port ...READ MORE
You can set the port in the ...READ MORE
println("Slayer") is an anonymous block and gets ...READ MORE
There's a heartbeat signal sent to the ...READ MORE
SparkContext.createTaskScheduler property parses the master parameter Local: 1 ...READ MORE
By default, 1000 batches are retained by ...READ MORE
For spark.read.textFile we need spark-2.x. Please try ...READ MORE
Hi, You can use for loop in scala using ...READ MORE
By default, the cleanup time is set ...READ MORE
1) Use the concat() function. Refer to the below ...READ MORE
You can run the Spark shell for ...READ MORE
GraphX is the Spark API for graphs and ...READ MORE
Hi, Spark ecosystem libraries are composed of various ...READ MORE
SQL Interpreter & Optimizer handles the functional ...READ MORE
Hey, Scala uses the method copy() to carry ...READ MORE
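The truncated answer above refers to the case-class `copy()` method; a minimal sketch with a hypothetical `Point` class:

```scala
// copy() clones a case class instance, optionally overriding fields.
case class Point(x: Int, y: Int)

val p = Point(1, 2)
val shifted = p.copy(y = 5)
// shifted: Point(1, 5); p is unchanged
```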
There is another property where you can set ...READ MORE
How many SparkContext objects should you ...READ MORE
Hi, Hive contains significant support for Apache Spark, ...READ MORE
Hey, App is a helper class that ...READ MORE
You can do this by running the ...READ MORE