questions/apache-spark
Hi, SparkSQL is a special component on the ...READ MORE
Hi, Spark provides a pipe() method on RDDs. ...READ MORE
Hi, persist() allows the user to specify ...READ MORE
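The snippet above refers to persist(); a minimal sketch of how it might be used, assuming a SparkContext named sc is already available (as it is inside spark-shell). StorageLevel.MEMORY_AND_DISK is one of the standard storage levels:

```scala
import org.apache.spark.storage.StorageLevel

// Assumes `sc` is an existing SparkContext, e.g. inside spark-shell.
val rdd = sc.parallelize(1 to 100)

// persist() with an explicit storage level keeps the data across actions;
// cache() is shorthand for persist(StorageLevel.MEMORY_ONLY).
rdd.persist(StorageLevel.MEMORY_AND_DISK)

// The first action computes and materializes the RDD; later actions reuse it.
println(rdd.count())
```

This sketch needs a running Spark environment and is not self-contained outside of one.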
Hi, The transformations are the functions that are ...READ MORE
Can anyone suggest when we create an ...READ MORE
SparkContext sets up internal services and establishes ...READ MORE
Hey, It takes a function that operates on two ...READ MORE
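The answer above describes reduce, which takes a function operating on two elements of a type and returns a single element of the same type. A minimal sketch in plain Scala (RDD.reduce applies the same contract to distributed data; this uses an in-memory List rather than a real RDD):

```scala
object ReduceExample {
  def main(args: Array[String]): Unit = {
    val nums = List(1, 2, 3, 4, 5)
    // reduce applies a binary, associative function pairwise across the collection
    val sum = nums.reduce((a, b) => a + b)
    println(sum)
  }
}
```

Because Spark may apply the function in any order across partitions, the function should be associative and commutative.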
Hi, Spark’s RDDs are by default recomputed each ...READ MORE
Hey, Parquet is a columnar format file supported ...READ MORE
Function definition:
def test(): Unit = {
  var a = 10
  var b = 20
  var c = a + b
}
Calling ...READ MORE
Hi, You can use a simple mathematical calculation ...READ MORE
Hi, Spark ecosystem libraries are composed of various ...READ MORE
Hi, Spark provides a high-level API in Java, ...READ MORE
Hi, RDD in Spark stands for Resilient Distributed ...READ MORE
The statement display(id, name, salary) is written before the display function ...READ MORE
Please refer to the below code as ...READ MORE
You can try this: object printarray { ...READ MORE
For spark.read.textFile we need Spark 2.x. Please try ...READ MORE
Please have a look below for your ...READ MORE
Please check https://kb.databricks.com/streaming/file-sink-str ...READ MORE
To get command prompt for Scala open ...READ MORE
There seems to be a problem with ...READ MORE
Variable declaration can be done in two ways ...READ MORE
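The two declaration forms referred to above are val (immutable) and var (mutable); a minimal sketch:

```scala
object VarDecl {
  def main(args: Array[String]): Unit = {
    val x = 10 // immutable: reassigning x would be a compile error
    var y = 20 // mutable: may be reassigned
    y = 25
    println(x + y)
  }
}
```

Prefer val unless the variable genuinely needs to change; immutability makes code easier to reason about.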
peopleDF: org.apache.spark.sql.DataFrame = [_corrupt_record: string] The above that ...READ MORE
Please try the following Scala code:
import org.apache.hadoop.conf.Configuration
import ...READ MORE
Run the commands below:
spark-class org.apache.spark.deploy.master.Master
spark-class org.apache.spark.deploy.worker.Worker spark://192.168.254.1:7077
NOTE: The ...READ MORE
If you need a single output file ...READ MORE
Hi all, I am running a Scala program on ...READ MORE
Hey there! You can use the select method of the ...READ MORE
You can run the Spark shell for ...READ MORE
Try this and see if this does ...READ MORE
By default, the maximum number of times ...READ MORE
You can make use of Special Library path to ...READ MORE
You can set extra JVM options that ...READ MORE
This is because the maximum number of ...READ MORE
First, upload this archive to HDFS and ...READ MORE
To change the default queue to which ...READ MORE
Open Spark shell and run the following ...READ MORE
In case Yarn does not support schemes ...READ MORE
You have to specify a comma-separated list ...READ MORE
I don't think you can copy and ...READ MORE
If you are running history server and ...READ MORE
By default, Spark jar, app jar, and ...READ MORE
Hi @Raunak. You can change the replication ...READ MORE
The default time that the Yarn application waits ...READ MORE
By default, only one core is used for ...READ MORE
You can increase the memory dynamically by ...READ MORE
By default, the cleanup time is set ...READ MORE
The default interval time is 1800 seconds ...READ MORE
To enable cleanup, open the spark shell ...READ MORE