questions/apache-spark
Hi, If we omit an argument in a ...READ MORE
Hey, A Scala function involves recursion when it ...READ MORE
Hi, To create a Scala function, we use ...READ MORE
Hi, A stream is a lazy list as ...READ MORE
Hey, A BitSet is a set of non-negative ...READ MORE
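The truncated answer above describes Scala's BitSet. A minimal sketch of how it behaves (class and method names are from the standard library; the specific values are illustrative):

```scala
import scala.collection.immutable.BitSet

object BitSetDemo extends App {
  // A BitSet stores non-negative integers compactly, one bit per possible element
  val bits = BitSet(0, 2, 3)
  val more = bits + 64          // adding an element returns a new immutable BitSet
  println(more.contains(64))    // true
  println(bits.contains(1))     // false: 1 was never added
}
```

Because the representation is a bitmap, membership tests and set operations (union, intersection) are very cheap for dense sets of small integers.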
Hi, When a function experiences an exception, it ...READ MORE
You can do this using globbing. See ...READ MORE
df = spark.createDataFrame([("A", 2000), ("A", 2002), ("A", ...READ MORE
If you are trying to load file ...READ MORE
Check the reference code mentioned below: def main(args: ...READ MORE
Seems like you have not started the ...READ MORE
Hey, You can use this command to start ...READ MORE
Hey, With varargs, we can pass a variable ...READ MORE
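The answer above mentions varargs. A small self-contained sketch of the `Type*` syntax (the function name `sum` is illustrative):

```scala
object VarargsDemo extends App {
  // Int* lets the caller pass any number of Int arguments;
  // inside the function, xs is a Seq[Int]
  def sum(xs: Int*): Int = xs.sum

  println(sum(1, 2, 3))        // 6
  println(sum())               // 0: zero arguments are allowed
  println(sum(Seq(4, 5): _*))  // 9: expand an existing sequence with `: _*`
}
```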
Hey, The App trait is a helper that ...READ MORE
Hey, In this language, val is a value and var is ...READ MORE
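The val/var distinction mentioned above can be shown in a few lines:

```scala
object ValVarDemo extends App {
  val x = 10   // val is immutable: `x = 11` would be a compile error
  var y = 10   // var is mutable and can be reassigned
  y += 5
  println(x + y) // 25
}
```

Idiomatic Scala prefers `val` everywhere possible and reaches for `var` only when mutation is genuinely needed.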
Hey, ofDim() is a method in Scala that ...READ MORE
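A minimal sketch of `Array.ofDim`, the method named in the answer above (the 2x3 shape is illustrative):

```scala
object OfDimDemo extends App {
  // Array.ofDim allocates a multi-dimensional array filled with default values (0 for Int)
  val grid = Array.ofDim[Int](2, 3) // 2 rows, 3 columns
  grid(1)(2) = 7                    // indexed row-first, then column
  println(grid.map(_.mkString(" ")).mkString("\n"))
  // 0 0 0
  // 0 0 7
}
```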
Hi, Scala uses immutability by default in most ...READ MORE
Hi, Scala library has purely functional data structures ...READ MORE
Hey, Unit is a subtype of scala.AnyVal and ...READ MORE
Try this: val df = sc.textFile("hdfs://nameservice1/user/edureka_168049/Structure_IT/samplefile.txt") df.collect() val df = ...READ MORE
Hi, This happens in Scala whenever you don't ...READ MORE
The missing driver is the JDBC one ...READ MORE
Refer to the following code: val sqlContext = ...READ MORE
Refer to the below code: import org.apache.hadoop.conf.Configuration import org.apache.hadoop.fs.FileSystem import ...READ MORE
You can access task information using TaskContext: import org.apache.spark.TaskContext sc.parallelize(Seq[Int](), ...READ MORE
Hey, To install SBT on Ubuntu first you need ...READ MORE
You have to use sqoop to export data ...READ MORE
Start spark shell using below line of ...READ MORE
There's an easier way to achieve your ...READ MORE
1) Use the concat() function. Refer to the below ...READ MORE
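The answer above points at `concat()` from Spark SQL's functions object. A hedged sketch, assuming a local SparkSession and illustrative column names `first`/`last`:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{concat, lit}

object ConcatDemo extends App {
  val spark = SparkSession.builder
    .master("local[*]")
    .appName("concat-demo")
    .getOrCreate()
  import spark.implicits._

  val df = Seq(("John", "Doe")).toDF("first", "last")

  // concat joins string columns; lit(" ") inserts a literal separator
  df.select(concat($"first", lit(" "), $"last").as("full")).show()

  spark.stop()
}
```

Note that `concat` returns null if any input column is null; `concat_ws` is the usual alternative when a separator and null-tolerance are wanted.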
Hey, You can see this following code to ...READ MORE
Hey, There are few methods provided by the ...READ MORE
Hey, You can try this code to get ...READ MORE
Hey, You need to follow some steps to complete ...READ MORE
Hey, I guess the only problem with the ...READ MORE
In the above statement, x(2) is specifying an array ...READ MORE
All prefix operators' symbols are predefined: +, -, ...READ MORE
This should work: def readExcel(file: String): DataFrame = ...READ MORE
Hi, You can compute the average using this ...READ MORE
Hey, In Apache Spark, the data storage model is ...READ MORE
Hey, You can use the subtractByKey() function to ...READ MORE
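A minimal sketch of `subtractByKey` as run in the spark-shell, assuming an active SparkContext `sc` (the pair values are illustrative):

```scala
// subtractByKey keeps pairs from rdd1 whose keys do NOT appear in rdd2;
// the values in rdd2 are ignored, only its keys matter
val rdd1 = sc.parallelize(Seq(("a", 1), ("b", 2), ("c", 3)))
val rdd2 = sc.parallelize(Seq(("b", 99)))

rdd1.subtractByKey(rdd2).collect()
// Array((a,1), (c,3)) -- element order may vary across partitions
```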
spark.read.csv is used when loading into a ...READ MORE
The reason you are able to load ...READ MORE
Yes, we can work with Avro files ...READ MORE
You can do it using a code ...READ MORE
I used Spark 1.5.2 with Hadoop 2.6 ...READ MORE
If you just want to get your ...READ MORE
Did you find any documents or example ...READ MORE
The HDFS path for MyLab is /user/edureka_id. ...READ MORE
Hello, From the error I get that the ...READ MORE