questions/apache-spark
Hi@ritu, The most appropriate step according to me ...READ MORE
Option c) MapR Jobs that are submitted READ MORE
Option d) Run time error. READ MORE
Hi @Ritu If you want to see the ...READ MORE
Hi@ritu, You need to learn the Architecture of ...READ MORE
Hey, @Ritu, According to the question, the answer ...READ MORE
option d, Runtime error READ MORE
Hi, @Ritu, According to the official documentation of Spark 1.2, ...READ MORE
Hi@ritu, Fault tolerance is the property that enables ...READ MORE
Hi@ritu, Spark DStream (Discretized Stream) is the basic ...READ MORE
Hi@ritu, I think the problem can be solved ...READ MORE
Hi, @Ritu, List(5,100,10) is printed. The take method returns the first n elements in ...READ MORE
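The take(n) behaviour described in that answer can be sketched with a plain-Python analog (Spark's RDD.take(n) returns the first n elements the same way; the helper below is only an illustration, not Spark's implementation):

```python
def take(elements, n):
    # Analog of Spark's RDD.take(n): return the first n elements.
    return elements[:n]

rdd_data = [5, 100, 10, 200, 3]
print(take(rdd_data, 3))  # → [5, 100, 10]
```

With a real SparkContext the equivalent call would be sc.parallelize(rdd_data).take(3).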
Hi, @Ritu, When creating a pair RDD from ...READ MORE
Hi@Ruben, I think you can add an escape ...READ MORE
Hi@akhtar, When we try to retrieve the data ...READ MORE
Hi@akhtar, Since Avro library is external to Spark, ...READ MORE
Hi, I am able to understand your requirement. ...READ MORE
Hi@dani, You can find the euclidean distance using ...READ MORE
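As a quick illustration of the Euclidean distance mentioned above, here is a minimal pure-Python sketch (inside Spark you would normally wrap something like this in a UDF, or use the linear-algebra utilities in pyspark.ml):

```python
import math

def euclidean(a, b):
    # Square root of the sum of squared component-wise differences.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(euclidean((0, 0), (3, 4)))  # → 5.0
```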
Hi@dani, As you said you are a beginner ...READ MORE
Hi@Shllpa, In general, we get the 401 status code ...READ MORE
Hi@khyati, You are getting this type of output ...READ MORE
Hi@akhtar, You can write the spark dataframe in ...READ MORE
Hi@Srinath, It seems you didn't set Hadoop for ...READ MORE
Hi@Manas, You can read your dataset from CSV ...READ MORE
Hi@Neha, You can find all the job status ...READ MORE
Hi@Ganendra, I am not sure what's the issue, ...READ MORE
Hi@Ganendra, As you said you launched a multinode cluster, ...READ MORE
Hi, You can follow the below-given steps to ...READ MORE
Hi@Rishi, Yes, number of spark tasks can be ...READ MORE
Hi@Rishi, Yes, it is possible. If executor no. ...READ MORE
Hi@abdul, Hadoop 3.0.1 has lots of new features. ...READ MORE
Hi, @Amey, You can go through this regarding ...READ MORE
Hey, @KK, You can fix this issue may be ...READ MORE
Hi@akhtar There are lots of online courses available ...READ MORE
Hi@Deepak, In your test class you passed empid ...READ MORE
Hi@Amey, You can enable WebHDFS to do this ...READ MORE
Hi@Amey, It depends on your use case. Both ...READ MORE
Hi@akhtar, I think your HDFS cluster is not ...READ MORE
Hi@akhtar, In your error, it shows that you ...READ MORE
Hi@akhtar, To convert pyspark dataframe into pandas dataframe, ...READ MORE
Hi@akhtar, To import this module in your program, ...READ MORE
Hi@akhtar, By default pyspark is not present in ...READ MORE
Hi@akhtar, You may resolve this exception, by increasing the ...READ MORE
This type of error tends to occur ...READ MORE
Hi@akhtar, In /etc/spark/conf/spark-defaults.conf, append the path of your custom ...READ MORE
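For context, appending a custom jar's path in /etc/spark/conf/spark-defaults.conf usually looks like the fragment below (the jar path is a made-up example; adjust it to your own jar):

```
spark.driver.extraClassPath    /path/to/custom.jar
spark.executor.extraClassPath  /path/to/custom.jar
```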
Hi@akhtar, I think you got this error due to version mismatch ...READ MORE
Hi, Use this below given code, it will ...READ MORE
Hi@akhtar, This error occurs because your python version ...READ MORE
from pyspark.sql.types import FloatType
fname = [1.0, 2.4, 3.6, 4.2, 45.4]
df = spark.createDataFrame(fname, ...READ MORE
Hi@akhtar, To create multiple producers you have to ...READ MORE