I am not able to read a file from an HDFS location using Spark. Running this line:

val ordersRDD = sc.textFile("/user/reshma/sqoop_import/retail_db/orders ")

throws the following error:

org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/user/reshma/sqoop_import/retail_db/orders
You have to specify the HDFS path explicitly, but you are leaving it out. Notice that the error says file:/..., which means Spark is resolving the path against the local filesystem instead of HDFS. For your case, change the command to the following:

val ordersRDD = sc.textFile("hdfs:///user/reshma/sqoop_import/retail_db/orders")
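To see why the error mentions file:/ rather than hdfs:/, note that a path with no scheme is resolved against the cluster's default filesystem (fs.defaultFS); if Spark is not picking up your Hadoop configuration, that default falls back to the local filesystem. A minimal sketch in plain Scala, using java.net.URI to illustrate the scheme difference (the paths are just the ones from this question):

```scala
import java.net.URI

// A bare path carries no scheme, so Hadoop resolves it against
// fs.defaultFS; without HDFS configuration that means the local
// filesystem, which is why the error shows "file:/user/...".
val bare = new URI("/user/reshma/sqoop_import/retail_db/orders")
println(bare.getScheme)   // null: resolution is left to fs.defaultFS

// Prefixing with hdfs:// pins the path to HDFS explicitly,
// regardless of what the default filesystem is set to.
val onHdfs = new URI("hdfs:///user/reshma/sqoop_import/retail_db/orders")
println(onHdfs.getScheme) // "hdfs"
```

So either pass the fully qualified hdfs:/// URI, or make sure HADOOP_CONF_DIR points at a configuration where fs.defaultFS is your HDFS namenode.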
Please make sure you connect with spark2-shell instead of spark-shell. Once you are in the Spark shell, the following code should work:
val ordersRDD = sc.textFile("/user/reshma/sqoop_import/retail_db/orders.csv")