No, it is not mandatory. Spark has no storage layer of its own, so it reads data from whatever storage you point it at, including the local file system. You can load data from the local file system and process it; Hadoop or HDFS is not required to run a Spark application. However, if your program needs to read from or write to HDFS, then starting the HDFS (Hadoop) daemons is necessary.
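For example, here is a minimal PySpark sketch (paths and host names are hypothetical) showing a job that runs entirely in local mode against the local file system, with the HDFS variant shown only as a commented-out alternative that would require the HDFS daemons to be up:

```python
from pyspark.sql import SparkSession

# Run Spark in local mode; no Hadoop/HDFS services need to be running.
spark = (SparkSession.builder
         .appName("LocalFileExample")
         .master("local[*]")
         .getOrCreate())

# Read a file from the local file system (hypothetical path).
local_df = spark.read.text("file:///tmp/input.txt")
print("Local line count:", local_df.count())

# Only if the job needs HDFS would you switch to an hdfs:// URI,
# which requires the NameNode/DataNode daemons to be running.
# hdfs_df = spark.read.text("hdfs://namenode:8020/data/input.txt")

spark.stop()
```

The only difference between the two cases is the URI scheme (file:// vs hdfs://); the Spark code itself stays the same.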