Whenever a node goes down, Spark knows how to recompute the lost data because it keeps track of the lineage of every dataset: the chain of transformations that led to it, recorded as a DAG. Using that lineage, Spark can re-apply the same transformations to rebuild just the lost partitions of the node that went down, rather than recomputing everything from scratch.
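To make this concrete, here is a minimal sketch in Scala showing how a chain of transformations builds up a lineage, and how you can inspect that lineage with `RDD.toDebugString`. The app name, master URL, and the specific transformations are just illustrative choices, not anything special to fault recovery:

```scala
import org.apache.spark.sql.SparkSession

object LineageDemo {
  def main(args: Array[String]): Unit = {
    // Local mode is assumed here purely for illustration.
    val spark = SparkSession.builder()
      .appName("LineageDemo")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Each transformation below is recorded in the RDD's lineage (the DAG);
    // no data is computed yet, only the recipe for producing it.
    val numbers = sc.parallelize(1 to 1000, numSlices = 4)
    val squares = numbers.map(n => n * n)
    val evens   = squares.filter(_ % 2 == 0)

    // toDebugString prints the recorded lineage. If a node holding some of
    // these partitions is lost, Spark replays the same transformations to
    // rebuild only the missing partitions.
    println(evens.toDebugString)

    // An action triggers actual execution of the DAG.
    println(s"count = ${evens.count()}")

    spark.stop()
  }
}
```

Running this prints the lineage (parallelize -> map -> filter) before the count, which is exactly the information Spark falls back on when a partition has to be reconstructed.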
Hope this will help!