Spark Tutorial: Different Components of Spark


Spark applications run as independent sets of processes on a cluster. The driver program and its SparkContext take care of job execution within the cluster.
A job is split into multiple tasks that are distributed over the worker nodes. When an RDD is created through the SparkContext, its partitions can be distributed across various nodes. Worker nodes are the machines that execute the different tasks.
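To make these roles concrete, here is a minimal sketch in Scala. The application name, the local master URL, and the example computation are illustrative choices, not part of the tutorial; in a real cluster the master would point at your cluster manager instead of `local[*]`.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkComponentsSketch {
  def main(args: Array[String]): Unit = {
    // The driver program creates the SparkContext, which coordinates
    // job execution on the cluster (here: a local master, for illustration).
    val conf = new SparkConf()
      .setAppName("components-sketch") // illustrative name
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // An RDD created through the SparkContext; its partitions can be
    // distributed across the worker nodes of the cluster.
    val rdd = sc.parallelize(1 to 1000, numSlices = 4)

    // An action triggers a job; the job is split into tasks (one per
    // partition) that run on the worker nodes.
    val total = rdd.map(_ * 2).reduce(_ + _)
    println(s"Sum of doubled values: $total")

    sc.stop()
  }
}
```

In this sketch the driver is the `main` method itself, the SparkContext is the handle it uses to talk to the cluster, and the four partitions of the RDD are what get turned into four tasks when the `reduce` action runs.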

The following illustration depicts the different components of Spark.
[Illustration: the different components of Spark]