Logical Planning and Physical Planning in Spark


Top Interview Questions on Spark: Understanding Logical Planning and Physical Planning

Updated: January 2025 | By Developer Indian Team

#spark #optimization #sparkrdd #sparkdataframe #sql

The logical plan is an abstraction of all the transformation steps that need to be executed when a Spark job is submitted. In other words, it is the abstract, high-level representation of the job's computation.
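For example, transformations only add steps to this abstract plan; nothing is computed until an action forces execution. A minimal sketch, assuming a running SparkSession named spark:

import spark.implicits._  // enables the $"column" syntax

val doubled = spark.range(1000).select(($"id" * 2).as("doubled"))  // builds the plan, runs nothing
doubled.count()  // action: triggers planning and execution of the plan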


Spark Logical Plan

Taking user code and converting it into a logical plan is the first phase of execution. The purpose of the logical plan is to turn the user's set of expressions into the most optimized version, and Spark does this in stages. It first converts the user code into an unresolved logical plan. The plan is unresolved because, although the code may be valid, the tables or columns it refers to may or may not exist. Spark then uses the catalog, a repository of all table and DataFrame information, to resolve columns and tables in the analyzer. The analyzer may reject the unresolved logical plan if a required table or column name does not exist in the catalog. If the analyzer is able to resolve it, the result is passed through the Catalyst Optimizer. Packages can extend Catalyst with their own rules for domain-specific optimizations.
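A quick way to see these stages is to inspect a Dataset's QueryExecution. A minimal sketch, assuming a running SparkSession named spark and a table called sales registered in the catalog:

val q = spark.sql("SELECT amount FROM sales WHERE amount > 1000")

println(q.queryExecution.logical)        // parsed (unresolved) logical plan
println(q.queryExecution.analyzed)       // resolved by the analyzer against the catalog
println(q.queryExecution.optimizedPlan)  // after the Catalyst Optimizer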

Physical Planning

After successfully creating an optimized logical plan, Spark begins the physical planning process. The physical plan, often called the Spark plan, specifies how the logical plan will execute on the cluster: Spark generates different physical execution strategies and compares them through a cost model. An example of this cost comparison is choosing how to perform a given join by looking at the physical attributes of the tables involved. The result of physical planning is a series of RDDs and transformations. This is why Spark is sometimes described as a compiler: it takes queries expressed in DataFrames, Datasets, and SQL and compiles them into RDD transformations.
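The chosen physical plan can be inspected the same way as the logical stages. A minimal sketch, again assuming a SparkSession named spark and a sales table in the catalog:

val q = spark.sql("SELECT amount FROM sales WHERE amount > 1000")

println(q.queryExecution.sparkPlan)     // physical plan chosen from the candidate strategies
println(q.queryExecution.executedPlan)  // final plan, ready to run as RDD transformations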

What is Spark Logical Planning?

Logical Planning is the first phase of Spark execution. It involves converting user code into an optimized logical plan. Here’s how it works:

  1. Unresolved Logical Plan: Spark starts by converting user code into an unresolved logical plan.
  2. Analyzer: Spark uses the catalog to resolve tables and columns; unresolvable references are rejected (see the sketch after this list).
  3. Catalyst Optimizer: The logical plan is optimized using rule-based and cost-based techniques.
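
To illustrate step 2, here is a minimal sketch of the analyzer rejecting a plan. It assumes a sales table exists, while the column name is deliberately hypothetical:

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.functions.col

try {
  spark.read.table("sales").select(col("no_such_column")).collect()
} catch {
  case e: AnalysisException => println(s"Analyzer rejected the plan: ${e.getMessage}")
}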

Example of Logical Planning:


import spark.implicits._                  // enables the $"column" syntax

val df = spark.read.table("sales")        // unresolved until "sales" is found in the catalog
val result = df.filter($"amount" > 1000)  // adds a Filter step to the logical plan

What is Spark Physical Planning?

After creating an optimized logical plan, Spark moves to Physical Planning. This phase determines how the logical plan will be executed on the cluster.

Example of Physical Planning:

For a join operation, Spark might evaluate strategies like broadcast join or sort-merge join based on the size of the datasets.
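For instance, when one side of a join is small, the planner prefers a broadcast hash join; otherwise it typically falls back to a sort-merge join. A minimal sketch, using hypothetical orders and regions tables, that nudges the planner with a broadcast hint:

import org.apache.spark.sql.functions.broadcast

val orders  = spark.read.table("orders")   // hypothetical large fact table
val regions = spark.read.table("regions")  // hypothetical small dimension table

// Without the hint, the cost model picks a strategy from table sizes and statistics;
// broadcast() nudges the planner toward a broadcast hash join.
val joined = orders.join(broadcast(regions), Seq("region_id"))
joined.explain()  // shows BroadcastHashJoin in the physical plan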

Key Differences Between Logical and Physical Plans

Aspect | Logical Plan | Physical Plan
--- | --- | ---
Purpose | Optimizes user code into a logical plan. | Determines how to execute the logical plan on the cluster.
Output | Unresolved and resolved logical plans. | RDDs and transformations.
Optimization | Uses the Catalyst Optimizer. | Uses a cost model to choose the best execution strategy.

Top Interview Questions on Spark Logical and Physical Planning

  1. What is Spark Logical Planning?
  2. What is Spark Physical Planning?
  3. What is the role of the Catalyst Optimizer?
  4. How does Spark resolve tables and columns during logical planning?
  5. What are RDDs, and how are they related to physical planning?
  6. What is the difference between logical and physical plans in Spark?

Conclusion

Understanding Spark Logical Planning and Physical Planning is essential for mastering Apache Spark. Logical planning optimizes user code, while physical planning determines how the code is executed on the cluster. By mastering these concepts, you’ll be well-prepared for Spark interviews and real-world big data challenges.

Ready to learn more? Check out our complete guide to Apache Spark or explore advanced topics like Catalyst Optimizer.
