Tuning the Spark application parameters

Spark transformations are lazily evaluated. They are not executed until they are needed, which is triggered by actions. Some examples of these actions are count, collect, take, and save.
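
The following is a minimal sketch of lazy evaluation in Spark, assuming a local Spark session; the application name and local master below are illustrative, not part of any PDI setup:

```scala
import org.apache.spark.sql.SparkSession

object LazyEvalSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("lazy-eval-sketch") // hypothetical name, for illustration only
      .master("local[*]")          // local mode, just for this sketch
      .getOrCreate()

    // Transformations build a lineage but do not run anything yet.
    val numbers = spark.sparkContext.parallelize(1 to 100)
    val evens   = numbers.filter(_ % 2 == 0) // lazy: no job submitted
    val doubled = evens.map(_ * 2)           // still lazy

    // Actions trigger execution of the whole lineage.
    println(doubled.count())                 // action: submits a job
    println(doubled.take(5).mkString(", "))  // action: submits another job

    spark.stop()
  }
}
```

Until `count` or `take` is called, no work is performed on the cluster; each action then executes the chain of transformations leading to it.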

Spark views PDI transformations as applications. When you execute a PDI transformation on the Spark engine, it appears in Spark as an application running on your cluster.
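
For illustration, the sketch below shows how a Spark application's tuning parameters can be set when a session is created. The application name and values are hypothetical; when you run a PDI transformation, such parameters are typically supplied through the run configuration rather than in application code:

```scala
import org.apache.spark.sql.SparkSession

// Illustrative only: in PDI these values normally come from the
// run configuration, not from code you write yourself.
val spark = SparkSession.builder()
  .appName("my-pdi-transformation")             // hypothetical name shown in the cluster's application list
  .config("spark.executor.memory", "4g")        // example executor memory sizing
  .config("spark.executor.cores", "2")          // example cores per executor
  .config("spark.dynamicAllocation.enabled", "true") // let Spark scale executors
  .getOrCreate()
```

The application name set here is what identifies the transformation in the cluster's application list, which is why a recognizable name makes monitoring easier.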
