Job (job entry)
The Job job entry executes another job. This supports functional decomposition, so you can split a large job into smaller, more manageable jobs.
Avoid creating recursive jobs (a job that points to itself). A recursive job can eventually fail with an out-of-memory or stack error.
General
Entry name
The unique name of the job entry on the canvas. You can place the same job entry on the canvas multiple times.
Job
Specify the job to execute by entering its path or clicking Browse.
If you select a job that shares the same root path as the current job, PDI inserts ${Internal.Entry.Current.Directory} in place of the common root path. For example, if the current job path is /home/admin/transformation.kjb and you select a job in /home/admin/path/sub.kjb, then the path is converted to ${Internal.Entry.Current.Directory}/path/sub.kjb.
If you are working with a repository, specify the job name. If you are not working with a repository, specify the job XML file name.
Note: Jobs previously specified by reference are automatically converted to use the job name within the Pentaho Repository.
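The path-shortening rule described above can be sketched as follows. This is an illustrative sketch only, not PDI source code; the function name is hypothetical.

```python
import os

# Hypothetical sketch of how a selected job path that shares the current
# job's directory is shortened to use ${Internal.Entry.Current.Directory}.
def shorten_job_path(current_job_path, selected_job_path):
    current_dir = os.path.dirname(current_job_path)
    if selected_job_path.startswith(current_dir + "/"):
        relative = selected_job_path[len(current_dir) + 1:]
        return "${Internal.Entry.Current.Directory}/" + relative
    return selected_job_path

print(shorten_job_path("/home/admin/transformation.kjb",
                       "/home/admin/path/sub.kjb"))
# -> ${Internal.Entry.Current.Directory}/path/sub.kjb
```

At run time, PDI resolves ${Internal.Entry.Current.Directory} back to the directory of the executing job, so the reference keeps working when the whole directory tree is moved.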
Options
The Job job entry includes several tabs: Options, Logging, Arguments, and Parameters.
Options tab

Run Configuration
Select the run configuration to use. For more information, see Run configurations.
Execute every input row
Run the job once for each input row (loop).
Pass the sub jobs and transformations to the server
If you select a server-based Run Configuration, pass the complete job (including referenced sub-jobs and sub-transformations) to the remote server.
Enable monitoring for sub jobs and transformations
If you select a server-based Run Configuration, monitor child jobs and transformations while the job runs.
Wait for remote job to finish
If you select a server-based Run Configuration, wait until the job finishes running on the server before continuing.
Follow local abort to remote job
If you select a server-based Run Configuration, aborting the local job also sends an abort signal to the job running on the remote server.
Logging tab
By default, if you do not configure logging, PDI writes logs to the parent (calling) job log.

Specify logfile
Write logs for this job to a separate log file.
Name
The directory and base name of the log file, for example, C:\logs.
Extension
The file name extension, for example, .log or .txt.
Log level
The logging level to use while running the job. For more information, see Logging levels.
Append logfile?
Append to the log file instead of creating a new one.
Create parent folder
Create the parent folder for the log file if it does not already exist.
Include date in logfile
Add the system date to the file name in YYYYMMDD format, for example _20051231.
Include time in logfile
Add the system time to the file name in HHMMSS format, for example _235959.
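Taken together, the Name, Extension, and date/time options determine the final log file name. The following sketch shows one way the assembly could work (illustrative only; the function and its parameters are not PDI API):

```python
from datetime import datetime

# Hypothetical sketch: build a log file name from the Name and Extension
# fields, optionally appending the date (YYYYMMDD) and time (HHMMSS).
def build_logfile_name(name, extension, include_date=False,
                       include_time=False, now=None):
    now = now or datetime.now()
    parts = [name]
    if include_date:
        parts.append("_" + now.strftime("%Y%m%d"))
    if include_time:
        parts.append("_" + now.strftime("%H%M%S"))
    return "".join(parts) + "." + extension.lstrip(".")

stamp = datetime(2005, 12, 31, 23, 59, 59)
print(build_logfile_name("C:/logs", "log", True, True, stamp))
# -> C:/logs_20051231_235959.log
```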
Arguments tab

Use these options to pass arguments to the job:
Copy results to arguments
Copy results from a previous transformation as job arguments by using the Copy rows to result step. If Execute every input row is selected, each row becomes a set of command-line arguments. Otherwise, only the first row is used.
Argument
The command-line arguments to pass to the job.
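The interaction between result rows and the Execute every input row option can be sketched as follows. This is a simplified illustration of the looping behavior, not PDI code:

```python
# Sketch: with "Execute every input row" enabled, the job runs once per
# result row, each row supplying that run's arguments; otherwise only the
# first row is used for a single run. (Illustrative only.)
def runs_for(result_rows, execute_every_input_row):
    if execute_every_input_row:
        return [tuple(row) for row in result_rows]   # one run per row
    return [tuple(result_rows[0])] if result_rows else []

rows = [["2024-01-01", "eu"], ["2024-01-02", "us"]]
print(runs_for(rows, True))   # two runs, one per row
print(runs_for(rows, False))  # one run using only the first row
```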
Parameters tab

Use these options to pass parameters to the job:
Copy results to parameters
Copy results from a previous job as job parameters by using the Copy rows to result step.
Pass parameter values to sub-job
Pass all parameters of the calling job down to the sub-job.
Parameter
The parameter name to pass to the job.
Parameter to use
The field of an incoming record to use as the parameter value. If you enter a field here, Static input value / variable is disabled.
Static input value / variable
Specify values for job parameters by using one of these methods:
Enter a value directly, for example ETL Job.
Use a variable, for example ${Internal.Job.Name}.
Combine values and variables, for example ${FILE_PREFIX}_${FILE_DATE}.txt.
If you enter a value here, Parameter to use is disabled.
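Variable substitution in a static value can be sketched as below, assuming the simple ${NAME} syntax shown in the examples above (this is an illustrative resolver, not PDI's implementation):

```python
import re

# Minimal sketch of ${NAME} variable substitution in a static input value.
# Unknown variables are left as-is. (Illustrative only.)
def resolve(value, variables):
    return re.sub(r"\$\{([^}]+)\}",
                  lambda m: variables.get(m.group(1), m.group(0)),
                  value)

vars_ = {"FILE_PREFIX": "sales", "FILE_DATE": "20240101"}
print(resolve("${FILE_PREFIX}_${FILE_DATE}.txt", vars_))
# -> sales_20240101.txt
```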
Get Parameters
Get the existing parameters defined for the selected job.