Sample class scenarios
For each of the following embedding scenarios, a sample class can be executed as a stand-alone Java application:
- Run Transformations
- Run Jobs
- Dynamically Build Transformations
- Dynamically Build Jobs
Each sample has an associated unit test. To run an individual sample, execute the following command:
mvn test -Dtest=<sample unit test class>
The following sections describe how to use these samples as templates for embedding PDI in your applications.
Run transformations
The org.pentaho.di.sdk.samples.embedding.RunningTransformations class is an example of how to run a PDI transformation from Java code in a stand-alone application. This class sets parameters and executes the sample transformations in the pentaho/design-tools/data-integration/etl directory. You can run a transformation from its KTR file using runTransformationFromFileSystem() or from a PDI repository using runTransformationFromRepository().
Consider the following general steps when running an embedded transformation (see the sketch after these steps):

1. Initialize the Kettle environment. Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
2. Prepare the transformation. The definition of a PDI transformation is represented by a TransMeta object. You can load this object from a KTR file, a PDI repository, or generate it dynamically. To query the declared parameters of the transformation definition, use listParameters(). To set the assigned values, use setParameterValue().
3. Execute the transformation. An executable Trans object is derived from the TransMeta object that is passed to the constructor. The Trans object starts, then executes asynchronously. To ensure that all steps of the Trans object have completed, call waitUntilFinished().
4. Evaluate the outcome. After the Trans object completes, you can access the result using getResult(). The returned Result object can be queried for success by evaluating getNrErrors(), which returns zero (0) on success and a non-zero value when there are errors. To get more information, retrieve the transformation log lines.
5. Shut down listeners. When the transformations have completed, call KettleEnvironment.shutdown() to ensure the proper shutdown of all Kettle listeners.
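The following minimal sketch puts these steps together. The KTR path and the parameter name are illustrative placeholders, not necessarily the ones used by the sample project:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformationSketch {

  public static void main(String[] args) throws Exception {
    // Step 1: initialize the Kettle environment before any other PDI call
    KettleEnvironment.init();

    // Step 2: load the transformation definition from a KTR file
    // (path and parameter name are illustrative)
    TransMeta transMeta = new TransMeta("etl/parametrized_transformation.ktr");
    for (String name : transMeta.listParameters()) {
      System.out.println("Declared parameter: " + name);
    }
    transMeta.setParameterValue("parameter.name", "some value");

    // Step 3: derive an executable Trans object and run it
    Trans trans = new Trans(transMeta);
    trans.execute(null);        // starts asynchronously; null = no command-line arguments
    trans.waitUntilFinished();  // block until all steps have completed

    // Step 4: evaluate the outcome; zero errors means success
    Result result = trans.getResult();
    if (result.getNrErrors() != 0) {
      System.err.println("The transformation finished with errors.");
    }

    // Step 5: shut down all Kettle listeners
    KettleEnvironment.shutdown();
  }
}
```

Setting parameter values on the TransMeta before constructing the Trans works because the Trans constructor copies the named parameters from the definition; alternatively, call setParameterValue() on the Trans itself before executing it.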
Run jobs
The org.pentaho.di.sdk.samples.embedding.RunningJobs class is an example of how to run a PDI job from Java code in a stand-alone application. This class sets parameters and executes the job in etl/parametrized_job.kjb. You can run the job from the .kjb file using runJobFromFileSystem() or from a repository using runJobFromRepository().
Consider the following general steps when running an embedded job (see the sketch after these steps):

1. Initialize the Kettle environment. Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
2. Prepare the job. The definition of a PDI job is represented by a JobMeta object. You can load this object from a KJB file, a PDI repository, or generate it dynamically. To query the declared parameters of the job definition, use listParameters(). To set the assigned values, use setParameterValue().
3. Execute the job. An executable Job object is derived from the JobMeta object that is passed to the constructor. The Job object starts, then executes in a separate thread. To wait for the job to complete, call waitUntilFinished().
4. Evaluate the outcome. After the Job completes, you can access the result using getResult(). The returned Result object can be queried for success by calling its getResult() method, which returns true on success and false on failure. To get more information, retrieve the job log lines.
5. Shut down listeners. When the jobs have completed, call KettleEnvironment.shutdown() to ensure the proper shutdown of all Kettle listeners.
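A corresponding sketch for jobs; the parameter name is an illustrative placeholder, while etl/parametrized_job.kjb is the file the sample class uses:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunJobSketch {

  public static void main(String[] args) throws Exception {
    // Step 1: initialize the Kettle environment
    KettleEnvironment.init();

    // Step 2: load the job definition from a KJB file;
    // the second argument is the repository, null when loading from the file system
    JobMeta jobMeta = new JobMeta("etl/parametrized_job.kjb", null);
    jobMeta.setParameterValue("parameter.name", "some value");

    // Step 3: derive an executable Job object; it runs in a separate thread
    Job job = new Job(null, jobMeta);  // null = no repository
    job.start();
    job.waitUntilFinished();

    // Step 4: evaluate the outcome; Result.getResult() is true on success
    Result result = job.getResult();
    if (!result.getResult()) {
      System.err.println("The job finished with errors.");
    }

    // Step 5: shut down all Kettle listeners
    KettleEnvironment.shutdown();
  }
}
```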
Dynamically build transformations
The org.pentaho.di.sdk.samples.embedding.GeneratingTransformations class is an example of a dynamic transformation. This class generates a transformation definition and saves it to a KTR file.
Consider the following general steps when dynamically building a transformation (see the sketch at the end of this section):

1. Initialize the Kettle environment. Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
2. Create and configure a transformation definition object. A transformation definition is represented by a TransMeta object. Create this object using the default constructor. The transformation definition includes the name, the declared parameters, and the required database connections.
3. Populate the TransMeta object with transformation steps. The data flow of a transformation is defined by steps that are connected by hops. Perform the following tasks for each transformation step:
   a. Create the step by instantiating its class directly and configure it by using its get and set methods. Transformation steps reside in sub-packages of org.pentaho.di.trans.steps. For example, to use the Get File Names step, create an instance of org.pentaho.di.trans.steps.getfilenames.GetFileNamesMeta and use its get and set methods to configure it.
   b. Obtain the step ID string. Each PDI step has an ID that can be retrieved from the PDI plugin registry. A simple way to retrieve the step ID is to call PluginRegistry.getInstance().getPluginId(StepPluginType.class, theStepMetaObject).
   c. Create an instance of org.pentaho.di.trans.step.StepMeta by passing the step ID string, the name, and the configured step object to the constructor. An instance of StepMeta encapsulates the step properties and controls the placement of the step on the PDI client (Spoon) canvas and its connections to hops. Once the StepMeta object has been created, call setDrawn(true) and setLocation(x,y) to make sure the step appears correctly on the PDI client canvas.
   d. Add the step to the transformation by calling addStep() on the transformation definition object.
4. Connect the hops. Once steps have been added to the transformation definition, they need to be connected by hops. To create a hop, create an instance of org.pentaho.di.trans.TransHopMeta, passing in the From and To steps as arguments to the constructor. Add the hop to the transformation definition by calling addTransHop().
After all steps have been added and connected by hops, the transformation definition object can be serialized to a KTR file by calling getXML() and writing the result to a file, which can then be opened in the PDI client for inspection. The sample class org.pentaho.di.sdk.samples.embedding.GeneratingTransformations generates an example transformation of this kind.
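The following sketch is a minimal illustration rather than a copy of the sample class: it builds a two-step transformation from a Generate Rows step into a Dummy step and writes it to a hypothetical generated.ktr file. The step choice, names, and canvas coordinates are illustrative.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.plugins.PluginRegistry;
import org.pentaho.di.core.plugins.StepPluginType;
import org.pentaho.di.trans.TransHopMeta;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.StepMeta;
import org.pentaho.di.trans.steps.dummytrans.DummyTransMeta;
import org.pentaho.di.trans.steps.rowgenerator.RowGeneratorMeta;

public class GenerateTransformationSketch {

  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();

    // create the transformation definition and give it a name
    TransMeta transMeta = new TransMeta();
    transMeta.setName("Generated demo transformation");

    PluginRegistry registry = PluginRegistry.getInstance();

    // create and configure the first step, then wrap it in a StepMeta
    RowGeneratorMeta generatorMeta = new RowGeneratorMeta();
    generatorMeta.setDefault();  // defaults; use the meta's setters for real configuration
    String generatorId = registry.getPluginId(StepPluginType.class, generatorMeta);
    StepMeta generatorStep = new StepMeta(generatorId, "Generate rows", generatorMeta);
    generatorStep.setDrawn(true);
    generatorStep.setLocation(100, 100);
    transMeta.addStep(generatorStep);

    // create a second step the same way
    DummyTransMeta dummyMeta = new DummyTransMeta();
    String dummyId = registry.getPluginId(StepPluginType.class, dummyMeta);
    StepMeta dummyStep = new StepMeta(dummyId, "Do nothing", dummyMeta);
    dummyStep.setDrawn(true);
    dummyStep.setLocation(300, 100);
    transMeta.addStep(dummyStep);

    // connect the two steps with a hop
    transMeta.addTransHop(new TransHopMeta(generatorStep, dummyStep));

    // serialize the definition to a KTR file that the PDI client can open
    Files.write(Paths.get("generated.ktr"),
        transMeta.getXML().getBytes(StandardCharsets.UTF_8));

    KettleEnvironment.shutdown();
  }
}
```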
Dynamically build jobs
The org.pentaho.di.sdk.samples.embedding.GeneratingJobs class is an example of a dynamic job. This class generates a job definition and saves it to a KJB file.
Consider the following general steps when dynamically building a job (see the sketch at the end of this section):

1. Initialize the Kettle environment. Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
2. Create and configure a job definition object. A job definition is represented by a JobMeta object. Create this object using the default constructor. The job definition includes the name, the declared parameters, and the required database connections.
3. Populate the JobMeta object with job entries. The control flow of a job is defined by job entries that are connected by hops. Perform the following tasks for each job entry:
   a. Create the entry by instantiating its class directly and configure it by using its get and set methods. Job entries reside in sub-packages of org.pentaho.di.job.entries. For example, to use the File Exists job entry, create an instance of org.pentaho.di.job.entries.fileexists.JobEntryFileExists and use setFilename() to configure it. The Start entry is implemented by org.pentaho.di.job.entries.special.JobEntrySpecial.
   b. Create an instance of org.pentaho.di.job.entry.JobEntryCopy by passing the entry created in the previous step to the constructor. An instance of JobEntryCopy encapsulates the properties of an entry and controls the placement of the entry on the PDI client canvas and its connections to hops. Once created, call setDrawn(true) and setLocation(x,y) to make sure the entry appears correctly on the PDI client canvas.
   c. Add the entry to the job by calling addJobEntry() on the job definition object. It is possible to place the same entry in several places on the canvas by creating multiple instances of JobEntryCopy and passing in the same entry instance.
4. Connect the hops. Once entries have been added to the job definition, they need to be connected by hops. To create a hop, create an instance of [org.pentaho.di.job.JobHopMeta](http://javadoc.pentaho.com/kettle530/kettle-engine-5.3.0.0-javadoc/org/pentaho/di/job/JobHopMeta.html), passing in the From and To entries as arguments to the constructor. Configure the hop's evaluation: make it a conditional green or red hop by calling setConditional() and setEvaluation(true) or setEvaluation(false), or make it an unconditional hop by calling setUnconditional(). Add the hop to the job definition by calling addJobHop().
After all entries have been added and connected by hops, the job definition object can be serialized to a KJB file by calling getXML() and writing the result to a file, which can then be opened in the PDI client for inspection. The sample class org.pentaho.di.sdk.samples.embedding.GeneratingJobs generates an example job of this kind.
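The following sketch, again an illustration rather than a copy of the sample class, builds a job with a Start entry and a File Exists entry and writes it to a hypothetical generated.kjb file. The job name, entry names, checked path, and coordinates are illustrative.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.JobHopMeta;
import org.pentaho.di.job.JobMeta;
import org.pentaho.di.job.entries.fileexists.JobEntryFileExists;
import org.pentaho.di.job.entries.special.JobEntrySpecial;
import org.pentaho.di.job.entry.JobEntryCopy;

public class GenerateJobSketch {

  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();

    // create the job definition and give it a name
    JobMeta jobMeta = new JobMeta();
    jobMeta.setName("Generated demo job");

    // every job begins with a Start entry (JobEntrySpecial with start = true)
    JobEntrySpecial start = new JobEntrySpecial("START", true, false);
    JobEntryCopy startCopy = new JobEntryCopy(start);
    startCopy.setDrawn(true);
    startCopy.setLocation(100, 100);
    jobMeta.addJobEntry(startCopy);

    // a File Exists entry checking an illustrative path
    JobEntryFileExists fileExists = new JobEntryFileExists("Check file");
    fileExists.setFilename("/tmp/some-file.txt");
    JobEntryCopy checkCopy = new JobEntryCopy(fileExists);
    checkCopy.setDrawn(true);
    checkCopy.setLocation(300, 100);
    jobMeta.addJobEntry(checkCopy);

    // hops leaving the Start entry are unconditional
    JobHopMeta hop = new JobHopMeta(startCopy, checkCopy);
    hop.setUnconditional();
    jobMeta.addJobHop(hop);

    // serialize the definition to a KJB file that the PDI client can open
    Files.write(Paths.get("generated.kjb"),
        jobMeta.getXML().getBytes(StandardCharsets.UTF_8));

    KettleEnvironment.shutdown();
  }
}
```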