Importing and updating email addresses used for scheduling from data sources
You can adjust and run sample transformations to import email addresses from LDAP or JDBC sources to be used for scheduling notifications from the Pentaho Server via the Pentaho User Console (PUC). Once you have initially imported the data, you can schedule the transformations to run periodically to update the email addresses based on the LDAP or JDBC sources.
You can find the following sample transformations in the server/pentaho-server/pentaho-solutions/email-import-samples directory:
For LDAP sources: LDAPEmailImportV3.ktr
For JDBC sources: JDBCEmailImportV3.ktr
You can also use an optional parameter defined for these transformations, together with a related column in the Pentaho email Hibernate database table, to manage multiple sources. Using the parameter and related column keeps emails from different sources from interfering with each other: the transformations are designed to act only on rows in the Hibernate table that match this optional parameter, so any inserts, deletions, or updates apply only to rows whose column value matches the parameter. For example, if you have multiple LDAP servers for different locales or business units, such as LDAP-US, LDAP-EU, and LDAP-ASIA, you can adjust the transformation parameter for each of these sources to import and maintain email addresses from each server without affecting the others.
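To make the scoping concrete, the effect is conceptually equivalent to constraining every statement with a WHERE clause on the source column. The table and column names in this sketch (email_addresses and managed_source) are hypothetical stand-ins, not the actual Pentaho Hibernate schema:

-- Minimal sketch: a refresh run for the LDAP-EU source can clear and reload
-- its own rows without touching rows owned by LDAP-US or LDAP-ASIA.
DELETE FROM email_addresses
WHERE managed_source = 'LDAP-EU';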
Import and update email addresses
Perform the following steps in the Pentaho Data Integration (PDI) client to adjust the sample JDBC transformation, then run the transformation to import the email addresses:
1. Open the sample JDBCEmailImportV3.ktr transformation in the PDI client. See the Open a transformation section in the Pentaho Data Integration document for details.
2. Select Properties from the menu that appears when you right-click in an empty area of the PDI client canvas. The Transformation properties dialog box opens.
3. Click the Parameters tab to access the MANAGED_SOURCE and PENTAHO_BASE_URL transformation parameters.
4. Specify a source of the email address data for the MANAGED_SOURCE parameter. As a best practice, use and maintain a separate transformation for each source. For example, if you have multiple JDBC databases for different locales or business units (such as JDBC-US and JDBC-ASIA), save one version of the JDBCEmailImportV3 transformation per source, with MANAGED_SOURCE set to JDBC-US for one transformation and to JDBC-ASIA for the other.
5. Specify the URL of your Pentaho User Console (PUC) for the PENTAHO_BASE_URL parameter. In a standard installation, the default URL for PUC is http://localhost:8080/pentaho, but your Pentaho administrator may have configured a different URL. Check with your Pentaho administrator if you are not sure of the URL used for your instance of PUC.
6. Click OK to close the Transformation properties dialog box.
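For example, the Parameters tab of the transformation copy you maintain for a US database might hold values like the following; the source name JDBC-US and the default URL are illustrative, so substitute your own:

MANAGED_SOURCE: JDBC-US
PENTAHO_BASE_URL: http://localhost:8080/pentaho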
Along with the transformation parameters, you also need to specify the related Pentaho email Hibernate database table columns before running the transformation.
7. Double-click the JDBC Input step in the transformation. The Table input step properties dialog box opens.
8. In the SQL text box, adjust the SQL statement to specify the column names used for the email source, last names, and first names in your JDBC source. The column specified for the email source should match the value you specified for the MANAGED_SOURCE parameter. (An example query is sketched after this procedure.)
9. Click OK to save your specified values and close the Table input step properties dialog box.
10. Save the transformation with a file name specific to your JDBC source (for example, JDBCEmailImportForJDBC-US.ktr for the JDBC-US managed source), then run the transformation. See the Run your transformation section in the Pentaho Data Integration document for details.
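As an illustration of step 8, the query in the Table input step might look like the following sketch. The table and column names here (hr_users, email_address, last_name, first_name) are hypothetical placeholders for your own JDBC schema, and the output field names the sample's downstream steps expect may differ, so compare them with the fields used in JDBCEmailImportV3.ktr. Referencing the parameter as ${MANAGED_SOURCE} keeps the source value in step with the MANAGED_SOURCE parameter, provided the Table input step's option to replace variables in the script is selected:

SELECT
  email_address       AS email,      -- email address column in your JDBC source
  last_name           AS lastname,   -- last name column
  first_name          AS firstname,  -- first name column
  '${MANAGED_SOURCE}' AS source      -- tags each row with the managed source value
FROM hr_users                        -- hypothetical table holding user records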
The email addresses from your JDBC source should now appear on the Email Groups page under the Administration perspective in your PUC instance. You can now use this same transformation to update the email addresses periodically by setting it up to run on a schedule. See the Schedule a transformation or job section in the Pentaho Data Integration document for details.