Audience and prerequisites

The audience for this article is ETL developers, data engineers, and data analysts.

Before you begin, verify that the Hadoop administrator has set up your user account on the cluster and granted permissions to access the applicable HDFS directories. You need access to your home directory and any other directories required for your tasks; a quick way to confirm access is sketched below.
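If you want to confirm the account and permissions yourself, one quick check is to list your HDFS home directory. The following is a minimal sketch that calls the WebHDFS REST API with Python's standard library. The host, port, user name, and path are placeholders, so replace them with the values your Hadoop administrator provides. It also assumes an unsecured cluster; a Kerberos-secured cluster authenticates with SPNEGO rather than the user.name parameter.

    import json
    import urllib.error
    import urllib.request

    # Placeholder values -- replace with details from your Hadoop administrator.
    NAMENODE = "namenode.example.com"   # HDFS NameNode host
    PORT = 9870                         # default WebHDFS port on Hadoop 3.x (50070 on 2.x)
    USER = "etl_user"                   # your cluster account
    PATH = f"/user/{USER}"              # your HDFS home directory

    # LISTSTATUS succeeds only if the account exists and can read the directory.
    url = (f"http://{NAMENODE}:{PORT}/webhdfs/v1{PATH}"
           f"?op=LISTSTATUS&user.name={USER}")
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            entries = json.load(resp)["FileStatuses"]["FileStatus"]
            print(f"OK: {len(entries)} entries visible in {PATH}")
    except urllib.error.HTTPError as err:
        # 401 or 403 usually means the account or permissions are not set up yet.
        print(f"Access check failed: HTTP {err.code} {err.reason}")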

Pentaho ships with a default Apache Hadoop driver already installed. Drivers for other supported distributions, including Amazon EMR, Apache Vanilla, Cloudera (CDP), and Google Dataproc, must be downloaded from the Support Portal. You need a driver for each Hadoop vendor you connect to, and each driver must be installed and available for selection before you add a cluster connection.

When drivers for new Hadoop versions are released, you can download them from the Support Portal and add them to Pentaho to connect to those distributions. Install these drivers using the procedure in the Install Pentaho Data Integration and Analytics document.

Verify that the Hadoop administrator has configured the Pentaho Server to connect to the Hadoop cluster. Ask the Hadoop administrator to provide you with a copy of the site.xml files from the cluster and the following information (you can cross-check these details against the site.xml files, as sketched after this list):

  • Distribution and version of the cluster.

  • IP addresses and port numbers for HDFS, JobTracker, and Zookeeper (if used).

  • Kerberos and cluster credentials if you are connecting to a secured cluster.

  • Oozie URL (if used).
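Most of these details also appear in the site.xml files themselves, so you can cross-check what the administrator sends you. Below is a minimal sketch that reads the standard Hadoop property names from core-site.xml and yarn-site.xml with Python's standard library. The file and property names are common Hadoop conventions rather than Pentaho requirements; on a MapReduce v1 cluster, the JobTracker address is the mapred.job.tracker property in mapred-site.xml instead.

    import xml.etree.ElementTree as ET

    def read_site_xml(path):
        """Return the properties in a Hadoop *-site.xml file as a dict."""
        root = ET.parse(path).getroot()
        return {p.findtext("name"): p.findtext("value")
                for p in root.iter("property")}

    # Assumed file names -- use the copies your administrator provided.
    core = read_site_xml("core-site.xml")
    yarn = read_site_xml("yarn-site.xml")

    print("HDFS address:    ", core.get("fs.defaultFS"))        # e.g. hdfs://host:8020
    print("ResourceManager: ", yarn.get("yarn.resourcemanager.address"))
    print("ZooKeeper quorum:", core.get("ha.zookeeper.quorum", "not set"))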
