Audience and prerequisites

The audience for this article is ETL developers, data engineers, and data analysts.

Before you begin, verify that your Hadoop administrator has set up your user account on the cluster and granted permissions to access the applicable HDFS directories. You need access to your home directory and any other directories required for your tasks.
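If you want a quick way to confirm that your account can reach its HDFS home directory, the following sketch calls the standard `hdfs dfs -ls` command from Python. It assumes the Hadoop command-line client is installed and on the PATH of the machine you run it from, and that your home directory follows the common `/user/<username>` convention; adjust the path to match your cluster.

```python
import getpass
import subprocess

# Check that the HDFS home directory for the current user exists and is
# listable. Assumes the "hdfs" command-line client is on the PATH and that
# the home directory follows the common /user/<username> layout.
user = getpass.getuser()
home_dir = f"/user/{user}"

result = subprocess.run(
    ["hdfs", "dfs", "-ls", home_dir],
    capture_output=True,
    text=True,
)

if result.returncode == 0:
    print(f"Access to {home_dir} confirmed:")
    print(result.stdout)
else:
    # A non-zero exit code usually means the directory is missing or you
    # lack permission; contact your Hadoop administrator.
    print(f"Could not list {home_dir}: {result.stderr.strip()}")
```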

Pentaho ships with a default Apache Hadoop driver already installed. Drivers for supported versions of Amazon EMR and Google Dataproc must be downloaded from the Support Portal before you can install them on the PDI client. You need a driver for each vendor and version of Hadoop that you connect to, and a driver must be installed before it becomes available for selection when you add a new cluster connection.

When drivers for new Hadoop versions are released, you can download them from the Support Portal and then add them to Pentaho to connect to the new Hadoop distributions. Install these drivers using the procedure specified in the Install Pentaho Data Integration and Analytics document.

Verify that your Hadoop administrator has configured the Pentaho Server to connect to the Hadoop cluster. Ask your Hadoop administrator to provide you with a copy of the site.xml files from the cluster, along with the following information (a short sketch for checking these files appears after the list):

  • Distribution and version of the cluster.

  • IP addresses and port numbers for HDFS, JobTracker, and Zookeeper (if used).

  • Kerberos and cluster credentials if you are connecting to a secured cluster.

  • Oozie URL (if used).
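Once you receive the site.xml files, you can sanity-check them before handing them to Pentaho. The sketch below parses a Hadoop `core-site.xml` (a standard `<configuration>`/`<property>` document) and prints the `fs.defaultFS` address so you can compare it against the HDFS host and port your administrator gave you. The local file path is an assumption; point it at wherever you saved the file.

```python
import xml.etree.ElementTree as ET

def read_site_properties(path):
    """Return the property name/value pairs from a Hadoop *-site.xml file."""
    tree = ET.parse(path)
    props = {}
    for prop in tree.getroot().iter("property"):
        name = prop.findtext("name")
        value = prop.findtext("value")
        if name is not None:
            props[name] = value
    return props

# Example: confirm the HDFS address in a copy of core-site.xml provided by
# your administrator. "core-site.xml" here is an assumed local file path.
props = read_site_properties("core-site.xml")
print("fs.defaultFS:", props.get("fs.defaultFS", "<not set>"))
```

If the printed address does not match the HDFS IP address and port your administrator supplied, resolve the discrepancy before creating the cluster connection.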
