PDI big data transformation steps
You can use the following Pentaho Data Integration (PDI) transformation steps to work with big data technologies:
Avro Input
Avro Output
CouchDB Input
Hadoop File Input
Hadoop File Output
HBase Input
HBase Output
HBase Row Decoder
Kafka Consumer
Kafka Producer
MapReduce Input
MapReduce Output
MongoDB Input
MongoDB Output
ORC Input
ORC Output
Parquet Input
Parquet Output
Splunk Input
Splunk Output
See the Transformation step reference in the Pentaho Data Integration documentation for details about these steps and for the related big data job entries.
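These steps are normally configured in the PDI client (Spoon) and saved in a transformation (.ktr) file, which can then be run from Pan, a job, or the Kettle Java API. As a minimal sketch, the snippet below runs a transformation file from Java; the file name read_parquet_to_hdfs.ktr and the steps it is assumed to contain (for example, Parquet Input and Hadoop File Output) are hypothetical examples, not part of the list above.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunBigDataTransformation {
    public static void main(String[] args) throws Exception {
        // Initialize the Kettle/PDI environment; this loads core and installed
        // plugins, including the big data steps listed above.
        KettleEnvironment.init();

        // Hypothetical transformation that uses big data steps such as
        // Parquet Input and Hadoop File Output.
        TransMeta transMeta = new TransMeta("etl/read_parquet_to_hdfs.ktr");

        Trans trans = new Trans(transMeta);
        trans.execute(null);        // start the transformation with no arguments
        trans.waitUntilFinished();  // block until all steps have completed

        if (trans.getErrors() > 0) {
            throw new IllegalStateException("Transformation finished with errors");
        }
    }
}
```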