Query HCP

On this page:

  • Before you begin
  • General
  • Options
  • See also

See also

PDI and Hitachi Content Platform (HCP)

Read metadata from HCP

Write metadata to HCP
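The Query HCP step retrieves object metadata from a Hitachi Content Platform system. As a rough standalone illustration only (not the PDI step's implementation), the sketch below builds a request for HCP's Metadata Query Engine (MQE) REST endpoint; the tenant URL, credentials, and query expression are all placeholder assumptions.

```python
import base64
import hashlib
import json
import urllib.request


def build_mqe_request(tenant_url: str, username: str, password: str,
                      query: str, count: int = 100) -> urllib.request.Request:
    """Build (but do not send) a POST request for HCP's MQE /query endpoint.

    Assumes HCP's native auth header: base64(username) + ":" + md5(password).
    """
    body = json.dumps({"object": {"query": query, "count": count}}).encode()
    token = (base64.b64encode(username.encode()).decode()
             + ":" + hashlib.md5(password.encode()).hexdigest())
    return urllib.request.Request(
        tenant_url.rstrip("/") + "/query",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "Authorization": "HCP " + token,
        },
        method="POST",
    )


# Placeholder tenant, credentials, and query -- no request is sent here.
req = build_mqe_request("https://tenant.hcp.example.com",
                        "admin", "secret",
                        "+(objectPath:/accounts/*) +(type:object)")
```

Sending the request (for example with `urllib.request.urlopen(req)`) would return a JSON result set of matching objects, which is the kind of metadata the step surfaces as PDI rows.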


Last updated 5 months ago


© 2025 Hitachi Vantara LLC