Read metadata from HCP

See also

  • Query HCP
  • Write metadata to HCP
  • Hitachi Content Platform
