Learn more

  • Set up a Carte cluster
