Pentaho Data Integration 11.0

Connecting to Virtual File Systems (archive)

Steps and entries supporting VFS connections

A transformation or job may contain a step or entry that reads or writes a file on a Virtual File System (VFS).

The following steps and entries support VFS connections:

  • Avro Input

  • Avro Output

  • Bulk load from MySQL into file

  • Bulk load into MSSQL

  • Bulk load into MySQL

  • Copybook Input

  • CSV File Input

  • De-serialize from file

  • Fixed file input

  • Get data from XML

  • Get File Names

  • Get Files Rows Count

  • Get SubFolder names

  • Google Analytics

  • GZIP CSV Input

  • Job (job entry)

  • JSON Input

  • JSON output

  • ORC Input

  • ORC Output

  • Parquet Input

  • Parquet Output

  • Query HCP

  • Read metadata from Copybook

  • Read metadata from HCP

  • Text File Output

  • Transformation (job entry)

  • Write metadata to HCP
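
In the steps and entries above, a VFS location is typically entered as a URL whose scheme and authority identify the connection, for example `pvfs://ConnectionName/path/to/file`. As an illustration only (not PDI's internal implementation), a minimal sketch of how such a URL breaks into its parts using Python's standard `urllib.parse`; the connection name and path below are hypothetical:

```python
from urllib.parse import urlparse

def parse_vfs_url(url):
    """Split a VFS-style URL (e.g. pvfs://Connection/path) into its parts."""
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,        # e.g. "pvfs"
        "connection": parts.netloc,    # the named VFS connection
        "path": parts.path,            # the file path within that connection
    }

# Hypothetical connection name and path, for illustration only.
print(parse_vfs_url("pvfs://MyS3Connection/sales/2024/orders.csv"))
```

The same three pieces (scheme, connection name, path) are what the file-browsing dialogs in these steps resolve when you select a VFS location.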

