Steps and entries supporting VFS connections

A transformation or job may contain steps or entries that access files through a Virtual File System (VFS) connection.

The following steps and entries support VFS connections:

  • Avro Input

  • Avro Output

  • Bulk load from MySQL into file

  • Bulk load into MSSQL

  • Bulk load into MySQL

  • Copybook Input

  • CSV File Input

  • De-serialize from file

  • Fixed file input

  • Get data from XML

  • Get File Names

  • Get Files Rows Count

  • Get SubFolder names

  • Google Analytics

  • GZIP CSV Input

  • Job (job entry)

  • JSON Input

  • JSON output

  • ORC Input

  • ORC Output

  • Parquet Input

  • Parquet Output

  • Query HCP

  • Read metadata from Copybook

  • Read metadata from HCP

  • Text File Output

  • Transformation (job entry)

  • Write metadata to HCP
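
In any of the steps and entries above, a file on a VFS connection is referenced through a Pentaho VFS address (see the related topic "Pentaho address to a VFS connection") rather than a local path. As a minimal sketch, assuming a VFS connection named `my_s3_connection` (a hypothetical name for illustration), a file field in one of these steps might look like:

```
pvfs://my_s3_connection/sales/2024/output.csv
```

The `pvfs` scheme tells PDI to resolve the path through the named VFS connection, so the same transformation can point at different storage back ends by changing only the connection definition.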

© 2025 Hitachi Vantara LLC
