Set up a VFS location for schedule outputs
With a VFS connection in the Pentaho User Console, you can use cloud storage locations to store report-generated output files. You must be an administrator to create, edit, or delete a VFS connection, to designate which roles can access a connection, and to specify the output folder. Supported VFS locations are:
Amazon S3/Minio/HCP
Azure Data Lake Gen 1
Azure Data Lake Gen 2 / Blob
Google Cloud Storage
HCP REST
Local
SMB/UNC Provider
Note: If you want to use the local physical file system of your machine as a VFS location to store outputs, see Set up a Local VFS location for schedule outputs.
Perform the following steps to create a VFS connection from the Pentaho User Console:
1. Click Home and then click Administration.
The Administration perspective opens.
2. Click VFS Connections and then click the plus sign to add a connection.
The New VFS Connection dialog box for general settings opens.
3. Enter a Connection Name and, optionally, a Description.
4. Click Connection Type and select the type of connection that you want to create from the list:
Amazon S3/Minio/HCP
Azure Data Lake Gen 1
Azure Data Lake Gen 2 / Blob
Google Cloud Storage
HCP REST
Local
SMB/UNC Provider
5. In Access Roles, click a role, or use Ctrl+Click to select multiple roles from the list, to assign permissions for access to the VFS connection and folder.
The defaults are Administrator and Authenticated.
6. Click Next.
The New VFS Connection dialog box for connection details opens.
7. Enter the connection details and options for your connection type:
Amazon
Click Connection Type and select Amazon.
Simple Storage Service (S3) accesses the resources on Amazon Web Services. See Working with AWS Credentials for Amazon S3 setup instructions.
Select the Authentication Type:
- Access Key/Secret Key
- Credentials File
Select the Region.
When Authentication Type is:
- Access Key/Secret Key: enter the Access Key and Secret Key, and optionally enter the Session Token.
- Credentials File: enter the Profile Name and the File Location.
Select the Default S3 Connection checkbox to make Amazon the default S3 connection.
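When you use the Credentials File authentication type, the Profile Name must match a profile section in a standard AWS credentials file. A minimal sketch (the `reporting` profile name and the key values are placeholders, not values from this product):

```ini
# Typical AWS credentials file, usually at ~/.aws/credentials.
# Enter "reporting" as the Profile Name and this file's path as the File Location.
[reporting]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```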
Minio/HCP
Click Connection Type and select Minio/HCP.
Minio accesses data objects on an Amazon compatible storage server. See the Minio Quickstart Guide for Minio setup instructions.
Enter the Access Key.
Enter the Secret Key.
Enter the Endpoint.
Enter the Signature Version.
Select the PathStyle Access checkbox to use path-style requests. Otherwise, Amazon S3 bucket-style access is used.
Select the Default S3 Connection checkbox to make Minio/HCP the default S3 connection.
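The PathStyle Access checkbox controls how the bucket name appears in request URLs sent to the object store. A minimal sketch of the difference (the endpoint and bucket names are hypothetical):

```python
def s3_object_url(endpoint: str, bucket: str, key: str, path_style: bool) -> str:
    """Build the request URL for an object on an S3-compatible store.

    path_style=True  -> https://<endpoint>/<bucket>/<key>  (common for Minio/HCP)
    path_style=False -> https://<bucket>.<endpoint>/<key>  (Amazon virtual-hosted style)
    """
    if path_style:
        return f"https://{endpoint}/{bucket}/{key}"
    return f"https://{bucket}.{endpoint}/{key}"

# Path-style request, as used when the checkbox is selected:
print(s3_object_url("minio.example.com:9000", "reports", "daily.pdf", path_style=True))
# Bucket-style (virtual-hosted) request, as used when it is cleared:
print(s3_object_url("s3.amazonaws.com", "reports", "daily.pdf", path_style=False))
```

Path-style access is usually required for self-hosted servers such as Minio, because bucket-style access relies on DNS resolving each bucket as a subdomain of the endpoint.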
Azure Data Lake Gen 1
Accesses data objects on Microsoft Azure Gen 1 storage services. You must create an Azure account and configure Azure Data Lake Storage Gen 1. See Access to Microsoft Azure for more information.
The Authentication Type is Service-to-service authentication.
Enter the Account Fully Qualified Domain Name.
Enter the Application (client) ID.
Enter the Client Secret.
Enter the OAuth 2.0 token endpoint.
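For Azure Active Directory, the OAuth 2.0 token endpoint typically has the following form, where `<tenant-id>` is a placeholder for your Directory (tenant) ID:

```
https://login.microsoftonline.com/<tenant-id>/oauth2/token
```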
Azure Data Lake Gen 2 / Blob
Accesses data objects on Microsoft Azure Gen 2 and Blob storage services. You must create an Azure account and configure Azure Data Lake Storage Gen 2 and Blob Storage. See Access to Microsoft Azure for more information.
Select the Authentication Type:
- Account Shared Key
- Azure Active Directory
- Shared Access Signature
Enter the Service Account Name.
Enter the Block Size (minimum 1 MB, maximum 100 MB). The default is 50.
Enter the Buffer Count (minimum 2). The default is 5.
Enter the Max Block Upload Size (minimum 1 MB, maximum 900 MB). The default is 100.
Select the Access Tier. The default value is Hot.
When Authentication Type is:
- Account Shared Key: enter the Service Account Shared Key.
- Azure Active Directory: enter the Application (client) ID, Client Secret, and Directory (tenant) ID.
- Shared Access Signature: enter the Shared Access Signature.
Google Cloud Storage
Accesses data objects on the Google Cloud Storage file system. See Google Cloud Storage for information.
Enter the Service Account Key Location.
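The Service Account Key Location points to the JSON key file downloaded from the Google Cloud console. A trimmed sketch of its structure (all values here are placeholders):

```json
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "reports@my-project.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```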
HCP REST
Accesses data objects on the Hitachi Content Platform. You must configure HCP and PDI before accessing the platform. You must also configure object versioning in HCP Namespaces. See Access to HCP for more information.
Enter the Host and Port number.
Enter the Tenant, Namespace, Username, and Password.
Click More options, then enter the Proxy Host and Proxy Port number.
Select whether to Accept self-signed certificate. The default is No.
Select whether the Proxy is secure. The default is No.
Local
Accesses a file system on your local machine. See Set up a Local VFS location for schedule outputs for details.
SMB/UNC Provider
Accesses Server Message Block data using a Universal Naming Convention string to specify the file location.
Enter the Domain, which is the domain name of the target machine hosting the resource. If the machine has no domain name (for example, a home computer), use the machine name instead.
Enter the Port Number. The default is 445.
Enter the Server, User Name, and Password.
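The Server, share, and folder values you enter correspond to the parts of a Universal Naming Convention string. A minimal sketch of how those parts compose (the server and share names are hypothetical):

```python
def unc_path(server: str, share: str, *folders: str) -> str:
    """Compose a Universal Naming Convention (UNC) string:
    \\\\<server>\\<share>\\<folder>...
    """
    return "\\\\" + "\\".join([server, share, *folders])

print(unc_path("fileserver01", "reports", "2024"))
```

Note that in the dialog box you enter the server, credentials, and port as separate fields; the UNC string shown here is only the underlying file-location form.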
8. Click Test to test the connection.
The test results open.
9. Click Next.
The New VFS Connection dialog box for root folder details opens.
10. In the Root Folder Path field, enter the path for your VFS connection output folder. Enter a full path to limit the connection to a specific folder, or leave the path empty to allow access to the root and all of its folders.
The default is the root and its folders of your local physical file system.
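For example, for a Local connection (the folder names are hypothetical), a full path restricts output to one folder, while an empty path exposes the whole file system root:

```
C:\PentahoOutput\reports    (Windows: outputs restricted to this folder)
/home/pentaho/reports       (Linux: outputs restricted to this folder)
```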
11. Click Test to test the connection.
The test results open.
12. Click Next.
The New VFS Connection summary panel opens with all the information you have entered for the connection, including the Root Folder Path.
13. If needed, click the pencil icon to edit the connection.
14. Click Finish.
The setup completes and the test results appear. A New VFS Connection dialog box opens with options to Create new VFS connection or Edit this connection. If needed, click Edit this connection to make changes to the connection.
15. Click Finish.
Your new connection is created and listed in the VFS Connections panel.
To edit a VFS connection, select the connection and click the Edit Connection icon. To delete the VFS connection, click the Delete Connection icon.