Docker container deployment of Pentaho Server

You can use Docker Compose to deploy Pentaho Server as a container and, optionally, to install Pentaho Server plugins during deployment in any of the following supported environments:

  • On‑premises

  • Amazon Web Services (AWS)

  • Google Cloud Platform (GCP)

  • Microsoft Azure

Download Pentaho Server Docker files

To deploy Pentaho Server as a container using Docker Compose, you must first download both the .tar.gz file that contains the Docker image and the ZIP file that contains the configuration files for your environment.

Complete the following steps to download the files that you need for deploying Pentaho Server as a container:

  1. On the Support Portal home page, sign in using the Pentaho Support username and password provided in your Pentaho Welcome Packet.

  2. In the Pentaho card, click Download. The Downloads page opens.

  3. In the 11.x list, click Pentaho 11.0 GA Release.

  4. Scroll to the bottom of the Pentaho 11.0 GA Release page.

  5. In the file component section, navigate to the Docker Image Configurator/Images directory.

  6. Download the pentaho-server-11.0.0.0-<build number>.tar.gz file.

  7. In the file component section, navigate to the Docker Image Configurator/Environment Config directory.

  8. Download one of the following ZIP files that contain the configuration files for your environment:

    • aws-11.0.0.0-<build number>.zip

    • azure-11.0.0.0-<build number>.zip

    • gcp-11.0.0.0-<build number>.zip

    • on-prem-11.0.0.0-<build number>.zip

What to do next: If you plan to install Pentaho Server plugins during deployment, download the plugin files as described in the next section.

Download plugin files

To install Pentaho Server plugins, you must download the ZIP files that contain the plugin files.

Complete the following steps to download the ZIP files you need for installing plugins:

  1. On the Support Portal home page, sign in using the Pentaho Support username and password provided in your Pentaho Welcome Packet.

  2. In the Pentaho card, click Download. The Downloads page opens.

  3. In the 11.x list, click Pentaho 11.0 GA Release.

  4. Scroll to the bottom of the Pentaho 11.0 GA Release page.

  5. In the file component section, navigate to Pentaho Server/Archive Build (Suggested Installation Method).

  6. Download the files for one or more of the following plugins:

    • Dashboard Designer Plugin: pdd-plugin-ee-11.0.0.0-<build number>.zip

    • Interactive Reporting Plugin: pir-plugin-ee-11.0.0.0-<build number>.zip

    • Pentaho Analyzer Plugin: paz-plugin-ee-11.0.0.0-<build number>.zip

  7. Extract the plugin ZIP files to a temporary directory. The extracted directories contain the following subdirectories, which are needed for installing the plugins:

    • pentaho-interactive-reporting

    • dashboards

    • analyzer

Note: If you do not install plugins during the first deployment of the Pentaho Server, you can install them later. For instructions, see Install Pentaho Server plugins after deployment.

Deploy Pentaho Server on premises

You can use Docker Compose to deploy Pentaho Server as a container on a local machine, an AWS EC2 instance, a Google Cloud Platform virtual machine (VM), or a Microsoft Azure VM.

On-premises deployment is suitable for the following environments:

  • Evaluation environments.

  • Development and test environments.

  • Production environments with an external database, persistent volumes, backups, and security hardening.

Best practices for production environments

  • Use an external, managed database (such as Amazon Relational Database Service, Microsoft Azure Database, Google Cloud SQL, or a corporate-managed database) instead of the bundled database container.

  • Mount persistent storage for solutions, data, logs, and licenses.

  • Configure HTTPS.

  • Regularly back up your volumes and database.

  • Monitor your deployment with Docker logs, cloud metrics, and alerts.

  • Apply security patches to your operating system and Docker.

Before you begin

Before you can deploy to an on-premises host, you must complete the following tasks:

  • Create a Docker account.

    Note: For help with Docker and Docker Hub, see the online documentation at https://docs.docker.com/.

  • On the host, install Docker Engine and Docker Compose.

  • Download both the .tar.gz file that contains the Docker image and the ZIP file that contains the configuration files for your environment. For instructions on downloading these files, see the previous section, Download Pentaho Server Docker files.

  • (Optional) Download plugin files for installing the plugins during deployment.

Procedure

Complete the following steps to deploy Pentaho Server as a container on premises:

  1. On your host, extract the on-prem-11.0.0.0-<build number>.zip file to a temporary working directory.

  2. In your working directory, open a command prompt.

  3. (Optional) If you are not logged in, log into Docker Hub using the following command: docker login

  4. Load the Pentaho Server Docker image by running the following command, replacing <build number> with the build number in the downloaded file:
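    A typical load command looks like the following sketch (the archive name comes from the download step; verify it against your downloaded file):

    ```shell
    # Load the Pentaho Server image from the downloaded archive.
    docker load -i pentaho-server-11.0.0.0-<build number>.tar.gz
    ```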

    You can verify that the Pentaho Server image is loaded in Docker Desktop.

  5. In your working directory, go to the on-prem-11.0.0.0-<build number>/dist/on-prem/pentaho-server subdirectory.

  6. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  7. In a text editor, open the .env file and configure variables for your environment.

    Important: You must enter the URL for your Pentaho license.

    The contents of the .env file vary slightly for each database and include comments to assist you with editing the file. The following code is an example of the .env file contents used for a PostgreSQL database.
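    As an illustration only, a PostgreSQL .env file might look like the following sketch. The DB_HOST, DB_USER, and DB_PASSWORD names appear elsewhere in this guide; the other variable names here are assumptions, so use the names and comments in your downloaded .env file:

    ```shell
    # Hypothetical variable names for illustration; follow the comments in your .env file.
    PENTAHO_LICENSE_URL=<url to your Pentaho license>   # required
    DB_HOST=repository
    DB_PORT=5432
    DB_NAME=hibernate
    DB_USER=<database user>
    DB_PASSWORD=<database password>
    ```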

  8. Save and close the .env file.

  9. (Optional) To install Pentaho Server plugins during deployment, place the directory for each plugin in the softwareOverride/2_repository/pentaho-solutions/system subdirectory for your database type. The following subdirectories are needed for installing the plugins:

    • pentaho-interactive-reporting

    • dashboards

    • analyzer

    Note: For instructions on downloading the plugin files, see Download plugin files.

  10. (Optional) To change the default administrator password before deploying the Pentaho Server and database, go to the softwareOverride/2_repository/pentaho-solutions/system subdirectory and update the defaultUser.spring.properties and repository.spring.properties files. For instructions, see Change the default administrator password.

  11. In your command prompt, change to the directory for your database type.

  12. Deploy the Pentaho Server as a container and create the database volume by running one of the following commands based on your database type:

    • docker compose -f docker-compose-mysql.yaml up

    • docker compose -f docker-compose-postgres.yaml up

    • docker compose -f docker-compose-sqlserver.yaml up

    • docker compose -f docker-compose-oracle.yaml up

  13. Verify that Pentaho Server is running by accessing it at http://localhost:<port>/pentaho or your proxy URL.

Troubleshooting on-premises deployment

The following table lists the symptoms, causes, and suggested fixes for common issues related to on-premises deployments of Pentaho Server as a container.

| Symptom | Cause | Fix |
| --- | --- | --- |
| Pentaho Server container restarts repeatedly | Insufficient memory | Increase the JVM -Xmx setting or the VM size. |
| Cannot access the web UI | Port blocked by a firewall | Open the port (default 8080) for your local host in the security group, Network Security Group, or firewall. |
| Database connection errors | Wrong .env values | Verify the values for DB_HOST, DB_USER, and DB_PASSWORD in the .env file. |
| Slow performance | Under-powered VM | Use the recommended VM size or allocate more resources. |

Deploy Pentaho Server on Kubernetes in the cloud

You can deploy the Pentaho Server as a container on Kubernetes in any of the following cloud environments:

  • Amazon Web Services (AWS) with Elastic Kubernetes Service (EKS)

  • Google Cloud Platform (GCP) with Google Kubernetes Engine (GKE)

  • Microsoft Azure with Azure Kubernetes Service (AKS)

Before you begin

Before you can deploy Pentaho Server using Kubernetes, you must create a Docker account.

Note: For help with Docker and Docker Hub, see the online documentation at https://docs.docker.com/.

Deploy on Amazon Web Services

To deploy on Amazon Web Services (AWS) using Elastic Kubernetes Service (EKS), you must complete the tasks in the following subsections, in order.

Before you begin, complete the following tasks:

  • Create a Docker account.

    Note: For help with Docker and Docker Hub, see the online documentation at https://docs.docker.com/.

  • Verify that you have access to a standard Amazon EKS cluster.

  • Verify that you have an approved S3 CSI driver or FUSE-based approach for mounting S3 storage. This is used for configs, logs, and overrides, and is required if your package YAMLs reference S3 buckets for configuration or storage paths.

  • If you plan to pull Pentaho images from Amazon Elastic Container Registry (ECR), or a mirrored ECR, confirm that the node instance role or IRSA-enabled service account has permission to pull images.

  • Install kubectl and AWS CLI.

  • Set up your local kubeconfig so that kubectl can communicate with the cluster by running the following command, replacing <name> and <region>:
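    For a standard EKS cluster, the command typically looks like:

    ```shell
    # Write cluster credentials into your local kubeconfig.
    aws eks update-kubeconfig --name <name> --region <region>
    ```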

  • Download both the .tar.gz file that contains the Docker image and the ZIP file that contains the configuration files for your environment. For instructions on downloading these files, see the previous section, Download Pentaho Server Docker files.

Tag and push Pentaho Server Docker image to AWS

To tag the Pentaho Server Docker image and upload it to AWS, complete the following steps:

  1. In your working directory, open a command prompt.

  2. (Optional) If you are not logged in, log into Docker Hub using the following command: docker login.

  3. Authenticate to ECR by running the following command, replacing <aws region> and <aws account id>:
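    A typical ECR authentication command looks like:

    ```shell
    # Fetch a registry password and pipe it to docker login.
    aws ecr get-login-password --region <aws region> | \
      docker login --username AWS --password-stdin <aws account id>.dkr.ecr.<aws region>.amazonaws.com
    ```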

  4. Tag and push the Pentaho Server Docker image by running the following command, replacing <build number>, <aws account id>, and <region> with the values from the downloaded file and your AWS account:
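    As a sketch (the local image name and tag are assumptions; use the name shown by `docker image ls` after the load step):

    ```shell
    # Tag the loaded image for your ECR registry, then push it.
    docker tag pentaho-server:11.0.0.0-<build number> \
      <aws account id>.dkr.ecr.<region>.amazonaws.com/pentaho-server:11.0.0.0-<build number>
    docker push <aws account id>.dkr.ecr.<region>.amazonaws.com/pentaho-server:11.0.0.0-<build number>
    ```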

    The Pentaho Server Docker image is tagged and pushed to AWS.

Create a database instance in AWS

To create a database instance in AWS, complete the following steps:

  1. On your local workstation, extract the aws-11.0.0.0-<build number>.zip file to a temporary working directory.

    Note: For instructions on downloading the ZIP file, see the previous section, Download Pentaho Server Docker files.

  2. In your working directory, go to the aws-11.0.0.0-<build number>/dist/aws/pentaho-server subdirectory.

  3. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  4. Go to the db_init_<database type> subdirectory and open a command prompt from that subdirectory.

  5. Send the .sql files in the db_init_<database type> subdirectory to your EC2 instance by running the following commands, replacing <your private key>, <database type>, <ssh username>, <ec2 instance public ip>, and <region>:
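    A typical copy command looks like the following sketch (the destination path is an assumption):

    ```shell
    # Copy the database initialization scripts to the EC2 instance's home directory.
    scp -i <your private key> db_init_<database type>/*.sql <ssh username>@<ec2 instance public ip>:~/
    ```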

  6. Connect to RDS from EC2. The following command is an example for connecting to RDS from EC2 for a PostgreSQL database.
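    As a sketch (the <rds endpoint> and <db user> placeholders are assumptions for your RDS instance values):

    ```shell
    # Open a psql session from the EC2 instance to the RDS PostgreSQL instance.
    psql -h <rds endpoint> -p 5432 -U <db user> -d postgres
    ```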

  7. Run the .sql files from EC2. The following commands are examples for running .sql files from EC2 for a PostgreSQL database.
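    As a sketch (script names are illustrative; run each .sql file that you copied to the instance):

    ```shell
    # Execute an initialization script against the RDS PostgreSQL instance.
    psql -h <rds endpoint> -p 5432 -U <db user> -d postgres -f <script name>.sql
    ```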

    The database instance is created in AWS.

Configure storage in AWS S3 buckets

To configure storage in AWS S3 buckets, complete the following steps:

  1. In your working directory, go to the aws-11.0.0.0-<build number>/dist/aws/pentaho-server subdirectory.

    Note: You downloaded and extracted the aws-11.0.0.0-<build number>.zip file in the previous sections, Download Pentaho Server Docker files and Create a database instance in AWS.

  2. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  3. Open the .yaml file in a text editor and configure values for your environment. You must update the following values before deploying the Pentaho Server:

    The contents of the .yaml file vary slightly for each database and include comments to assist you with editing the file. The following code is an example of the .yaml file contents used for a PostgreSQL database when deploying to AWS.
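    As a shape-only sketch (every key shown beyond standard Kubernetes container fields is an assumption; edit only the values your downloaded .yaml actually defines):

    ```yaml
    # Illustrative fragment only - match the structure and comments of your downloaded file.
    containers:
      - name: pentaho-server
        image: <aws account id>.dkr.ecr.<region>.amazonaws.com/pentaho-server:11.0.0.0-<build number>
        env:
          - name: DB_HOST          # your database endpoint
            value: <rds endpoint>
    ```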

  4. Save and close the .yaml file.

  5. Go to the softwareOverride/2_repository/tomcat/webapps/pentaho/META-INF subdirectory.

  6. Open the context.xml file in a text editor and update the database URL everywhere it appears. The following code is an example of the content in the context.xml file for a PostgreSQL database. In this example, the database instance URL is jdbc:postgresql://repository:5432/hibernate.
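    A Tomcat JNDI data source entry in context.xml generally has the following shape; in this sketch, only the url value comes from the example above, and the other attribute values are placeholders:

    ```xml
    <!-- Illustrative fragment; update the url attribute everywhere it appears in your file. -->
    <Resource name="jdbc/Hibernate" auth="Container" type="javax.sql.DataSource"
              driverClassName="org.postgresql.Driver"
              url="jdbc:postgresql://repository:5432/hibernate"
              username="<username>" password="<password>" />
    ```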

  7. (Optional) To install Pentaho Server plugins during deployment, place the directory for each plugin in the softwareOverride/2_repository/pentaho-solutions/system subdirectory for your database type. The following subdirectories are needed for installing the plugins:

    • pentaho-interactive-reporting

    • dashboards

    • analyzer

    Note: For instructions on downloading the plugin files, see Download plugin files.

  8. (Optional) To change the default administrator password before deploying the Pentaho Server, go to the softwareOverride/2_repository/pentaho-solutions/system subdirectory and update the defaultUser.spring.properties and repository.spring.properties files. For instructions, see Change the default administrator password.

  9. In your AWS account, create an S3 bucket for each of the following subdirectories that appear in the aws-11.0.0.0-<build number>/dist/aws/pentaho-server/<database> directory:

    1. config

    2. logs

    3. softwareOverride

  10. Upload the contents of each subdirectory to the corresponding S3 bucket that you created for them.

  11. Go back to the aws-11.0.0.0-<build number>/dist/aws/pentaho-server/<database> directory, which contains your database .yaml file, and open a command prompt from that directory.

  12. In the command prompt, create the persistent volumes and persistent volume claims, then deploy the Pentaho Server to the Kubernetes cluster by running the following command. Replace <database> with the database name defined in the .yaml file:
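    A typical deployment command looks like the following sketch (the manifest file name is an assumption; use the .yaml file in your database directory):

    ```shell
    # Create the persistent volumes/claims and deploy the Pentaho Server resources.
    kubectl apply -f pentaho-server-<database>.yaml
    ```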

    The Pentaho Server is deployed with the storage you configured in the AWS S3 bucket.

Check logs in Kubernetes pods

To check the logs in the Kubernetes pods, complete the following steps:

  1. In a command prompt, check the logs in the Kubernetes pods by running the following commands, in order:
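    The standard kubectl commands for this are:

    ```shell
    kubectl get pods -n <namespace>                  # confirm the Pentaho Server pod is Running
    kubectl describe pod <pod name> -n <namespace>   # inspect events if the pod is not ready
    kubectl logs -f <pod name> -n <namespace>        # follow the server startup log
    ```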

  2. Forward the port for the Pentaho Server by running the following command, replacing <pod name>, <port>, and <namespace>:
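    A typical port-forward command (the container port 8080 matches the default noted below):

    ```shell
    # Forward a local port to the Pentaho Server pod.
    kubectl port-forward <pod name> <port>:8080 -n <namespace>
    ```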

    Note: The default port is 8080.

  3. Verify that Pentaho Server is running by accessing it at http://<your external ip>:<port>/pentaho or your Application Load Balancer URL.

Deploy on Google Cloud Platform

To deploy on Google Cloud Platform (GCP) using Google Artifact Registry (GAR), you must complete the tasks in the following subsections, in order.

Before you begin, complete the following tasks:

  • Create a Docker account.

    Note: For help with Docker and Docker Hub, see the online documentation at https://docs.docker.com/.

  • Verify that you have access to a standard mode GKE cluster (Autopilot may have memory constraints).

  • Install and authenticate to gcloud CLI.

  • Verify that you have a GCS bucket for persistent storage (mounted via GCS FUSE CSI driver) or an alternative persistent storage class.

  • Download both the .tar.gz file that contains the Docker image and the ZIP file that contains the configuration files for your environment. For instructions on downloading these files, see the previous section, Download Pentaho Server Docker files.

Tag and push Pentaho Server Docker image to GAR

To tag and push the Pentaho Server Docker image to GAR, complete the following steps:

  1. In your working directory, open a command prompt.

  2. (Optional) If you are not logged in, log into Docker Hub using the following command: docker login.

  3. Authenticate to GAR with a specific Artifact Registry host by running the following commands, replacing <region> with the value for your region:
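    A typical authentication command for an Artifact Registry host looks like:

    ```shell
    # Register the Artifact Registry host with your local Docker credential helper.
    gcloud auth configure-docker <region>-docker.pkg.dev
    ```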

  4. Tag and push the Pentaho Server Docker image by running the following command, replacing <build number>, <region>, <project id>, and <repository> with the values in the downloaded file and your GAR account:
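    As a sketch (the local image name and tag are assumptions; use the name shown by `docker image ls`):

    ```shell
    # Tag the loaded image for GAR, then push it.
    docker tag pentaho-server:11.0.0.0-<build number> \
      <region>-docker.pkg.dev/<project id>/<repository>/pentaho-server:11.0.0.0-<build number>
    docker push <region>-docker.pkg.dev/<project id>/<repository>/pentaho-server:11.0.0.0-<build number>
    ```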

    The Pentaho Server Docker image is tagged and pushed to GAR.

Create a Google Cloud supported database in GCP

Create one of the following databases to be managed by Google Cloud: MySQL, PostgreSQL, or SQL Server.

Note: To use an Oracle database, you must provision and configure a Compute Engine (GCE) VM. For instructions on creating an Oracle database, see Create an Oracle database instance for GCP.

To create a database for GCP that is managed by Google Cloud, complete the following steps:

  1. On your local workstation, extract the gcp-11.0.0.0-<build number>.zip file to a temporary working directory.

    Note: For instructions on downloading the ZIP file, see the previous section, Download Pentaho Server Docker files.

  2. In your working directory, go to the gcp-11.0.0.0-<build number>/dist/gcp/pentaho-server subdirectory.

  3. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  4. Go to the db_init_<database type> subdirectory and open a command prompt from that subdirectory.

  5. (Optional) If you are not logged in, log into Google Cloud CLI using the following command: gcloud auth login.

  6. Send the .sql files in the db_init_<database type> subdirectory to your GCS bucket by running the following commands, replacing <database> and <bucket>:
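    A typical copy command looks like:

    ```shell
    # Upload the database initialization scripts to the GCS bucket.
    gcloud storage cp db_init_<database>/*.sql gs://<bucket>/
    ```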

  7. Run the .sql files from the GCS bucket. The following commands are examples for running .sql files from the GCS bucket for a PostgreSQL database.
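    For a Cloud SQL database, an import command generally looks like the following sketch (the <instance> and <script name> placeholders are assumptions for your Cloud SQL instance and uploaded script):

    ```shell
    # Import an initialization script from the GCS bucket into the Cloud SQL database.
    gcloud sql import sql <instance> gs://<bucket>/<script name>.sql --database=<database>
    ```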

    The database instance is created and initialized in GCP.

Create an Oracle database instance for GCP

To create an Oracle database for use on Google Cloud Platform (GCP), you must deploy an Oracle XE Docker container on a Compute Engine (GCE) VM and then run the required SQL files to create and initialize the Pentaho database schemas and tables.

Note: If you want to use a MySQL, PostgreSQL, or SQL Server database, see Create a Google Cloud supported database in GCP.

To create an Oracle database, complete the following steps:

  1. Provision a GCE VM with machine type e2-medium.

  2. On your local machine, open a command prompt.

  3. Connect to the GCE VM by running the following command, replacing <instance name>, <project>, and <zone>:
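    A typical connection command looks like:

    ```shell
    # Open an SSH session to the GCE VM.
    gcloud compute ssh <instance name> --project <project> --zone <zone>
    ```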

  4. Install and enable Docker on the VM by running the following commands:
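    As a sketch, assuming a Debian or Ubuntu VM image (use your distribution's package manager otherwise):

    ```shell
    # Install Docker from the distribution repository and start it at boot.
    sudo apt-get update
    sudo apt-get install -y docker.io
    sudo systemctl enable --now docker
    ```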

  5. Download the Oracle XE image and start the Oracle XE container in the VM by running the following commands, replacing <password>:
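    As a sketch using a commonly used community Oracle XE image (the image name and its ORACLE_PASSWORD variable are assumptions; substitute your approved Oracle XE image and its documented settings):

    ```shell
    # Pull an Oracle XE image and start it, publishing the Oracle listener port.
    docker pull gvenzl/oracle-xe
    docker run -d --name oracle-xe -p 1521:1521 -e ORACLE_PASSWORD=<password> gvenzl/oracle-xe
    ```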

  6. Verify that SQL*Plus can connect from inside the container to the Oracle XE database in the same container by running the following command, replacing <username>, <password>, and <service name>:
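    A typical in-container connection check (the container name oracle-xe is carried over from the run step):

    ```shell
    # Connect with SQL*Plus inside the container to the local listener.
    docker exec -it oracle-xe sqlplus <username>/<password>@//localhost:1521/<service name>
    ```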

  7. Create a firewall rule in the Google Cloud project’s VPC network by running the following command:
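    As a sketch (the rule name, target tag, and <your cidr> source range are illustrative; restrict the source range to your own network):

    ```shell
    # Allow inbound traffic to the Oracle listener port for VMs tagged oracle-db.
    gcloud compute firewall-rules create allow-oracle-1521 \
      --allow=tcp:1521 --direction=INGRESS --target-tags=oracle-db --source-ranges=<your cidr>
    ```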

  8. Add a network tag to the VM so that firewall rules targeting that tag apply to it by running the following command, replacing <zone>:
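    As a sketch (the tag must match the firewall rule's target tag; <vm name> is your instance name):

    ```shell
    # Tag the VM so the firewall rule applies to it.
    gcloud compute instances add-tags <vm name> --tags=oracle-db --zone <zone>
    ```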

    You can now connect to the Oracle XE database from SQL Developer by using the VM’s external IP address.

  9. Upload the .sql files to the VM by running the following command, replacing <vm name>, <vm user>, and <zone>:
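    A typical upload command looks like the following sketch (the destination path is an assumption):

    ```shell
    # Copy the .sql files from your workstation to the VM's home directory.
    gcloud compute scp *.sql <vm user>@<vm name>:~/ --zone <zone>
    ```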

  10. Open an SSH session to the VM by running the following command, replacing <vm name>, <project id>, and <zone>:

    You are connected to the VM’s shell.

  11. Copy the uploaded .sql files into the Docker container by running the following commands, replacing <vm user>:
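    As a sketch (the container name oracle-xe and the /tmp target path are assumptions from the earlier run step; the script name is illustrative):

    ```shell
    # Copy an uploaded script from the VM into the Oracle XE container.
    docker cp /home/<vm user>/<script name>.sql oracle-xe:/tmp/
    ```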

  12. Create and initialize required Pentaho database schemas and tables by running the following command, replacing <password>:
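    As a sketch (the SYSTEM account, the default XE pluggable database service XEPDB1, and the script name are assumptions; use the account and service your scripts require):

    ```shell
    # Run an uploaded script with SQL*Plus inside the container.
    docker exec -it oracle-xe sqlplus system/<password>@//localhost:1521/XEPDB1 @/tmp/<script name>.sql
    ```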

    The Oracle database is deployed and the Pentaho database schemas and tables are initialized.

Configure storage in GCS bucket

To configure storage in your Google Cloud Storage (GCS) bucket, complete the following steps:

  1. In your working directory, go to the gcp-11.0.0.0-<build number>/dist/gcp/pentaho-server subdirectory.

    Note: You downloaded and extracted the gcp-11.0.0.0-<build number>.zip file in the previous sections, Download Pentaho Server Docker files and Create a Google Cloud supported database in GCP.

  2. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  3. Open the .yaml file in a text editor and configure values for your environment. You must update the following values before deploying the Pentaho Server:

    The contents of the .yaml file vary slightly for each database and include comments to assist you with editing the file. The following code is an example of the .yaml file contents used for a PostgreSQL database when deploying to GCP.

  4. Save and close the .yaml file.

  5. Go to the softwareOverride/2_repository/tomcat/webapps/pentaho/META-INF subdirectory.

  6. Open the context.xml file in a text editor and update the database URL everywhere it appears. The following code is an example of the content in the context.xml file for a PostgreSQL database. In this example, the database instance URL is jdbc:postgresql://repository:5432/hibernate.

  7. (Optional) To install Pentaho Server plugins during deployment, place the directory for each plugin in the softwareOverride/2_repository/pentaho-solutions/system subdirectory for your database type. The following subdirectories are needed for installing the plugins:

    • pentaho-interactive-reporting

    • dashboards

    • analyzer

    Note: For instructions on downloading the plugin files, see Download plugin files.

  8. (Optional) To change the default administrator password before deploying the Pentaho Server and database, go to the softwareOverride/2_repository/pentaho-solutions/system subdirectory and update the defaultUser.spring.properties and repository.spring.properties files. For instructions, see Change the default administrator password.

  9. Add a firewall rule in the Google Cloud Console allowing the GKE nodes to communicate with the Pentaho Server pod, enabling the pod to reach the external database.

  10. In your Google Cloud project, create a GCS bucket for each of the following subdirectories that appear in the gcp-11.0.0.0-<build number>/dist/gcp/pentaho-server/<database> directory:

    1. config

    2. logs

    3. softwareOverride

  11. Upload the contents of each subdirectory to the corresponding GCS bucket that you created for them.

  12. Go back to the gcp-11.0.0.0-<build number>/dist/gcp/pentaho-server/<database> directory, which contains your database .yaml file, and open a command prompt from that directory.

  13. In the command prompt, create the persistent volumes and persistent volume claims, then deploy the Pentaho Server to the Kubernetes cluster by running the following command. Replace <database> with the database name defined in the .yaml file:

    The Pentaho Server is deployed with the storage you configured in the GCS bucket.

Verify Pentaho Server deployment to GCP

To verify Pentaho Server deployment to GCP, complete the following steps:

  1. In a command prompt, run the following commands, in order:

  2. Forward the port for the Pentaho Server by running the following command, replacing <pod name>, <port>, and <namespace>:

    Note: The default port is 8080.

  3. Verify that Pentaho Server is running by accessing it at http://<your external ip>:<port>/pentaho or your Application Load Balancer URL.

Deploy on Microsoft Azure

To deploy on Microsoft Azure using Azure Kubernetes Service (AKS), you must complete the tasks in the following subsections, in order.

Best practices

  • Use Azure Managed Disks or Azure Files to provide persistent storage and mount these volumes into the container.

  • Deploy Pentaho Server behind an Azure Application Gateway to enable centralized TLS/SSL termination and traffic routing.

  • For production environments, use a managed database service such as Azure Database for PostgreSQL, MySQL, or SQL Server.

Before you begin

Before you can deploy to Microsoft Azure, complete the following tasks:

  • Create a Docker account.

    Note: For help with Docker and Docker Hub, see the online documentation at https://docs.docker.com/.

  • Verify that you have access to a standard Azure Kubernetes Service (AKS) cluster.

  • Verify that your Azure account has the required permissions to create resource groups, databases, container registries, storage accounts, namespaces, and Kubernetes services.

  • If you plan to pull Pentaho images from Azure Container Registry (ACR), or a mirrored ACR, verify that your Azure account is assigned the AcrPull role so it can pull images from ACR.

  • Install kubectl and Azure CLI.

  • Set up your local kubeconfig so that kubectl can communicate with the cluster by running the following command:
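    A typical command looks like the following sketch (the <resource group> and <cluster name> placeholders are assumptions for your AKS cluster):

    ```shell
    # Merge the AKS cluster credentials into your local kubeconfig.
    az aks get-credentials --resource-group <resource group> --name <cluster name>
    ```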

  • Download both the .tar.gz file that contains the Docker image and the ZIP file that contains the configuration files for your environment. For instructions on downloading these files, see the previous section, Download Pentaho Server Docker files.

Tag and push Pentaho Server Docker image to Microsoft ACR

To tag the Pentaho Server Docker image and upload it to ACR, complete the following steps:

  1. In your working directory, open a command prompt.

  2. (Optional) If you are not logged in, log into Docker Hub using the following command: docker login.

  3. Authenticate to ACR by running the following commands, replacing <registryName> with the name of your registry:
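    A typical authentication command looks like:

    ```shell
    # Log in to the Azure Container Registry.
    az acr login --name <registryName>
    ```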

  4. Tag and push the Pentaho Server Docker image by running the following commands, replacing <image name>, <registry name>, and <image name on acr> with the values from the downloaded file and your Azure account:
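    As a sketch using the placeholders from this step:

    ```shell
    # Tag the loaded image for your ACR registry, then push it.
    docker tag <image name> <registry name>.azurecr.io/<image name on acr>
    docker push <registry name>.azurecr.io/<image name on acr>
    ```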

  5. Verify the Pentaho Server Docker image is uploaded to ACR by using the following command, replacing <registry name> and <repository name>:
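    A typical verification command looks like:

    ```shell
    # List the tags stored for the repository in ACR.
    az acr repository show-tags --name <registry name> --repository <repository name> --output table
    ```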

    The Pentaho Server Docker image is tagged and pushed to ACR.

Edit database and storage configuration files for Azure

To edit database and storage configuration files for Azure, complete the following steps:

  1. On your local workstation, extract the azure-11.0.0.0-<build number>.zip file to a temporary working directory.

    Note: For instructions on downloading the ZIP file, see the previous section, Download Pentaho Server Docker files.

  2. In your working directory, go to the azure-11.0.0.0-<build number>/dist/azure/pentaho-server subdirectory.

  3. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  4. Open the .yaml file in a text editor and configure values for your environment. You must update the following values before deploying the Pentaho Server:

    The contents of the .yaml file vary slightly for each database and include comments to assist you with editing the file. The following code is an example of the .yaml file contents used for a PostgreSQL database when deploying to Azure.

  5. Save and close the .yaml file.

  6. Go to the /softwareOverride/2_repository/tomcat/webapps/pentaho/META-INF/ directory.

  7. Open the context.xml file in a text editor and update the database URL everywhere it appears. The following code is an example of the content in the context.xml file for a PostgreSQL database. In this example, the database instance URL is jdbc:postgresql://repository:5432/hibernate.

  8. Save and close the context.xml file.

  9. (Optional) To install Pentaho Server plugins during deployment, place the directory for each plugin in the softwareOverride/2_repository/pentaho-solutions/system subdirectory for your database type. The following subdirectories are needed for installing the plugins:

    • pentaho-interactive-reporting

    • dashboards

    • analyzer

    Note: For instructions on downloading the plugin files, see Download plugin files.

  10. (Optional) To change the default administrator password before deploying the Pentaho Server and database, go to the softwareOverride/2_repository/pentaho-solutions/system subdirectory and update the defaultUser.spring.properties and repository.spring.properties files. For instructions, see Change the default administrator password.

Upload database and storage configuration files to Azure

In Azure Storage Explorer, upload the following directories from your database directory to the file share in your Azure storage account:

  • config

  • db_init_<database>

  • logs

  • softwareOverride

Note: For information about working with Azure Storage Explorer, see the online documentation at https://learn.microsoft.com/azure.

Create a database instance in Azure

  1. In the Microsoft Azure portal, create a database instance for your database type.

  2. Run the .sql scripts in the db_init_<database> directory that you uploaded in the previous section.

Note: For information about creating databases in Microsoft Azure, see the online documentation at https://learn.microsoft.com/azure.

Deploy the Pentaho Server as a container in Azure

To deploy the Pentaho Server as a container in Azure, complete the following steps:

  1. Create the Pentaho Server container and database volume by running the following command, replacing <database>:
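    As a sketch, assuming the same Kubernetes manifest layout as on the other cloud platforms (the file name is an assumption; use the .yaml file in your database directory):

    ```shell
    # Deploy the Pentaho Server resources to the AKS cluster.
    kubectl apply -f pentaho-server-<database>.yaml
    ```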

  2. Verify that Pentaho Server is running by accessing it at http://<vm ip>:<port>/pentaho or your Azure Application Gateway URL.

    Note: The default port is 8080.

Change the default administrator password

Before the database container and volume are created, you can change the default administrator password by updating properties files that are referenced during deployment.

Notes:

  • After deployment, you can change the administrator password in the Pentaho User Console. For instructions, see Change your password.

  • If you have an existing Pentaho Server or Pentaho Data Integration installation, you can use the encr.bat command to generate an encrypted password. For details, see Encrypting a password.

To change the default administrator password, complete the following steps:

  1. In your working directory, go to the pentaho-server subdirectory.

  2. Open one of the following directories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  3. Go to the system directory: softwareOverride/4_others/pentaho-solutions/system.

  4. Open the defaultUser.spring.properties file in a text editor and edit the defaultAdminUserPassword property value.

  5. Save and close the defaultUser.spring.properties file.

  6. Open the repository.spring.properties file in a text editor and edit the systemTenantAdminPassword property value.

  7. Save and close the repository.spring.properties file.

    The defaultUser.spring.properties and repository.spring.properties files are now ready to upload during cloud deployment.

Install Pentaho Server plugins after deployment

After deploying the Pentaho Server as a container, the recommended way to install Pentaho Server plugins is to use the Plugin Manager. For instructions on using the Plugin Manager to install plugins, see Step 4: Install Pentaho plugins.

However, if you cannot use the Plugin Manager, you can install plugins by placing the plugin files in the appropriate directory for your deployment and restarting the Pentaho Server.

Before you begin, you must download the Pentaho Server plugin files. For instructions on downloading the plugin files, see Download plugin files.

Note: Plugins installed outside of the Plugin Manager might not be listed in the Plugin Manager and must be maintained manually.

To install Pentaho Server plugins after the Pentaho Server has been deployed as a container, complete the following steps:

  1. In your cloud deployment, go to the pentaho-server directory.

  2. Open one of the following subdirectories based on your database type:

    • pentaho-server-mysql

    • pentaho-server-oracle

    • pentaho-server-postgres

    • pentaho-server-sqlserver

  3. Go to the softwareOverride/4_others/pentaho-solutions/system subdirectory.

  4. Copy one or more of the following plugin directories and their contents to the softwareOverride/2_repository/pentaho-solutions/system directory in your cloud deployment:

    • pentaho-interactive-reporting

    • dashboards

    • analyzer

  5. Restart the Pentaho Server by using the commands for your environment:

    • On-premises

    • Amazon Web Services

    • Google Cloud Platform and Microsoft Azure
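    As hedged examples only (the Compose file name and the deployment name are assumptions; use the names from your own deployment):

    ```shell
    # On-premises (Docker Compose): restart the services defined for your database type.
    docker compose -f docker-compose-<database>.yaml restart

    # AWS, GCP, or Azure (Kubernetes): restart the Pentaho Server deployment.
    kubectl rollout restart deployment <pentaho deployment name> -n <namespace>
    ```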

Software overrides and plugins

The ZIP file that you download to deploy Pentaho Server as a container includes a softwareOverride directory that you can use to inject or replace files at container startup.

The Pentaho Server softwareOverride directory, located at /path/dist/pentaho-server/<database>/softwareOverride, contains subdirectories whose contents are copied by the Docker entrypoint into the Pentaho installation directory, in numerical order.

For instance, the on-prem-11.0.0.0-<build number>/dist/on-prem/pentaho-server/pentaho-server-postgres/softwareOverride directory contains the following subdirectories:

  • 1_drivers

  • 2_repository

  • 3_security

  • 4_others

  • 99_exchange

Any file placed in one of the subdirectories of the softwareOverride directory is layered into the container during startup. For example, if you place Pentaho Server plugin files in the /path/dist/pentaho-server/<database>/softwareOverride/2_repository/pentaho-solutions/system subdirectory, those plugins are installed when the Pentaho Server is restarted. You can also use the softwareOverride directory to change the default administrator password before deployment.

For instructions about software overrides in your specific environment, see the README.md file that is included in the softwareOverride directory.
