Level

A Level element defines one level of a hierarchy: it maps a column (or key expression) of the hierarchy's table to the set of members at that level.
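For illustration, here is a minimal sketch of how Level elements typically appear inside a Hierarchy. The table and column names are illustrative, in the style of the Mondrian FoodMart sample schema, and are not taken from this page:

```xml
<Hierarchy hasAll="true" allMemberName="All Customers" primaryKey="customer_id">
  <Table name="customer"/>
  <!-- Each Level maps one column of the hierarchy's table to a level of members -->
  <Level name="Country" column="country" uniqueMembers="true"/>
  <Level name="State Province" column="state_province" uniqueMembers="true"/>
  <Level name="City" column="city" uniqueMembers="false"/>
</Hierarchy>
```

Levels are listed from most general to most specific. Setting uniqueMembers="true" tells Mondrian that a member's key is unique across the whole level, so it can be identified without qualifying it by its parent, which can simplify the SQL the engine generates.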

PreviousKeyExpressionNextAttributes

Last updated 1 month ago

Was this helpful?

LogoLogo

About

  • Pentaho.com

Support

  • Pentaho Support

Resources

  • Privacy

© 2025 Hitachi Vantara LLC