Data Mart ETL Services

The Data Mart ETL services enable batch loading and aggregation of data into the iWD Data Mart. The Kettle ETL Service defines general ETL configuration, such as the database in which the aggregated data is stored. The Scheduled ETL Job Service defines execution characteristics (such as scheduling) for each specific ETL job.

Kettle ETL Service

The Kettle ETL service represents an embedded Kettle ETL Engine that runs ETL scripts for loading and aggregating task and configuration data into the iWD Data Mart.

In addition to the options described in Service Details, the following properties are configurable for the Kettle ETL Service:   

startAutomatically

Whether the service should be started automatically after the configuration deployment.

logLevel

The Service log level. This should be set to Default unless otherwise instructed by Genesys Technical Support. See "Service Log Levels" for a description of each log level. Depending on the setting of this property, additional logging properties might be available. Refer to "Logging Service" for descriptions of these common properties.

repositoryDirectory

The directory on the server in which iWD Data Mart ETL scripts are stored.

customTaskAttributeMapping

Up to 10 comma-separated names of task custom attributes that are loaded into the task_fact custom attribute fields (CUSTOM_ATTRIBUTE1-10).

In order for Kettle to pick them up, you must create fields in the Event Log and Interaction Databases. In the rpt_interaction, rpt_esp, and interactions tables, add the fields in the following format:
Name: <attribute_name>, type: varchar(length).

These fields should be added to the mappings in the Event Log DAP options in the esp-custom-data and itx-custom-data sections.

For example, to store a custom attribute named order_total in the iWD Data Mart as a task custom attribute:

  1. Create a new column in the rpt_interaction table: order_total, type: varchar(50)

  2. Create a new column in the rpt_esp table: order_total, type: varchar(50)

  3. Create a new column in the interactions table: order_total, type: varchar(50)

  4. Create a new option in the esp-custom-data section of the Event Log DAP options: order_total=order_total

  5. Create a new option in the itx-custom-data section of the Event Log DAP options: order_total=order_total

  6. Add order_total to the customTaskAttributeMapping list of the Kettle ETL service
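The six steps above follow a fixed pattern, so they can be sketched as a small helper that generates the required DDL and DAP option entries. The table and section names come from this document; the varchar length and the SQL dialect are assumptions, so adjust them for your DBMS before use:

```python
# Sketch: generate the DDL statements and Event Log DAP option entries
# for a custom task attribute. Table and section names are taken from
# the steps above; varchar length and SQL dialect are assumptions.

TABLES = ["rpt_interaction", "rpt_esp", "interactions"]
DAP_SECTIONS = ["esp-custom-data", "itx-custom-data"]

def custom_attribute_setup(attr, length=50):
    # One new column per table, same name and type in each.
    ddl = [f"ALTER TABLE {t} ADD {attr} varchar({length});" for t in TABLES]
    # Each DAP option maps the attribute name to the new column name.
    options = [f"[{s}] {attr}={attr}" for s in DAP_SECTIONS]
    return ddl, options

ddl, options = custom_attribute_setup("order_total")
for line in ddl + options:
    print(line)
```

The same pattern applies to customTaskDimensionMapping attributes; only the final step (which mapping list the attribute is added to) differs.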

customTaskDimensionMapping

Up to 5 comma-separated names of task custom attributes that are loaded into the CUSTOM_DIM dimension and associated with the task through the CUSTOM_DIM_KEY field.

In order for Kettle to pick them up, you must create fields in the Event Log and Interaction Databases. In the rpt_interaction, rpt_esp, and interactions tables, add the fields in the following format:
Name: <attribute_name>, type: varchar(length).

These fields should be added to the mappings in the Event Log DAP options in the esp-custom-data and itx-custom-data sections.

For example, to store a custom attribute named country_of_origin in the iWD Data Mart as part of the task custom dimension:

  1. Create a new column in the rpt_interaction table: country_of_origin, type: varchar(50)

  2. Create a new column in the rpt_esp table: country_of_origin, type: varchar(50)

  3. Create a new column in the interactions table: country_of_origin, type: varchar(50)

  4. Create a new option in the esp-custom-data section of the Event Log DAP options: country_of_origin=country_of_origin

  5. Create a new option in the itx-custom-data section of the Event Log DAP options: country_of_origin=country_of_origin

  6. Add country_of_origin to the customTaskDimensionMapping list of the Kettle ETL service

customTenantAttributeMapping

Up to 5 comma-separated names of tenant custom attributes that are loaded into the CUSTOM_DIM dimension and associated with the task through the CUSTOM_DIM_KEY field.

customDepartmentAttributeMapping

Up to 5 comma-separated names of department custom attributes that are loaded into the CUSTOM_DIM dimension and associated with the task through the CUSTOM_DIM_KEY field.

customProcessAttributeMapping

Up to 5 comma-separated names of process custom attributes that are loaded into the CUSTOM_DIM dimension and associated with the task through the CUSTOM_DIM_KEY field.

detailsExpirationDays

The number of days after which the detailed task (task_fact, task_event_fact, and task_work_fact) data is removed from the database.

This is for historical data only.

aggregation15minExpirationDays

The number of days after which the data is removed from 15 minute aggregation tables.

This is for historical data only.
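Both expiration settings reduce to a simple date cutoff: rows older than today minus the configured number of days are removed. A minimal sketch of that arithmetic, assuming day-granularity cutoffs (the actual purge logic inside the ETL jobs is not described in this document):

```python
from datetime import date, timedelta

# Sketch: the cutoff date implied by an expiration setting such as
# detailsExpirationDays or aggregation15minExpirationDays. Rows with
# dates earlier than the cutoff are eligible for removal.

def purge_cutoff(today, expiration_days):
    return today - timedelta(days=expiration_days)

print(purge_cutoff(date(2024, 3, 31), 30))  # 2024-03-01
```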

datamartDatabase

Mandatory dependency: The database in which reportable data is loaded and aggregated (the Data Mart).

statisticsAdapter

Optional dependency: Statistics Adapter Service to use for the delivery of statistics.

timezone

The time zone that is used to store all date/time information in the fact tables.

Scheduled ETL Job

In addition to the options described in Service Details, the following properties are configurable for the Scheduled ETL Job service:  

startAutomatically

Whether the service should be started automatically after the configuration deployment.

logLevel

The Service log level. This should be set to Default unless otherwise instructed by Genesys Technical Support. See "Service Log Levels" for a description of each log level. Depending on the setting of this property, additional logging properties might be available. Refer to "Logging Service" for descriptions of these common properties.

triggerMode

How the ETL job is triggered (started):

  • Trigger using CRON expression: Starts a job automatically, based on a CRON scheduling expression.

  • Trigger after another scheduled service: Starts a job automatically after another scheduled job has finished.

  • Trigger manually: Job can be started manually on the Services Status screen.

triggerMode and startAutomatically

The startAutomatically property controls whether the service is started at all after deployment. If it is unchecked, the service does not start on any of these trigger events; you can still start it manually from the Services Status screen. Once started, the service enters a wait state and waits for its trigger event before performing any work.

Note: If triggerMode is set to Trigger manually and startAutomatically is unchecked, starting the service after deployment only brings it into the wait state; it then waits for a manual trigger. You must start it a second time before it performs its tasks.

If startAutomatically is checked and the triggerMode is set to Trigger using CRON expression, the service will start according to the CRON expression.
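The interaction between the two properties can be summarized as a small state function. This is an illustration of the behavior described above, not actual iWD code; the state names are made up for the example:

```python
# Sketch of the documented interaction between startAutomatically and
# triggerMode. States and trigger-mode labels are illustrative only.

def state_after_deployment(start_automatically, trigger_mode):
    """Return the job's state right after configuration deployment."""
    if not start_automatically:
        return "stopped"           # must be started manually first
    if trigger_mode == "cron":
        return "waiting_for_cron"  # runs per the CRON expression
    if trigger_mode == "chained":
        return "waiting_for_parent"
    return "waiting_for_manual_trigger"

print(state_after_deployment(True, "cron"))    # waiting_for_cron
print(state_after_deployment(False, "manual")) # stopped
```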

cronExpression

Standard CRON scheduling expression when triggerMode is set to "Trigger using CRON expression".

For example, the following expression runs the job every 15 minutes:

0 0,15,30,45 * * * ?

For more information about CRON scheduling, see
http://www.quartz-scheduler.org/docs/tutorials/crontrigger.html
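Quartz CRON expressions use seconds-resolution fields, which is why the example above has more fields than a classic Unix crontab entry; the ? character means "no specific value" and is typically used in the day-of-week or day-of-month position. A small sketch that labels each field of the example expression:

```python
# Sketch: break the Quartz CRON expression above into labeled fields.
# Quartz expressions have 6 (or 7, with an optional year) fields,
# starting with seconds; '?' means "no specific value".

FIELDS = ["seconds", "minutes", "hours",
          "day-of-month", "month", "day-of-week"]

def explain(expr):
    return dict(zip(FIELDS, expr.split()))

parts = explain("0 0,15,30,45 * * * ?")
print(parts["minutes"])  # 0,15,30,45 -> fires every 15 minutes
```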

executionQueueName

The execution queue is configured only for jobs that are triggered with a cron expression or triggered manually.

For chained jobs (Trigger after another scheduled service), the execution queue automatically assumes the value of the parent job. For example, if you have Load Intraday chained to Load Config and Aggregate Intraday chained to Load Intraday, Load Config will never start while Aggregate Intraday is still running.

Scheduled services that have the same executionQueueName will never be run in parallel within the same solution.

If a scheduled service is triggered while another scheduled service that has the same executionQueueName is in progress, it is queued until the other service has completed its processing.

Note: This only works within the boundaries of a single runtime node. If you have services on different nodes, this will have no effect across the nodes.
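The serialization rule can be pictured as one FIFO queue per executionQueueName on each runtime node. The scheduler below is an illustrative sketch of that documented behavior, not actual iWD code:

```python
from collections import defaultdict, deque

# Sketch of the documented rule: scheduled jobs sharing an
# executionQueueName never run in parallel on one runtime node;
# a job triggered while another is in progress is queued.

class NodeScheduler:
    def __init__(self):
        self.running = {}                  # queue name -> running job
        self.waiting = defaultdict(deque)  # queue name -> queued jobs

    def trigger(self, job, queue_name):
        if queue_name in self.running:
            self.waiting[queue_name].append(job)  # serialize on the queue
            return "queued"
        self.running[queue_name] = job
        return "running"

    def finish(self, queue_name):
        # When a job completes, the next queued job (if any) starts.
        if self.waiting[queue_name]:
            self.running[queue_name] = self.waiting[queue_name].popleft()
        else:
            del self.running[queue_name]

node = NodeScheduler()
print(node.trigger("Load Intraday", "etl"))       # running
print(node.trigger("Aggregate Intraday", "etl"))  # queued
node.finish("etl")
print(node.running["etl"])                        # Aggregate Intraday
```

As the note above says, this serialization holds only within a single runtime node; jobs on different nodes would each have their own scheduler state.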

etlService

Mandatory dependency: The Kettle ETL Service.

jobName

The name (type) of the ETL job. See the iWD Data Mart Reference Guide for detailed descriptions of the various types of iWD Data Mart ETL jobs.