The Workflows menu allows you to schedule various tasks and execute them at a specific date and time. This enables different Extractors and Transformers to be executed in sequence, tightly chained together.

Creating a Workflow

To create a new Workflow, go to 'Data Pipelines' > 'Workflows', then click the green button labelled 'Create Workflow':

  • Provide a meaningful Name for the Workflow

  • Optionally you can provide a detailed Description

  • The Start date of the Workflow, i.e. when it will run for the first time

  • The interval: Hourly, Daily or Monthly

  • The time at which the Workflow should start

  • Provide an interval value, e.g. 2 to run every second hour / day / month, depending on your configured interval

  • Add a new step to your Workflow

  • Set the Type of the step using the drop-down menu

  • Provide the option that goes with the selected Type. This can be your Extractor, Transformer, Report or other name

  • Depending on the selected Step Type, you can provide an offset date. This value is used during execution of that step. Typically this is used for a From date offset (e.g. -1 for yesterday)

  • A To date offset can be provided for some step types (e.g. 0 for today)

  • In case multiple steps need to be executed in parallel, you may choose to clear the Wait checkbox

  • Additional arguments can be sent to the step. This applies only to some Extractors and when executing a custom Command

  • You can delete a step using the red minus button

  • To view historical Workflow results, click the Status tab
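The From and To date offsets described above are simply added to the date on which the step executes. A minimal sketch of that behaviour in Python (the `resolve_offset` helper is hypothetical, for illustration only, and is not Exivity's implementation):

```python
from datetime import date, timedelta

def resolve_offset(offset, today=None):
    """Translate a Workflow date offset into a concrete yyyyMMdd date.

    An offset of -1 yields yesterday and 0 yields today, relative to
    the day on which the step executes.
    """
    base = today if today is not None else date.today()
    return (base + timedelta(days=offset)).strftime("%Y%m%d")

# For a step executing on 2023-05-10:
run_day = date(2023, 5, 10)
print(resolve_offset(-1, run_day))  # From date offset -1 -> 20230509
print(resolve_offset(0, run_day))   # To date offset 0  -> 20230510
```

Because the offsets are evaluated at run time, a Daily Workflow with a From offset of -1 will always process the previous day's data, whichever day it runs.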

Special Workflow Steps

Apart from Extractor, Transformer and Report steps, there are two special Workflow Step types:

  • Core

  • Execute Command


The Core step allows you to run a few predefined API calls; currently these are the following:

Core API calls using workflow step
  • Run garbage collector

    • Cleans up the server getPrices cache table and Redis cache

  • Purge cache

    • This will Unprepare any prepared reports. Use with caution

  • Refresh budgets

    • Evaluate all configured budgets

  • Send heartbeat

    • Send an API heartbeat request. For future use.

Purge cache should be used with caution, as it will unprepare all available reports. This means that none of your reports will return any data until you have prepared them again.

Execute Command

The Execute Command step enables you to execute an external command, like a script:

Call external commands or scripts

As an example, you could run a PowerShell script to obtain data from a special data source that Exivity Extractors are not able or allowed to connect to. This script could be executed in the following manner:

powershell.exe "D:\script\special.ps1" ((get-date).addDays(-1)).ToString("""yyyyMMdd""")

The above command calls the PowerShell executable to run the special.ps1 script with a dynamically generated parameter that is evaluated at run time. This particular example always provides yesterday's date in yyyyMMdd format as a parameter to the special.ps1 script. Many other variations and scripting languages are possible. Feel free to experiment.
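The same yesterday-in-yyyyMMdd pattern can be reproduced in other scripting languages. A minimal Python sketch (the script path below reuses the hypothetical D:\script example; the actual `subprocess` call is commented out so the snippet runs stand-alone):

```python
import subprocess
import sys
from datetime import date, timedelta

# Yesterday's date in yyyyMMdd format, mirroring the PowerShell example.
yesterday = (date.today() - timedelta(days=1)).strftime("%Y%m%d")

# Hypothetical invocation of an external script with the date as argument:
# subprocess.run([sys.executable, r"D:\script\special.py", yesterday], check=True)
print(yesterday)
```

As with the PowerShell variant, the date parameter is computed each time the step runs, so the script always receives the previous day's date.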