
How to set up a local demo environment with dummy data

Prerequisites

  1. 🖥️ Running Exivity installation (this guide assumes a Windows deployment)
  2. 💿 Zip file containing the datasets and Transformer script (can be provided by Exivity support on request)

Create a folder and copy CSV files

  1. On the Exivity server, open Windows Explorer and browse to your Exivity home folder (typically C:\Exivity\home if you kept the defaults during installation, but this may vary). The location is also stored in the %EXIVITY_HOME_PATH% environment variable.

  2. Within that home folder, create the following new directory:

    %EXIVITY_HOME_PATH%\exported\hybrid_demo
  3. Copy both CSV files from the provided demo zip into the newly created folder:

    %EXIVITY_HOME_PATH%\exported\hybrid_demo

Copy the Transformer script

  1. Locate the Hybrid_Cloud_Transformer.trs script from the demo zip.

  2. Copy this script to the Exivity transcript folder:

    %EXIVITY_HOME_PATH%\system\config\transcript
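
The two copy steps above can also be scripted. Below is a minimal Python sketch; the paths assume a default Windows install, and the `stage_demo_files` helper and its argument are illustrative names, not part of Exivity:

```python
import os
import shutil

# Resolve the Exivity home folder; C:\Exivity\home is the default install path.
home = os.environ.get("EXIVITY_HOME_PATH", r"C:\Exivity\home")
demo_dir = os.path.join(home, "exported", "hybrid_demo")
transcript_dir = os.path.join(home, "system", "config", "transcript")

def stage_demo_files(unzipped_dir):
    """Copy the demo CSVs and the .trs script from the unzipped demo
    folder into the Exivity folders described above."""
    os.makedirs(demo_dir, exist_ok=True)
    os.makedirs(transcript_dir, exist_ok=True)
    for name in os.listdir(unzipped_dir):
        src = os.path.join(unzipped_dir, name)
        if name.lower().endswith(".csv"):
            shutil.copy(src, demo_dir)
        elif name == "Hybrid_Cloud_Transformer.trs":
            shutil.copy(src, transcript_dir)
```

Run `stage_demo_files(r"C:\path\to\unzipped\demo")` after extracting the zip; copying the files manually in Explorer achieves the same result.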

Execute the Transformer

  1. Log in to the Exivity GUI (by default, this is at https://<YOUR-EXIVITY-HOST>).
  2. In the left-hand menu, navigate to Data Pipelines > Transformers.
  3. Locate and click on Hybrid_Cloud_Transformer (the one you copied in the previous step).
  4. In the Run tab of the Transformer, select a single date (or a small date range that includes today) and click Run now.
    • This step will create a new dataset called demo.hybrid (or a similarly named dataset specified in the script).

Create the Report Definition

  1. When the Transformer execution is finished, go to Data Pipelines > Reports.

  2. Click on Create to add a new report definition.

  3. Name the report Hybrid Cloud Costs.

  4. Select demo.hybrid (or the name used in the Transformer script) from the Dataset dropdown.

  5. Under Reporting columns, configure the columns as follows:

    Key        Column Name   Column Label                Metadata (optional)
    reseller   reseller      Reseller                    (leave as default)
    customer   customer      Customer                    (leave as default)
    app_env    app_env       Application / Environment   (leave as default)
  6. Click Create to finalize the new report definition.


Create a Workflow to automate daily runs

To automatically transform and prepare your Hybrid Cloud data for a longer report selection period, create a workflow with two steps:

  1. Transformer Step
    • Go to Data Pipelines > Workflows and click New.
    • Provide a name (e.g., Hybrid Cloud Daily).
    • In the Steps section, add a Transformer step:
      • Select Hybrid_Cloud_Transformer as the transformer.
      • Set From offset to -180 (or any desired range).
      • Set To offset to 0 (meaning “up to today”).
  2. Prepare Report Step
    • Add another step for Prepare Report:
      • Select Hybrid Cloud Costs as the report.
      • Use the same From offset and To offset as above (e.g., -180 to 0).

Your workflow steps might look like this (example):

Step Type        Transformer / Report Name   From Offset   To Offset
Transformer      Hybrid_Cloud_Transformer    -180          0
Prepare Report   Hybrid Cloud Costs          -180          0

Finally, save the workflow by clicking Create.
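
The From/To offsets in both steps are day offsets relative to the workflow's run date. A quick sketch of how they resolve to a date range (the `offset_range` helper is my own name for illustration, not an Exivity function):

```python
from datetime import date, timedelta

def offset_range(from_offset, to_offset, run_date):
    """Resolve workflow day offsets (relative to the run date) to a
    concrete (start, end) date pair."""
    start = run_date + timedelta(days=from_offset)
    end = run_date + timedelta(days=to_offset)
    return start, end

# From offset -180 and To offset 0 cover the 180 days up to the run date.
start, end = offset_range(-180, 0, date(2025, 6, 30))
```

A negative offset looks back in time, and 0 means the run date itself, which is why -180 to 0 processes roughly the last six months.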


Run and monitor the workflow

  1. In Data Pipelines > Workflows, select your newly created workflow.
  2. Select a single day from the date picker and click Run now; the workflow will load 180 days of data into the report.
  3. In the Status tab of the workflow, you can watch its progress in real time.
  4. Once finished, refresh your browser (or press F5) to ensure all new data is loaded.

View the report

  1. Return to Reports in the left-hand menu.
  2. Select Hybrid Cloud Costs from the drop-down list, if it's not already selected.
  3. You can view your data under Accounts, Services, Instance, or Summary by picking a date range (e.g., the current month or the last couple of months).

(Optional) Adjust the workflow schedule

To keep the data up to date automatically:

  1. Go back to Data Pipelines > Workflows and edit your Hybrid Cloud Daily workflow (or whatever name you gave it).
  2. Under Schedules, pick the frequency (e.g., daily) and a time. For a rolling daily ingestion of new data, set From offset to -1 and To offset to 0 in both steps, so each run only loads the new day's data.
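
With that rolling schedule, each scheduled run covers just the previous day through the run day. A short sketch of the window the -1..0 offsets produce (the example run date is arbitrary):

```python
from datetime import date, timedelta

# With From offset -1 and To offset 0, a run on any given day covers
# only the previous day through the run day (offsets are in days).
run_date = date(2025, 1, 15)                   # example run date
window_start = run_date + timedelta(days=-1)   # From offset -1 -> yesterday
window_end = run_date + timedelta(days=0)      # To offset 0 -> today
```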