How to automatically append rows of data to an existing data table


Article ID: KB0079015


Products: Spotfire Analyst
Versions: All Versions

Description

Suppose you have a data source that contains only the latest data (for example, last week's data) and does not retain older historical data. This article describes one method to insert that data into a table on a regular schedule, continually appending the new rows to the existing data.

In this example, TableA holds the initial data and is embedded in the analysis file. TableB is based on an Information Link that contains only last week's data (no earlier history). Using "Insert > Rows" to add data from TableB to TableA will not keep accumulating data on its own: each refresh of TableB replaces its contents, so it only ever holds the most recent week's worth of data. To capture the history, append all rows of TableB to TableA before TableB is refreshed with new data, and use Automation Services to execute an IronPython script that inserts rows from TableB into the embedded TableA on the required schedule (weekly in this example).
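The append-before-refresh idea can be sketched outside Spotfire in plain Python, with simple lists standing in for the two data tables (all names, dates, and rows here are illustrative, not part of the Spotfire API):

```python
# Plain-Python sketch of the append-before-refresh pattern.
# table_a accumulates history (embedded TableA); table_b holds
# only the latest week's rows (TableB from the information link).

table_a = [("2024-01-01", 100)]   # historical rows already captured
table_b = [("2024-01-08", 120)]   # current contents of the weekly source

def fetch_latest_week():
    # Stand-in for refreshing TableB from its information link;
    # the refresh REPLACES the table's contents rather than appending.
    return [("2024-01-15", 135)]

# 1) Append TableB's current rows to TableA *before* refreshing,
#    so last week's data is preserved.
table_a.extend(table_b)

# 2) Refresh TableB; it now holds only the newest week.
table_b = fetch_latest_week()

# table_a now contains both weeks; table_b only the most recent one.
```

If the refresh ran first, the week previously held in TableB would be overwritten before it was ever copied, which is why the script in the Resolution appends before refreshing.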

Issue/Introduction

This article describes a method to automate the task of inserting rows of data, appending the results to an existing data table.

Resolution

You can use Automation Services to schedule a job that refreshes your analysis at a set interval, for example every week. The general steps are:

1) Create TableA with any initial data and keep it embedded.
2) Create TableB from the information link that contains only last week's data. Once the data table is added, switch it to embedded (Data Table Properties > General > Store Data: Embedded in analysis).
3) Create an IronPython script which inserts the current rows of TableB into TableA and then refreshes TableB. For example:

from Spotfire.Dxp.Data import AddRowsSettings
from Spotfire.Dxp.Data.Import import DataTableDataSource

# Source data table (holds only last week's data)
sourceDataTable = Document.Data.Tables["TableB"]
dataSource = DataTableDataSource(sourceDataTable)

# Destination data table that accumulates the history
destinationDataTable = Document.Data.Tables["TableA"]

# Append the source rows to the destination (columns are matched by name)
rowSettings = AddRowsSettings(destinationDataTable, dataSource)
destinationDataTable.AddRows(dataSource, rowSettings)

# Refresh the source last, so its current rows are captured before
# they are replaced by the newest week of data
sourceDataTable.Refresh()
4) Configure this IronPython script to execute on a change in the value of a document property. For example, create a document property named "trigger" with a default DateTime value of "1/1/1900 00:00". (Note: if you want to see the exact time, including minutes and seconds, change the document property's data type to String. With a DateTime data type the value is updated at whole-hour precision, for example 3/21/2019 3:00:00 PM instead of 3/21/2019 3:25:20 PM.)
5) Create an Automation Services job and schedule it every week (or per the required schedule, depending on the frequency of the data updates) using Windows Task Scheduler. Add two tasks:
    -> "Open Analysis from Library", and in the configuration block set trigger to the current date and time. (Note: there is a space between {date} and {time}; if the space is missing, the document property value will not be updated.) For example:
trigger="{date} {time}" ;
    -> "Save Analysis to Library"
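As a side note on the trigger value: the configuration block simply writes a formatted date-time string into the property. A minimal Python sketch (the timestamp and format string are illustrative, not the exact text Spotfire substitutes) of the difference between a full-precision string and the whole-hour value a DateTime-typed property displays, per the note in step 4:

```python
from datetime import datetime

# Illustrative timestamp standing in for the {date} {time} substitution
ts = datetime(2019, 3, 21, 15, 25, 20)

# With a String-typed "trigger" property, minutes and seconds survive
trigger_as_string = ts.strftime("%m/%d/%Y %H:%M:%S")

# With a DateTime-typed property, the value is shown at whole-hour
# precision (3:00:00 PM instead of 3:25:20 PM in the example above)
trigger_displayed = ts.replace(minute=0, second=0)
```

Since the point of the property is only to change value and fire the script, either data type works as a trigger; the String type is just easier to inspect.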

Additional Information

Doc: How to execute a script on property change
Wiki: How to create Configuration Block
KB 000021541: How to schedule an Automation Services job using the Microsoft Windows Task Scheduler
KB 000033036: JavaScript triggering an IronPython Script on load of analysis does not execute in Automation Services