Does a single Spark Scala script node allow multiple downstream documents?

Article ID: KB0083795

Products: Spotfire Statistica
Versions: 13.3

Description

For example, assume the (in-Spark) workspace below, in which all nodes use the "Spark Scala Script" node type.

The first node, "Data_Prep", generates DataFrameA and DataFrameB.
DataFrameA can be set as the downstream document of the "Data_Prep" node.
The third node, "Lasso_1st_round", needs both the downstream document of the "FSL_1st_round" node and DataFrameB.

Is it possible to generate multiple downstream documents from a "Spark Scala Script" node, that is, to set both DataFrameA and DataFrameB as downstream documents of the "Data_Prep" node?

Environment

Windows 7, Windows Server 2012 R2

Resolution

As of Statistica version 13.3, a single Spark node does not allow more than one downstream document.

As a workaround, break the node into multiple nodes, set each document as the downstream document of one node, and chain the nodes together. In the workspace above, for example, break the "Data_Prep" node into two nodes ("Data_Prep1" and "Data_Prep2"): one with DataFrameA as its downstream document and the other with DataFrameB as its downstream document.
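The split can be sketched in Spark Scala as two script bodies, each computing exactly one DataFrame to expose downstream. This is a minimal illustration using plain Spark APIs: the Spark session setup stands in for whatever the Statistica node provides, the data and column names are hypothetical, and the actual binding of a DataFrame to the node's downstream document is done through the node configuration, which is not shown here.

```scala
import org.apache.spark.sql.SparkSession

// Stand-in for the Spark session a Statistica in-Spark node would supply.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("DataPrepSplit")
  .getOrCreate()
import spark.implicits._

// --- Node "Data_Prep1": compute DataFrameA and set it as this node's
// single downstream document (via the node configuration).
val dataFrameA = Seq((1, 0.5), (2, 1.5), (3, 2.5)).toDF("id", "value")

// --- Node "Data_Prep2": receive DataFrameA from upstream, derive
// DataFrameB, and set DataFrameB as this node's single downstream document.
val dataFrameB = dataFrameA.filter($"value" > 1.0)

// Later nodes ("FSL_1st_round", "Lasso_1st_round") each consume an
// upstream document and likewise emit exactly one downstream document.
dataFrameB.show()
```

Because each node now emits one document, downstream nodes that need both DataFrameA and DataFrameB can reach them by chaining through the two prep nodes in sequence.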


Issue/Introduction

This article answers whether a single Spark Scala script node allows multiple downstream documents.