For example, assume the following (in-Spark) workspace, in which every node is a "Spark Scala Script" node.
The first node, "Data_Prep", generates DataFrameA and DataFrameB.
DataFrameA can be set as the downstream document of the "Data_Prep" node.
The third node, "Lasso_1st_round", needs both the downstream document of the "FSL_1st_round" node and DataFrameB.
Is it possible to generate multiple downstream documents from a single "Spark Scala Script" node? That is, can both DataFrameA and DataFrameB be set as downstream documents of the "Data_Prep" node?
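To make the question concrete, a minimal sketch of what the "Data_Prep" step computes might look like the following. This is plain Spark Scala, not the tool's actual snippet API; the column names (`feature1`, `feature2`, `key`) and the input/output conventions are assumptions for illustration, and the snippet needs a Spark runtime to execute.

```scala
import org.apache.spark.sql.DataFrame

// Hypothetical standalone equivalent of the "Data_Prep" node:
// one input DataFrame in, two derived DataFrames out.
object DataPrep {
  def run(input: DataFrame): (DataFrame, DataFrame) = {
    // DataFrameA: a projection of assumed feature columns
    val dataFrameA = input.select("feature1", "feature2")
    // DataFrameB: an aggregate over an assumed key column
    val dataFrameB = input.groupBy("key").count()
    (dataFrameA, dataFrameB)
  }
}
```

Inside the node, the open question is exactly this return shape: whether both `dataFrameA` and `dataFrameB` can be declared as downstream documents, or whether the node supports only a single output DataFrame.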