Here is the simplest set of steps to replace the decision table rule file, or any artifact currently in use (example):
1. Load the artifact into the cluster:
epadmin --servicename=A.dt load artifact --type=SB_DECISION_TABLE --name=ApplicantSimple --version=2.0 --sourcefile=Applicant_Simple2.sbdt
2. Activate the artifact for use in the cluster:
epadmin --servicename=A.dt activate artifact --type=SB_DECISION_TABLE --name=ApplicantSimple --version=2.0
3. Tell the operator to use the artifact:
epadmin --servicename=A.dt register artifact --action=register --operator=default.DecisionTable --type=SB_DECISION_TABLE --name=ApplicantSimple --version=2.0
See documentation page:
Spotfire Streaming > EP Commands > epadmin and epdev References > epadmin-artifact
Examine the artifact state in the running server using the command:
epadmin servicename=A.dt display artifact
These commands work whether or not you are also using an Artifact Distribution Service (ADS) configuration, and whether or not you are also using a ModelOps or Artifact Management Server (AMS). An ADS configuration associates specific artifacts found in the src/main/resources folder of the project with specific operators.
Although these commands are addressed to a single node, they cause all nodes in the cluster with running application engines to update.
These commands do not remove the old artifact from the cluster. It may be used again with a new 'register artifact' command.
Registering an artifact with an operator only affects the running applications in the cluster. When a node is added to the cluster after these commands, it will see all artifacts, but it will use the default artifact provided by that operator's configuration. Run the 'register artifact' command again after the new application engine is running to update the new node. All operators in all running nodes are affected by any new 'register artifact' command, but if they are already using the new artifact, this results in no change to the operator.
The Decision Table Control Port "LoadRuleFile" command, at the time of this writing (Streaming versions 11.1.0 and earlier), only recognizes artifacts from resource folders (see the 'deploydirectories' configuration option). Using the "LoadRuleFile" command, there is no way to refer to new artifacts activated using the 'epadmin ... artifact' commands.
Automating epadmin Commands
These commands can be automated within EventFlow using the External Process Operator (EPO). Add the EPO to the application by opening a Module (.sbapp) editor, then dragging and dropping the "Adapters, Java Operators" icon from the Palette onto the editor canvas. In the dialog, select the "External Process" operator (not the deprecated "External Process command line" operator).
Select the EPO and, on the Command Arguments tab, set the "Command line as expression" property to a value such as (example):
"D:/SOFTWARE/tibco/str/11.0/distrib/tibco/bin/epadmin --servicename="+getNodeName()+" load artifact --type="+type+" --name="+name+" --version="+version+" --sourcefile="+filepath
This should be equivalent to running the following from the command line:
D:/SOFTWARE/tibco/str/11.0/distrib/tibco/bin/epadmin --servicename=A.X load artifact --type=SB_DECISION_TABLE --name=Applicant_Simple2 --version=2.0 --sourcefile=D:/models/Applicant_Simple2.sbdt
This assumes that "type", "name", "version", and "filepath" are string fields provided in the input tuple. All other settings are left with their default values or selections. The full path to the 'epadmin' command is needed for environments in which the system PATH may not be fully configured. Any portion of this command may alternatively be supplied using Parameters if desired (parameters cannot change after application start).
The "name" and "version" used in the command are arbitrary and used to identify this file from other loaded artifacts in the node. They do not need to match the filename or any embedded version number.
Add a separate EPO for each command and have them execute in sequence:
- epadmin load artifact
- epadmin activate artifact
- epadmin register artifact
Check the "exitcode" field after each command. If it is zero (0) then '
epadmin' thinks the command succeeded and it should be safe to execute the next command in the series.
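Outside of EventFlow, the same stop-on-failure sequencing can be sketched as a shell command chain, where '&&' only runs the next command when the previous one exited with code zero (names and versions repeat the earlier example):
epadmin --servicename=A.dt load artifact --type=SB_DECISION_TABLE --name=ApplicantSimple --version=2.0 --sourcefile=Applicant_Simple2.sbdt && \
epadmin --servicename=A.dt activate artifact --type=SB_DECISION_TABLE --name=ApplicantSimple --version=2.0 && \
epadmin --servicename=A.dt register artifact --action=register --operator=default.DecisionTable --type=SB_DECISION_TABLE --name=ApplicantSimple --version=2.0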
Note that these 'epadmin' commands should be run in their own EventFlow region (using the "Run this command asynchronously" option, a concurrent module, or a separate StreamBase Container). This prevents a hang that can occur when the EPO holds the module lock: another operator in the same region blocks waiting to emit a Status tuple, which in turn prevents the EPO command from completing.
Correct output in the engine log looks like this:
2023-10-02 11:12:56.861000-0400 [4888:runtime [tid=11072]] INFO DecisionTable: DecisionTable Validating activation of artifact 'Applicant_Simple@2.0', type: 'sbdt', encoding: ''
2023-10-02 11:12:57.033000-0400 [4888:runtime [tid=11072]] INFO DecisionTable: DecisionTable Loading artifact 'Applicant_Simple@2.0', type: 'sbdt', encoding: ''
2023-10-02 11:12:57.033000-0400 [4888:runtime [tid=11072]] INFO DecisionTable: 6 rule(s) loaded in 0 ms from 'Applicant_Simple@2.0'
2023-10-02 11:12:57.033000-0400 [4888:runtime [tid=11072]] INFO com.tibco.ep.sb.ads.internal.CallNotifierDirectedTargetEx: Artifact [type=SB_DECISION_TABLE,name=Applicant_Simple,version=2.0] delivered to operator default.DecisionTable to replace artifact [type=SB_DECISION_TABLE,name=DEFAULT-default_Applicant_Simple.sbdt-B5E3696370D07D2C33E3CBE00C54CF7B,version=1.0]
Artifact Types
Table of Artifact Types
Artifact Type Code | Description |
---|---|
AVRO_SCHEMA | Avro schema file. Default extension: avsc |
COMPRESSED_SPARK_MODEL | Compressed Apache Spark model. Default extension: zip |
DECISION_TABLE | Decision table in TIBCO BusinessEvents format. Default extension: rulefunctionimpl |
DOMAIN_MODEL | Domain Model file for BusinessEvents. Default extension: domain |
EXCEL | Microsoft Excel file containing a decision table. Default extension: xlsx |
H2O_POJO | H2O Plain Old Java Object (numerical model). Default extension: pojo |
PREDICTIVE_MODEL | Predictive Model from PMML (Predictive Model Markup Language). Default extension: pmml |
PYTHON_SCRIPT | Python Script file. Default extension: py |
R_DATA | R Data file for TERR. Default extension: rdata (additional extension: rda) |
R_OBJECT | R Object file for TERR. Default extension: rds |
R_SCRIPT | R Script file for TERR. Default extension: r |
RULE_FUNCTION | Rule Function Implementation file for BusinessEvents. Default extension: rulefunction |
RULE_TEMPLATE | Rule Template file for BusinessEvents. Default extension: ruletemplate |
RULE_TEMPLATE_INSTANCE | Rule Template Instance file for BusinessEvents. Default extension: ruletemplateinstance |
SB_DECISION_TABLE | StreamBase decision table format. Default extension: sbdt |
SCALA | Scala file. Used in Spark as the source code that produces a model. Deployment to StreamBase is not recommended; it is intended to be kept as an artifact so users can download the code, compile it into a model (in Spark), and then compress that into a COMPRESSED_SPARK_MODEL. The compiled model can then be used in non-StreamBase applications. Default extension: scala |
TENSORFLOW_MODEL | TensorFlow model file in Protocol Buffer format. Default extension: tfm (additional extension: pb) |
TENSORFLOW_GRAPH | TensorFlow graph definition file in Protocol Buffer format. Default extension: tfg |
TEXT_FILE | Plain-text file. Default extension: txt |
OTHER | Catch-all for artifact types not built into Streaming |
Deploying Spark model artifacts to StreamBase 10.x and later is not supported at the time of this writing.
Legacy ModelOps, AMS, and ADS Configuration
If using the Artifact Management Server (AMS), TIBCO ModelOps, or its equivalent, then a manual push of an artifact from the web application executes the load/activate/register commands directly using the node "admin" REST API. The REST API is described here:
TIBCO Streaming > StreamBase Admin Guide > StreamBase Runtime REST API > REST API
If using an ADS configuration like this:
name = "Artifacts-conf"
type = "com.tibco.ep.streambase.configuration.ads"
version = "1.0.0"
configuration = {
Artifacts = {
artifacts = [
{
artifactSource = "dtrules/rules.sbdt"
latest = true
name = "dtrules/rules"
state = ACTIVE
type = SB_DECISION_TABLE
version = "1.0"
}
]
}
}
then there must be a default "rules.sbdt" file in the Streaming project's src/main/resources folder (or other resource directories), under the sub-directory "dtrules", and any new artifact must include the "dtrules/" prefix in its name when loaded using the 'epadmin' commands.
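For example (hypothetical version and source file path), a replacement rule file for the configuration above would be loaded under the prefixed name:
epadmin --servicename=A.dt load artifact --type=SB_DECISION_TABLE --name=dtrules/rules --version=1.1 --sourcefile=D:/models/rules_new.sbdt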
An AMS server organizes artifact files into projects, and optionally into sub-directories within those projects. This file structure, "project/folder/artifact", must be duplicated within the "src/main/resources/" folder of the project in order to provide default versions of each artifact to manage. This is necessary since AMS may not be able to provide these artifacts when the application starts.
Typically, an ADS configuration is used only when the Streaming application will receive updated artifacts exclusively from well-established AMS or ModelOps projects.
The small advantage of using an ADS configuration is that it adds a listener: if an activated artifact matches the ADS configuration, you do not need to also use the 'epadmin register artifact' command, only the load and activate commands, when automating this process. This does limit the names and versions you can give an artifact if it is to properly update the target operator.
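As a sketch (hypothetical version and source file path), with the ADS configuration above the register step can be omitted; loading and activating an artifact whose name matches is enough for the listener to deliver it to the operator:
epadmin --servicename=A.dt load artifact --type=SB_DECISION_TABLE --name=dtrules/rules --version=2.0 --sourcefile=D:/models/rules_new.sbdt
epadmin --servicename=A.dt activate artifact --type=SB_DECISION_TABLE --name=dtrules/rules --version=2.0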
For the JPMML, TensorFlow, TERR, or H2O operators, the artifactSource path is defined on the AMS tab, in the Artifacts list, like so (example):
Artifacts: [ "project1/h20_A.pojo", "project2/pojos/h20_B.pojo" ]
which would correspond to project paths "src/main/resources/project1/h20_A.pojo" and "src/main/resources/project2/pojos/h20_B.pojo".
The Python operator's AMS tab only selects a single script artifact, identified by Name and Version. The Name contains the project path, such as "project1/scriptA.py".
For a Decision Table (DT), the artifact (rule file) path is defined on the Decision Table Settings tab.