Databases - Database connection configuration and temporary table naming


Article ID: KB0082705


Products/Versions: Spotfire Data Science 6.x

Resolution


This article explains how Spotfire Data Science administrators should configure connections to database data sources, and how Spotfire Data Science avoids conflicts when naming temporary tables.

1. Database data sources can be created by Spotfire Data Science admins. There are two approaches to configuring permissions for these data sources. The first is for an admin to create a database data source (via the Add a Data Source dialog) with a single generic set of credentials shared by everyone who accesses the database. The second is to configure multiple database accounts through the same Add a Data Source dialog, which is useful when different groups of users need access to different schemas.

2. Temporary result tables created by intermediate operators have a default naming scheme that avoids conflicts. The name combines a prefix, the user ID, the flow ID, an operator abbreviation, and an index, separated by underscores. For example, alp@user_id_@flow_id_rowfil_0, where the variables starting with '@' are substituted at run time. These variables can be viewed on a per-workflow basis by navigating to Actions -> Workflow Variables on the workflow page, and users can modify the naming scheme with other workflow variables if needed. Because each name embeds the user ID and flow ID, multiple users can run the same workflow without name conflicts in their result tables.
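As an illustration, the substitution described above can be sketched in shell. The user ID 42, flow ID 117, and the rowfil operator suffix below are hypothetical example values, not values taken from any actual installation:

```shell
# Hypothetical values; the platform substitutes real ones at run time.
user_id=42
flow_id=117
operator=rowfil      # operator abbreviation (here, a row filter)
index=0              # disambiguates multiple operators of the same type

# Mirrors the documented pattern: alp@user_id_@flow_id_rowfil_0
table_name="alp${user_id}_${flow_id}_${operator}_${index}"
echo "$table_name"
```

Because the user ID and flow ID are both part of the name, two users running the same workflow (or one user running two flows) always produce distinct table names.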

3. These result tables are stored in the output schema defined in the operator's configuration options. Users can choose any schema they have access to. By default, this is the value stored in the @default_schema variable. This can be modified by navigating to Actions -> Workflow Variables. 

Note: To change the default schema on a global level:

a) Add this line at the bottom of $CHORUS_HOME/shared/ALPINE_DATA_REPOSITORY/configuration/alpine.conf:
alpine.db.default_schema="my_schema_name"

b) Run chorus_control.sh restart as the chorus user

c) Create a new workflow

All workflows created after this change will have the new @default_schema, but previously created workflows will not be affected. To change old workflows, edit the @default_schema workflow variable on the workflow in question.
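A minimal sketch of steps (a) through (c) above, assuming a standard installation with $CHORUS_HOME set; the schema name my_schema_name is a placeholder to replace with your own:

```shell
# (a) Append the global default schema setting to alpine.conf
echo 'alpine.db.default_schema="my_schema_name"' \
  >> "$CHORUS_HOME/shared/ALPINE_DATA_REPOSITORY/configuration/alpine.conf"

# (b) Restart as the chorus user so the new setting is picked up
su - chorus -c "chorus_control.sh restart"

# (c) Create a new workflow in the UI; it will use the new @default_schema.
#     Existing workflows keep their old value until edited manually.
```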


 
