Tool Operators
The Tool Operators are a collection of functions that are helpful in the Team Studio modeling process and in extending the functionality of Team Studio flows.
- Convert
Provides a method for converting a Hadoop CSV file into either Avro or Parquet format.
- Export
Saves a trained model and stores it as a work file in the workspace. Models can also be stored in TIBCO® Streaming Artifact Management Server, if it is available and configured.
- Export to Excel (DB)
Exports multiple inputs as separate tabs to an Excel Workbook stored in the current workspace.
- Export to Excel (HD)
Exports multiple inputs as separate tabs to an Excel Workbook stored in the current workspace.
- Export to FTP
Exports a single database table to an FTP or SFTP server. Supports password authentication.
- Export to SBDF (DB)
Converts a database table to the Spotfire binary data frame (SBDF) format. The SBDF files are stored in the same workspace as the workflow and can be downloaded for use in TIBCO Spotfire.
- Export to SBDF (HD)
Converts an HDFS tabular data set to the Spotfire binary data frame (SBDF) format. The SBDF files are stored in the same workspace as the workflow and can be downloaded for use in TIBCO Spotfire.
- Flow Control
Provides a method for continuing or stopping flows depending on customized test conditions.
- HQL Execute
Executes a user-defined HiveQL clause (a hedged example follows this list).
- Load Model
Loads a Team Studio Analytics Model from the current workspace to use with predictor and evaluator operators.
- Note
Embeds explanatory information within a workflow.
- Pig Execute
Executes a user-defined Pig script (for parsing and sorting Hadoop data sources). The Pig Execute operator can also reference Pig UDFs (user-defined functions) that are supplied to the Team Studio server.
- Python Execute (DB)
Runs a Jupyter notebook stored in your current workspace from a workflow in Team Studio.
- Python Execute (HD)
Runs a Jupyter notebook stored in your current workspace from a workflow in Team Studio.
- R Execute (DB)
To configure R Execute, connect a valid data source to the R Execute operator. An intermediate operator also constitutes a data source for R Execute.
- R Execute (HD)
To configure R Execute, connect a valid data source to the R Execute operator. An intermediate operator also constitutes a data source for R Execute.
- SQL Execute
Executes a user-defined SQL clause (a hedged example follows this list).
- Sub-Flow
Allows you to incorporate another workflow within a parent workflow.
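For HQL Execute, the clause you supply is ordinary HiveQL that runs against the connected Hadoop data source. The following is a minimal sketch only; the web_logs table and its columns are hypothetical placeholders, not objects provided by Team Studio.

```sql
-- Hypothetical HiveQL of the kind HQL Execute can run.
-- web_logs, event_time, and status_code are illustrative placeholders.
SELECT to_date(event_time) AS event_day,
       COUNT(*)            AS hits
FROM web_logs
WHERE status_code = 200
GROUP BY to_date(event_time);
```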
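SQL Execute works the same way against a database connection. The sketch below assumes a placeholder customers table and is meant only to illustrate that the user-defined clause can be DDL or DML as well as a query.

```sql
-- Hypothetical SQL of the kind SQL Execute can run.
-- customers and customers_high_value are illustrative placeholders.
CREATE TABLE customers_high_value AS
SELECT customer_id,
       lifetime_value
FROM customers
WHERE lifetime_value > 1000;
```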