TensorFlow Model Evaluator Operator

Introduction

The TIBCO StreamBase® Operator For TensorFlow Model Evaluator enables StreamBase applications to execute TensorFlow models. TensorFlow is an open-source library for numerical computation and large-scale machine learning. It bundles together machine learning and deep learning (also known as neural network) models and algorithms and makes them available through a common programming model.

The TensorFlow library provides APIs for building deep learning architectures at scale, such as Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN). TensorFlow is based on graph computation, which allows the developer to visualize the construction of a neural network with TensorBoard, a TensorFlow visualization and debugging tool. TensorFlow is also built to be deployed at scale: it can run on CPUs as well as GPUs.

The operator processes input data given as a tuple or a list of tuples. The tuple schema corresponds to the input parameters of the models. For each model, the operator generates output data that matches the defined output schema. Depending on the input data, the output can be a single tuple or a list of tuples.

Operator Properties

This section describes the properties you can set for this operator, using the various tabs of the Properties view in StreamBase Studio.

General Tab

Name: Use this required field to specify or change the name of this instance of this component, which must be unique in the current EventFlow module. The name must contain only alphabetic characters, numbers, and underscores, and no hyphens or other special characters. The first character must be alphabetic or an underscore.

Operator: A read-only field that shows the formal name of the operator.

Class name: Shows the fully qualified class name that implements the functionality of this operator. If you need to reference this class name elsewhere in your application, you can right-click this field and select Copy from the context menu to place the full class name in the system clipboard.

Start options: This field provides a link to the Cluster Aware tab, where you configure the conditions under which this operator starts.

Enable Error Output Port: Select this check box to add an Error Port to this component. In the EventFlow canvas, the Error Port shows as a red output port, always the last port for the component. See Using Error Ports to learn about Error Ports.

Description: Optionally enter text to briefly describe the component's purpose and function. In the EventFlow Editor canvas, you can see the description by pressing Ctrl while the component's tooltip is displayed.

Operator Properties Tab

Control Port (check box): Enables dynamic reconfiguration of the model list. Enabling the control port also enables the status output port, which reports the status of each model loading request. The control port uses all-or-nothing semantics: either the full list loads successfully and replaces the currently deployed models, or the request is reported as a failure.

Status Port (check box): Enables failure notifications. If scoring fails, the failure is emitted to the status port together with the original input tuple.

Unicode Encoding (string): The character encoding used to convert a String to a sequence of bytes and back; the operator uses it internally to create String tensors.

Log Level (default INFO): Controls the level of verbosity the operator uses to issue informational traces to the console. This setting is independent of the containing application's overall log level. Available values, in increasing order of verbosity, are: OFF, ERROR, WARN, INFO, DEBUG, TRACE.
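
As a small illustration of the Unicode Encoding property, the following Python sketch shows how a String round-trips through a byte encoding, which is how string values are held internally in String tensors. The encoding name "UTF-8" is only an example value for the property.

  # Illustration only: the configured encoding converts a String to bytes
  # before it is placed in a string tensor, and back to a String on the way out.
  encoding = "UTF-8"                    # example value of the Unicode Encoding property
  text = "température"
  data = text.encode(encoding)          # str -> bytes, as stored in the tensor
  assert data.decode(encoding) == text  # bytes -> str on the way back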

Models Tab

Model Type (radio button): Model representation type.

  • Graph Definition — The GraphDef class is an object created by the ProtoBuf library from the definition in tensorflow/core/framework/graph.proto.

  • Saved Model — May contain multiple graph definitions as MetaGraphDef protocol buffers. Weights and other variables are usually not stored inside the file during training; instead, they are held in separate checkpoint files.

Model Name (string): The name by which the model is referenced.

Model URL (string): The URL pointing to the model definition. Models can also be loaded from HDFS.
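
For reference, the following Python sketch shows one way to produce the two model representations with the TensorFlow API. It assumes TensorFlow 2.x; the paths, layer names, and input shape are examples only, not values required by the operator.

  import tensorflow as tf

  # A minimal model with one named input and one named output.
  inputs = tf.keras.Input(shape=(4,), name="features")
  outputs = tf.keras.layers.Dense(1, name="score")(inputs)
  model = tf.keras.Model(inputs, outputs)

  # Saved Model: a directory that the Model URL property can point to.
  tf.saved_model.save(model, "/tmp/score_model")

  # Graph Definition: a serialized GraphDef protocol buffer (.pb file).
  # In practice a GraphDef intended for deployment is typically frozen so that
  # the trained weights become constants in the graph.
  fn = tf.function(model).get_concrete_function(tf.TensorSpec([None, 4], tf.float32))
  tf.io.write_graph(fn.graph.as_graph_def(), "/tmp", "score_graph.pb", as_text=False)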

Schemas Tab

Output schema structure (tuple): Must contain two fields:

  • value — the value of the result tensor.

  • shape — a list of type long describing the result tensor's shape.

Result Data Schema (schema): The anticipated schema for model output. Only fields defined in this schema are used in the output tuple.
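
As an illustration of the value/shape structure, a single result tensor of shape [2, 3] could be represented roughly as follows. The example is shown in Python dictionary notation and the numbers are invented.

  # Hypothetical 2x3 result tensor, with its elements flattened into value.
  result = {
      "value": [0.1, 0.4, 0.2, 0.7, 0.05, 0.9],
      "shape": [2, 3],
  }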

AMS Tab

Use the AMS tab to specify which artifacts should be pulled from a running TIBCO Artifact Management Server, which is a separately installed product.

Note

If you deploy an artifact from the AMS system, the operator first checks your list of artifacts for a matching path; if the path matches, the model name given for that entry is used. If the path does not match, the artifact's file name without its file extension is used as the model name. For example, sample/audit.rds resolves to a model name of audit.

Required On Startup (check box): When enabled, the artifacts listed are requested from AMS at initialization, and the system waits until all artifacts are loaded.

Artifacts (list (string, string)): List of artifacts to load from AMS. The first element of the path is the project name, followed by the full path to the artifact. Use a / separator, with an optional @version after the artifact name. If @version is not specified, the latest version is assumed.

For example: project/path1/path2/artifact@2

To specify an artifact type, append -type to the artifact name. By default, the type is TENSORFLOW_MODEL.

For example: project/path1/path2/graph_artifact-TENSORFLOW_GRAPH

For zipped files, you can also specify a type, for example: project/path1/path2/zipped_graph-TENSORFLOW_GRAPH or project/path1/path2/zipped_model-TENSORFLOW_MODEL

The type OTHER is always accepted, but is treated by default as the TENSORFLOW_MODEL type.

Cluster Aware Tab

Use the settings in this tab to allow this operator or adapter to start and stop based on conditions that occur at runtime in a cluster with more than one node. During initial development of the fragment that contains this operator or adapter, and for maximum compatibility with TIBCO Streaming releases before 10.5.0, leave the Cluster start policy control in its default setting, Start with module.

Cluster awareness is an advanced topic that requires an understanding of StreamBase Runtime architecture features, including clusters, quorums, availability zones, and partitions. See Cluster Awareness Tab Settings on the Using Cluster Awareness page for instructions on configuring this tab.

Concurrency Tab

Use the Concurrency tab to specify parallel regions for this instance of this component, or multiplicity options, or both. The Concurrency tab settings are described in Concurrency Options, and dispatch styles are described in Dispatch Styles.

Caution

Concurrency settings are not suitable for every application, and using these settings requires a thorough analysis of your application. For details, see Execution Order and Concurrency, which includes important guidelines for using the concurrency options.

Data Input Port

The data port is the default input port for the model operator. It is always enabled. Use the data port to execute the model scoring.

The default schema for the data input port is:

  • frame, tuple or list(tuple). Samples to be scored by the deployed models.

    The tuple structure contains primitive fields (int, long, double, string or boolean) with names corresponding to model input fields.

  • placeholder name, tuple. A model placeholder can also be given as a tuple with the fields dataType (string), shape (list<long>), and value (list<elementType>).

Unrecognized fields are transparently passed. The frame field is not propagated; the scores field is not allowed.
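
For example, a frame tuple for a hypothetical model with a scalar placeholder named age and a 2x2 matrix placeholder named pixels might look like the following. It is shown in Python dictionary notation for illustration only; the field names and values are invented.

  # Hypothetical input sample; field names match the model's placeholders.
  frame = {
      "age": 42,                          # scalar placeholder, primitive field
      "pixels": {                         # tensor placeholder given as a tuple
          "dataType": "float",            # tensor element type
          "shape": [2, 2],                # tensor dimensions
          "value": [0.0, 0.5, 0.5, 1.0],  # flattened element values
      },
  }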

StreamBase type → Tensor type

int → tensor(int), scalar (converted to int32 by default)
list<int> → tensor(int[], shape(list<int>.size))
tuple(list<int> value, list<long> shape) → tensor(int[], shape)
tuple(list<int> value, list<long> shape, string dataType) → tensor({int32, int64}[], shape), based on the dataType name
double → tensor(double), scalar
list<double> → tensor(double[], shape(list<double>.size))
tuple(list<double> value, list<long> shape) → tensor(double[], shape)
tuple(list<double> value, list<long> shape, string dataType) → tensor({double, float}[], shape), based on the dataType name
long → tensor(long), scalar
list<long> → tensor(long[], shape(list<long>.size))
tuple(list<long> value, list<long> shape) → tensor(long[], shape)
tuple(list<long> value, list<long> shape, string dataType) → tensor({uint8, int32, int64}[], shape), based on the dataType name
blob → tensor(byte[]), scalar
list<blob> → tensor(byte[][], shape(list<blob>.size))
bool → tensor(boolean), scalar
list<bool> → tensor(bool[], shape(list<bool>.size))
string → tensor(string), scalar (converted to byte[] internally)
list<string> → tensor(byte[][], shape(list<string>.size))
function → not supported
capture → not supported
timestamp → not supported
Tensor type → StreamBase type

tensor(int32), scalar → tuple(int value, list<long> shape)
tensor(int32), matrix → tuple(list<int> value, list<long> shape)
tensor(double), scalar → tuple(double value, list<long> shape)
tensor(double), matrix → tuple(list<double> value, list<long> shape)
tensor(float), scalar → tuple(double value, list<long> shape)
tensor(float), matrix → tuple(list<double> value, list<long> shape)
tensor(blob), scalar → tuple(string/blob value, list<long> shape)
tensor(blob), matrix → tuple(list<blob>/list<string> value, list<long> shape)
tensor(long), scalar → tuple(long value, list<long> shape)
tensor(long), matrix → tuple(list<long> value, list<long> shape)
tensor(bool), scalar → tuple(bool value, list<long> shape)
tensor(bool), matrix → tuple(list<bool> value, list<long> shape)
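
To make the value/shape pairing in these mappings concrete, the following Python sketch shows how a flat value list and a shape list correspond to a multi-dimensional tensor. It is illustrative only and uses NumPy rather than the operator itself.

  import numpy as np

  value = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]  # flat list, as in tuple(list<double> value, ...)
  shape = [2, 3]                          # as in list<long> shape

  tensor = np.array(value, dtype=np.float64).reshape(shape)
  # tensor is now the 2x3 matrix [[1. 2. 3.]
  #                               [4. 5. 6.]]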

Scores Output Port

The scores port provides a list of model evaluation results.

The schema for the scores output port is:

  • scores, list(tuple). List of records, one for each currently deployed model.

  • scores.modelName, string. Name of the model as configured in the Models tab or provided by the control port.

  • scores.modelUrl, string. URL defining the model, as configured in the Models tab or provided by the control port.

  • scores.score, tuple or list(tuple). The type depends on the type of the frame input; for a list input, the scores are returned in the same order as the input list. The schema is defined by the Result Data Schema property.

  • scores.*. Arbitrary parameters provided during model redeployment on the control port.

  • *. Any input fields other than frame.

The scores port transparently replicates unrecognized fields. The frame field is not propagated.
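
For illustration, a scores tuple for one deployed model and a single input sample might look roughly like the following. It is shown in Python dictionary notation; the model name, URL, and numbers are invented.

  scores_tuple = {
      "scores": [
          {
              "modelName": "score_model",                # as configured or sent on the control port
              "modelUrl": "file:///tmp/score_model",     # hypothetical URL
              "score": {"value": [0.87], "shape": [1]},  # shaped by the Result Data Schema property
          }
      ],
      # plus any unrecognized input fields, replicated unchanged
  }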

Control Input Port

The control port enables runtime redeployment of models. Models are deployed with all-or-nothing semantics: if all the provided models load successfully, they fully replace the current set; otherwise, the current set remains unchanged.

The schema for the control input port is:

  • models, list(tuple). List of records, one for each model to be deployed.

  • models.modelName, string. Logical name of the model.

  • models.modelUrl, string. URL defining the model.

  • models.modelType, string. Graph Definition or Saved Model.

  • models.*. Arbitrary parameters describing the model. They are ultimately provided in the score.

  • *. Arbitrary parameters provided during model redeployment on the control port.

The status port transparently replicates unrecognized fields; do not use the status or message fields on the input port.
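
A control tuple that redeploys two models might look roughly like the following, shown in Python dictionary notation. The model names, URLs, and the extra version field are invented for illustration.

  control_tuple = {
      "models": [
          {"modelName": "fraud", "modelUrl": "file:///models/fraud_saved",
           "modelType": "Saved Model", "version": "3"},   # "version" is an arbitrary extra field
          {"modelName": "churn", "modelUrl": "hdfs://namenode/models/churn.pb",
           "modelType": "Graph Definition"},
      ],
  }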

Status Output Port

The status port provides responses for runtime model deployment. The tuples are emitted only as responses to the control port tuples.

The schema for the status output port is:

  • status, string. Status of deployment. Either success or failure.

  • message, string. Descriptive status message.

  • models, list(tuple). List of records, one for each model in the deployment request.

  • models.status, string. Status of the model loading. Either success or failure.

  • models.message, string. Descriptive model status message.

  • models.modelName, string. Logical name of the model.

  • models.modelUrl, string. URL defining the model.

  • models.*. Arbitrary parameters describing the model. They are later provided in the score.

  • *. Any control port fields other than models.

The status port transparently replicates unrecognized fields from the control port.
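
For illustration, a successful response to the control tuple sketched above might look roughly like the following, again in Python dictionary notation with invented values.

  status_tuple = {
      "status": "success",
      "message": "2 models deployed",
      "models": [
          {"status": "success", "message": "loaded", "modelName": "fraud",
           "modelUrl": "file:///models/fraud_saved", "version": "3"},
          {"status": "success", "message": "loaded", "modelName": "churn",
           "modelUrl": "hdfs://namenode/models/churn.pb"},
      ],
  }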