In these samples, the Spotfire Streaming Adapter for WITSML is used to read, write, update, and delete data in a Wellsite Information Transfer Standard Markup Language (WITSML) store.
A running WITSML server must be available for these samples to run against.
To query for specific elements, use the input schema and send in a tuple with every value you want to query on set to a non-null value. For instance, to query for a specific well in Well.sbapp, run the sample and add a tuple on the ReadWellIn input, specifying the uid value and leaving all the other fields blank. This is equivalent to the SQL query select schema fields from well where uid = myuid.
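As an illustration only, that tuple corresponds roughly to a WITSML GetFromStore query document in which only the uid attribute is populated. The namespace and version shown assume a WITSML 1.4.1.1 store, and the adapter builds the actual query document for you, so this is only a sketch of the idea:

    <wells xmlns="http://www.witsml.org/schemas/1series" version="1.4.1.1">
      <!-- Only uid is set; everything else is left blank, which is analogous
           to: select ... from well where uid = 'myuid' -->
      <well uid="myuid"/>
    </wells>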
Before running the samples, open the supplied src/main/configurations/streambase.conf file and edit the parameters applicable to your WITSML server:
- SERVICE_URI: This is the full URI to your WITSML store.
- ENABLE_AUTH: Set this value to true if authentication is required for your WITSML store. Otherwise, leave as false.
- USER: Set this value to the user name required for your WITSML store. Otherwise, leave blank.
- PASS: Set this value to the password required for your WITSML store. Otherwise, leave blank.
- TIMESTAMP_FORMAT: The timestamp format used by the WITSML server. The format supplied must be a valid pattern for SimpleDateFormat (see the example after this list).
- REST_TIMESTAMP_FORMAT: The timestamp format used when sending requests from the web page with a date-time value.
- DELETE_ENABLED: By default, the samples filter out delete requests for data safety. If you want to actually delete data, set this value to true.
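For example (an illustration only, not taken from the sample), many WITSML servers return ISO 8601 timestamps with a zone offset. A value like the one below would be parsed by a SimpleDateFormat pattern such as the one in the comment; confirm the exact pattern against your own server's output before setting TIMESTAMP_FORMAT:

    <dTimCreation>2021-06-15T09:30:45.000+02:00</dTimCreation>
    <!-- An assumed matching SimpleDateFormat pattern for this value:
         yyyy-MM-dd'T'HH:mm:ss.SSSXXX -->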
In StreamBase Studio, import this sample with the following steps:
- From the top-level menu, select File > Import Samples and Community Content.
- Enter witsml to narrow the list of options.
- Select WITSML Read/Write/Delete adapters for well, wellbore, trajectory, and log from the IoT Adapters category.
- Click Import Now.
StreamBase Studio creates a single project containing the sample files.
- In the Project Explorer view, open the sample you just loaded.
  If you see red marks on a project folder, wait a moment for the project to load its features. If the red marks do not resolve themselves after a minute, select the project, right-click, and select Maven > Update Project from the context menu.
- Open the src/main/eventflow/packageName folder.
- Open the Log.sbapp file. This sample uses two sub-modules that perform operations on log data received from the server:
  - convert-log-data-to-key-value-pair.sbapp: This module is an example of how to convert the returned log data into key-value pairs. The data includes the data value along with the mnemonic associated with it. This module can also be updated to include the entire logCurveInfo tuple with each piece of data by changing the AGGREGATE_DEFINITION in the Parameters tab to either of the supplied values, TUPLE_WITH_LOG_CURVE_KEY or TUPLE_WITH_MNEMONIC_KEY.
  - MnemonicReplacements.sbapp: This module is an example of how to replace any mnemonic in the logCurveInfo with predefined mnemonics. This can be useful to standardize mnemonics across multiple systems. A CSV file of key-value pairs, Mnemonics.csv, which contains the replacements to use, is read in at system startup.
- Click the Run button. This opens the SB Test/Debug perspective and starts the module.
- In the Manual Input window, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteLogIn performs a delete operation based on the supplied data.
- Enter some data if required for the input stream and click the Send Data button to send a tuple into the system.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- When done, press F9 or click the Terminate EventFlow Fragment button.
- Continuing in this sample project, open the Well.sbapp file.
- Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- In the Manual Input view, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteWellIn performs a delete operation based on the supplied data.
- Enter some data if required for the selected input stream, and click Send Data to send a tuple into the system.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- When done, press F9 or click the Terminate EventFlow Fragment button.
- Continuing in this sample project, open the Wellbore.sbapp file.
- Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- In the Manual Input view, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteWellboreIn performs a delete operation based on the supplied data.
- Enter some data if required for the selected input stream, and click Send Data to send a tuple into the system.
- In the Output Streams view, see the status and data tuples for the selected operation.
- When done, press F9 or click the Terminate EventFlow Fragment button.
- Continuing in this sample project, open the Trajectory.sbapp file.
- Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- In the Manual Input window, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteTrajectoryIn performs a delete operation based on the supplied data.
- Enter some data if required for the selected input stream, and click Send Data to send a tuple into the system.
- In the Output Streams view, see the status and data tuples for the selected operation.
- When done, press F9 or click the Terminate EventFlow Fragment button.
- Continuing in this sample project, open the webserver.sbapp file.
- Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- Open a web browser and set the location to http://localhost:8080. From there you can query for wells, wellbores, and logs by specifying some criteria. The web server sample also uses port 8081 for all REST data requests.
- When done, press F9 or click the Terminate EventFlow Fragment button.
This sample shows how you can create a query with multiple log requests in a single query call to the server. Each query made to the server requests a specific amount of data, depending on whether the log is index based or time based. The query size can be adjusted by going to the Parameters tab and adjusting the following (an illustrative query sketch follows the list):
- ReadIndexSize: The number of records to request for each log that is index based.
- ReadTimeIndexSeconds: The number of seconds of data to request for each log that is time based.
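Purely to illustrate the difference between the two kinds of logs (this is not the exact query the sample builds), a depth-indexed log is addressed with a startIndex/endIndex range, while a time-indexed log is addressed with startDateTimeIndex/endDateTimeIndex; ReadIndexSize limits how much data is requested per call for the former, and ReadTimeIndexSeconds does the same for the latter. All uids and index values below are made up:

    <logs xmlns="http://www.witsml.org/schemas/1series" version="1.4.1.1">
      <log uidWell="w-1" uidWellbore="wb-1" uid="log-1">
        <!-- Depth-indexed request: ask for a bounded slice of rows. -->
        <startIndex uom="m">1000</startIndex>
        <endIndex uom="m">1050</endIndex>
        <!-- A time-indexed request would instead bound the slice with
             startDateTimeIndex / endDateTimeIndex. -->
      </log>
    </logs>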
- Continuing in this sample project, open the GetLogDataChunks.sbapp file.
- Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- When the application starts, it sends a query to the server to fetch all the start indexes for all the logs available and stores them in a Query Table.
- In the Manual Input window, select the ReadCurrentLogs input stream.
- Click Send Data to send a tuple into the system.
- In the Output Streams view, look for the status and data tuples for the selected operation.
- When done, press F9 or click the Terminate EventFlow Fragment button.
When you load the sample into StreamBase® Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.
Important
Load this sample in StreamBase® Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.
Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:
studio-workspace/sample_adapter_embedded_witsml
See Default Installation Directories for the default location of studio-workspace on your system.