WITSML Adapter Samples

About the Samples

These samples show how the Spotfire Streaming Adapter for WITSML reads, writes, updates, and deletes data in a Wellsite Information Transfer Standard Markup Language (WITSML) store.

A running WITSML server must be available for these samples to run against.

To query for specific elements, use the input schema and send in a tuple with each value you want to query on set to a non-null value. For instance, to query for a specific well in Well.sbapp, run the sample, add a tuple on the ReadWellIn input stream, specify the uid value, and leave all the other fields blank. This is equivalent to the SQL query SELECT <schema fields> FROM well WHERE uid = myuid.
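
Conceptually, only the non-null fields of the input tuple act as selection criteria; the rest of the schema simply defines which fields come back. The following plain-Java sketch (hypothetical field names and a made-up class, not the adapter's API) illustrates that equivalence:

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.StringJoiner;

    public class QueryByExample {
        public static void main(String[] args) {
            // Hypothetical ReadWellIn fields: null means "return this field, do not filter on it".
            Map<String, String> tuple = new LinkedHashMap<>();
            tuple.put("uid", "myuid");
            tuple.put("name", null);
            tuple.put("field", null);

            // Only the non-null fields become selection criteria, like a SQL WHERE clause.
            StringJoiner where = new StringJoiner(" AND ");
            tuple.forEach((fieldName, value) -> {
                if (value != null) {
                    where.add(fieldName + " = '" + value + "'");
                }
            });
            System.out.println("SELECT <schema fields> FROM well WHERE " + where);
        }
    }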

Samples Setup

Before running the samples, open the supplied src/main/configurations/streambase.conf file and edit the following parameters to match your WITSML server.

  1. SERVICE_URI: This is the full URI to your WITSML store.

  2. ENABLE_AUTH: Set this value to true if authentication is required for your WITSML store. Otherwise, leave as false.

  3. USER: Set this value to the required user name for your WITSML store. Otherwise, leave blank.

  4. PASS: Set this value to the required password for your WITSML store. Otherwise, leave blank.

  5. TIMESTAMP_FORMAT: The timestamp format used by the WITSML server. The format supplied must be a valid java.text.SimpleDateFormat pattern (see the sketch after this list).

  6. REST_TIMESTAMP_FORMAT: The timestamp format used when requests sent from the web page include a date-time value.

  7. DELETE_ENABLED: By default, the samples filter out delete requests for data safety. If you want to actually delete data, set this value to true.
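
The TIMESTAMP_FORMAT value must be a valid java.text.SimpleDateFormat pattern. As a quick sanity check before editing streambase.conf, you can round-trip a sample server timestamp in a small standalone program such as the following sketch (the pattern and timestamp shown are placeholders only; substitute the format your WITSML server actually uses):

    import java.text.SimpleDateFormat;

    public class TimestampFormatCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder pattern -- replace with the value you plan to set for TIMESTAMP_FORMAT.
            String pattern = "yyyy-MM-dd'T'HH:mm:ss.SSSXXX";
            SimpleDateFormat format = new SimpleDateFormat(pattern); // throws if the pattern is invalid

            // Parse a sample server timestamp and format the current time to confirm the pattern behaves as expected.
            System.out.println(format.parse("2023-04-01T12:30:45.123+00:00"));
            System.out.println(format.format(new java.util.Date()));
        }
    }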

Importing This Sample into StreamBase Studio

In StreamBase Studio, import this sample with the following steps:

  • From the top-level menu, select File>Import Samples and Community Content.

  • Enter witsml to narrow the list of options.

  • Select WITSML Read/Write/Delete adapters for well, wellbore, trajectory, and log from the IoT Adapters category.

  • Click Import Now.

StreamBase Studio creates a single project containing the sample files.

Running the Log Sample

  1. In the Project Explorer view, open the sample you just loaded.

    If you see red marks on a project folder, wait a moment for the project to load its features.

    If the red marks do not resolve themselves after a minute, select the project, right-click, and select Maven>Update Project from the context menu.

  2. Open the src/main/eventflow/packageName folder.

  3. Open the Log.sbapp file.

    This sample uses two sub-modules that perform operations on log data received from the server:

    1. convert-log-data-to-key-value-pair.sbapp

      This module is an example of how to convert the returned log data into key-value pairs. Each pair includes the data value along with the mnemonic associated with it. This module can also be updated to include the entire logCurveInfo tuple with each piece of data by changing the AGGREGATE_DEFINITION value on the Parameters tab to one of the supplied options, TUPLE_WITH_LOG_CURVE_KEY or TUPLE_WITH_MNEMONIC_KEY.

    2. MnemonicReplacements.sbapp

      This module is an example of how to replace any mnemonic in the logCurveInfo with predefined mnemonics, which can be useful for standardizing mnemonics across multiple systems. A CSV file of key-value pairs, Mnemonics.csv, which contains the replacements to use, is read in at system startup. A plain-Java sketch of both sub-module transformations appears after these steps.

  4. Click the Run button. This opens the SB Test/Debug perspective and starts the module.

  5. In the Manual Input window, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteLogIn performs a delete operation based on the supplied data.

  6. Enter some data if required for the input stream and click the Send Data button to send a tuple into the system.

  7. In the Output Streams view, look for the status and data tuples for the selected operation.

  8. When done, press F9 or click the Terminate EventFlow Fragment button.
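
The two sub-modules described in step 3 are implemented as EventFlow, but the transformations they perform are easy to describe in plain Java. The following sketch (hypothetical sample data and class name, not the modules' actual schemas) pairs each log data value with its mnemonic and applies a replacement of the kind Mnemonics.csv supplies:

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class LogKeyValueSketch {
        public static void main(String[] args) {
            // Hypothetical sample data: mnemonics from logCurveInfo, in column order.
            List<String> mnemonics = List.of("DEPT", "ROP", "HKLD");
            // One row of log data, in the same column order.
            List<Double> row = List.of(1500.25, 42.7, 180.3);

            // Replacements of the kind Mnemonics.csv would supply at startup.
            Map<String, String> replacements = Map.of("HKLD", "HOOKLOAD");

            // Pair each value with its (possibly replaced) mnemonic.
            Map<String, Double> keyValuePairs = new LinkedHashMap<>();
            for (int i = 0; i < mnemonics.size(); i++) {
                String mnemonic = replacements.getOrDefault(mnemonics.get(i), mnemonics.get(i));
                keyValuePairs.put(mnemonic, row.get(i));
            }
            System.out.println(keyValuePairs); // {DEPT=1500.25, ROP=42.7, HOOKLOAD=180.3}
        }
    }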

Running the Well Sample

  1. Continuing in this sample project, open the Well.sbapp file.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.

  3. In the Output Streams view, look for the status and data tuples for the selected operation.

  4. In the Manual Input view, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteWellIn performs a delete operation based on the supplied data.

  5. Enter some data if required for the selected input stream, and click Send Data to send a tuple into the system.

  6. In the Output Streams view, look for the status and data tuples for the selected operation.

  7. When done, press F9 or click the Terminate EventFlow Fragment button.

Running the Wellbore Sample

  1. Continuing in this sample project, open the Wellbore.sbapp file.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.

  3. In the Output Streams view, look for the status and data tuples for the selected operation.

  4. In the Manual Input view, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteWellboreIn performs a delete operation based on the supplied data.

  5. Enter some data if required for the selected input stream, and click Send Data to send a tuple into the system.

  6. In the Output Streams view, see the status and data tuples for the selected operation.

  7. When done, press F9 or click the Terminate EventFlow Fragment button.

Running the Trajectory Sample

  1. Continuing in this sample project, open the Trajectory.sbapp file.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.

  3. In the Output Streams view, look for the status and data tuples for the selected operation.

  4. In the Manual Input window, select an input stream to perform its associated action. Each input stream is labeled with the action it performs. For example, DeleteTrajectoryIn performs a delete operation based on the supplied data.

  5. Enter some data if required for the selected input stream, and click Send Data to send a tuple into the system.

  6. In the Output Streams view, see the status and data tuples for the selected operation.

  7. When done, press F9 or click the Terminate EventFlow Fragment button.

Running the Web Server Sample

  1. Continuing in this sample project, open the webserver.sbapp file.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.

  3. In the Output Streams view, look for the status and data tuples for the selected operation.

  4. Open a web browser and navigate to http://localhost:8080. From there you can query for wells, wellbores, and logs by specifying some criteria. The web server sample also uses port 8081 for all REST data requests. (A programmatic check of the page is sketched after these steps.)

  5. When done, press F9 or click the Terminate EventFlow Fragment button.
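
If you want to confirm the embedded web server from code rather than a browser, a plain HTTP GET against the page on port 8080 is enough, as in the following sketch (it assumes the fragment is still running; the REST calls the page itself issues to port 8081 use request paths defined by the sample and are not shown here):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class WebServerCheck {
        public static void main(String[] args) throws Exception {
            // Fetch the sample's query page served on port 8080.
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8080/")).GET().build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("HTTP " + response.statusCode());
        }
    }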

Running the Get Log Data Chunks Sample

This sample shows how you can create a query with multiple log requests in a single query call to the server. Each query made to the server requests a specific amount of data, depending on whether the log is index based or time index based. You can adjust the query size on the Parameters tab by changing the following parameters (the sketch after this list illustrates the chunking):

  • ReadIndexSize: The number of records to request for each log that is index based.

  • ReadTimeIndexSeconds: The number of seconds of data to request for each log that is time based.
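
Conceptually, each request asks for the next chunk of data beyond the last index the application has recorded for a log. The following sketch (hypothetical values and class name, not the module's actual EventFlow logic) shows how those two parameters bound a chunk for each log type:

    import java.time.Duration;
    import java.time.Instant;

    public class LogChunkSketch {
        // Values standing in for the ReadIndexSize and ReadTimeIndexSeconds parameters.
        static final double READ_INDEX_SIZE = 100.0;
        static final long READ_TIME_INDEX_SECONDS = 300;

        public static void main(String[] args) {
            // Index (depth) based log: request the next READ_INDEX_SIZE units beyond the last index seen.
            double lastIndex = 1500.0;
            System.out.println("Index log chunk: " + lastIndex + " to " + (lastIndex + READ_INDEX_SIZE));

            // Time based log: request the next READ_TIME_INDEX_SECONDS of data beyond the last timestamp seen.
            Instant lastTime = Instant.parse("2023-04-01T12:00:00Z");
            Instant endTime = lastTime.plus(Duration.ofSeconds(READ_TIME_INDEX_SECONDS));
            System.out.println("Time log chunk: " + lastTime + " to " + endTime);
        }
    }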

  1. Continuing in this sample project, open the GetLogDataChunks.sbapp file.

  2. Click the Run button. This opens the SB Test/Debug perspective and starts the module, which is self-running.

  3. In the Output Streams view, look for the status and data tuples for the selected operation.

  4. When the application starts, it sends a query to the server to fetch all the start indexes for all the logs available and stores them in a Query Table.

  5. In the Manual Input window, select the ReadCurrentLogs input stream.

  6. Click Send Data to send a tuple into the system.

  7. In the Output Streams view, look for the status and data tuples for the selected operation.

  8. When done, press F9 or click the Terminate EventFlow Fragment button.

Sample Location

When you load the sample into StreamBase® Studio, Studio copies the sample project's files to your Studio workspace, which is normally part of your home directory, with full access rights.

Important

Load this sample in StreamBase® Studio, and thereafter use the Studio workspace copy of the sample to run and test it, even when running from the command prompt.

Using the workspace copy of the sample avoids permission problems. The default workspace location for this sample is:

studio-workspace/sample_adapter_embedded_witsml

See Default Installation Directories for the default location of studio-workspace on your system.