This section describes a sample connection between two TIBCO Object Service Broker Data Object Brokers (node A and node B) on two Windows machines connected via TCP/IP. The sample would be the same for a Windows to Solaris connection. Node A is the local node and node B is the remote node with peer servers attached. This connection is referred to as an outgoing or outbound connection from node A. Refer to Managing Peer Servers for more information about configuring and operating peer-to-peer distributed data connections.
On node A, specify the following Data Object Broker parameters (a combined sketch follows the list):
NODENAME=nodename, where nodename is the Data Object Broker name. In this sample configuration, NODENAME=A.
PEERS, where remote_node is the Data Object Broker name of the node to which this server is connecting and fslevel should be 2. The following example corresponds to the sample configuration: PEERS=(B,9,10,NTK,2).
MAXUSERS=nn, where nn is the number of users needed to accommodate peer users. Valid values are 1 to 4096. MAXUSERS must be large enough to allow incoming peers to log in. Each incoming peer uses two user slots; for example, if there are 10 incoming peers, 20 MAXUSERS slots should be devoted to incoming peers. In this sample configuration, MAXUSERS=32.
To simplify the procedure, the same port number is used; the port numbers could differ if there is more than one Data Object Broker on a single machine.
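Taken together, the node A Data Object Broker parameters for this sample could be coded as follows (a minimal sketch using only the sample values above; any other parameters your installation requires are not shown):
NODENAME=A
PEERS=(B,9,10,NTK,2)
MAXUSERS=32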
Refer to Attributes Defined for details about the huron.dir entries.
To configure node B:
1. In your Data Object Broker parameter file, specify the following parameters (a combined sketch follows the list):
NODENAME=nodename, where nodename is the Data Object Broker name. In this sample configuration, NODENAME=B.
PEERS, where remote_node is the Data Object Broker name of the node to which this server is connecting and fslevel should be 2. The following example corresponds to the sample configuration: PEERS=(A,9,10,NTK,2).
MAXUSERS=nn, where nn is the number of users needed to accommodate peer users. Valid values are 1 to 4096. MAXUSERS must be large enough to allow incoming peers to log in. Each incoming peer uses two user slots; for example, if there are 10 incoming peers, 20 MAXUSERS slots should be devoted to incoming peers. In this sample configuration, MAXUSERS=32.
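Similarly, the node B Data Object Broker parameters for this sample could be coded as (again, a sketch limited to the sample values):
NODENAME=B
PEERS=(A,9,10,NTK,2)
MAXUSERS=32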
2. Ensure you have the same entries in your Data Object Broker directory file (huron.dir) as node A.
3. In your mon.prm file, specify values for the SERVERS Execution Environment parameter to define the peer servers, using the format SERVERS='numberN sessionnameN', where:
numberN represents the number of inbound connections on all PEER statements, for example, 10.
sessionnameN represents a session defined to the NAME parameter in the session.prm file (refer to step 4), for example, PEERSERV1. It defaults to DEFAULT0 if left blank. If you specify a value on node A, the @PEERSERVERID shareable tool, which is a system-interpreted session table, must contain an entry with that name in the SERVERID field.
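Combining the sample values, the SERVERS entry for this configuration might read as follows (a sketch assuming the 10 inbound connections are handled by the PEERSERV1 session):
SERVERS='10 PEERSERV1'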
4. In the session.prm file, specify the following Execution Environment parameter values:
The name of the session parameter assignment to be used (the NAME parameter), for example, PEERSERV1.
The path for the server log, for example, D:\Ostar\log.
[Optional] The name of the Execution Environment to be used (the EENAME parameter). If EENAME is specified, you must also specify a corresponding NAME parameter in the ee.prm file.
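As an illustration, the session parameter assignment for this sample could include entries along the following lines (a sketch: PEEREE is a hypothetical Execution Environment name, and the parameter that sets the server log path is omitted here; see TIBCO Object Service Broker Parameters for the exact keywords):
NAME=PEERSERV1
EENAME=PEEREE
Because EENAME is specified in this sketch, ee.prm would also need a matching NAME=PEEREE entry.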
Refer to TIBCO Object Service Broker Shareable Tools for details about @PEERSERVERID, and to TIBCO Object Service Broker Parameters for detailed information about the Execution Environment and Data Object Broker parameters.
From node A, you can now access TIBCO Object Service Broker distributed data on node B. You must have identical TIBCO Object Service Broker userids on both nodes with the same security access.