Products | Versions |
---|---|
TIBCO Cloud | - |
Currently, it is not possible to run multiple instances of the hybrid agent (tibagent) for log streaming. Some customers would like to run a separate hybrid agent (tibagent) instance per application for log streaming. In this scenario, customers can either:

1. Run the hybrid agent in a Docker container, as described in https://support.tibco.com/s/article/Running-Hybrid-Agent-in-a-docker-container, or
2. Use the workaround below, based on socat or fluentd.
In this article we share a workaround that allows multiple discovery keys, with multiple hybrid agents (tibagent) running logstream in parallel on one machine. The workaround is demonstrated using socat and fluentd as samples; you can use these or any other data collector tool to write the logs to a file.
Start one agent per application, mapping the local logstream port 7771 to a distinct forward port for each agent:

```
nohup ./tibagent start agent -s 7771:localhost:7772 yourAgent1 &
nohup ./tibagent start agent -s 7771:localhost:7773 yourAgent2 &
nohup ./tibagent start agent -s 7771:localhost:7774 yourAgent3 &
```
Option 1: Start a socat listener explicitly for each port, as below, to write the logstream to the console:

```
socat tcp-listen:7772,reuseaddr,fork -
socat tcp-listen:7773,reuseaddr,fork -
socat tcp-listen:7774,reuseaddr,fork -
```
Option 2: Start a socat listener explicitly for each port, as below, to write the logstream to a file:

```
socat -u tcp-listen:7772,reuseaddr,fork file:agent1.log,append
socat -u tcp-listen:7773,reuseaddr,fork file:agent2.log,append
socat -u tcp-listen:7774,reuseaddr,fork file:agent3.log,append
```
Option 3: Install Fluentd on the server and configure fluent.conf as below, listening on multiple TCP ports (input plugin) with an output plugin writing to a file or to stdout. Then start Fluentd to stream the logs.

```
<source>
  @type tcp
  tag tciapp.logstream.7772
  <parse>
    @type none
  </parse>
  port 7772
</source>
<source>
  @type tcp
  tag tciapp.logstream.7773
  <parse>
    @type none
  </parse>
  port 7773
</source>
<source>
  @type tcp
  tag tciapp.logstream.7774
  <parse>
    @type none
  </parse>
  port 7774
</source>
<match **>
  # @type stdout
  @type file
  path /opt/dev/logstream/logs
</match>
```
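As noted above, any data collector tool that accepts TCP connections and appends the received lines to a file will work in place of socat or fluentd. As an illustration only, the sketch below is a minimal Python equivalent of a single socat file listener; the port 7772 and the file name agent1.log are assumptions taken from the samples above, and you would run one such collector per agent port.

```python
import socket
import socketserver
import threading
import time

LOG_FILE = "agent1.log"  # assumption: one output file per agent port


class LogStreamHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Append every line streamed over the connection to the log file
        with open(LOG_FILE, "ab") as out:
            for line in self.rfile:
                out.write(line)
                out.flush()


# Roughly equivalent to:
#   socat -u tcp-listen:7772,reuseaddr,fork file:agent1.log,append
socketserver.ThreadingTCPServer.allow_reuse_address = True
server = socketserver.ThreadingTCPServer(("localhost", 7772), LogStreamHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Smoke test: stream one line the way an agent would, then stop the server
with socket.create_connection(("localhost", 7772)) as conn:
    conn.sendall(b"sample log line\n")
time.sleep(0.5)  # give the handler time to flush before reading
server.shutdown()
print(open(LOG_FILE).read().strip())  # → sample log line
```

This only illustrates what the collectors do; for production use, prefer socat or fluentd as described above.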
Note:
1. When socat writes to a file, it should run as a background process, and a Linux log rotation tool should be used to stop the file from growing indefinitely.
2. If fluentd is used, its file output plugin handles file rotation out of the box.
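For Note 1, a logrotate rule is one way to cap the growth of the socat output files. This is a sketch only; the file path, size limit, and retention count are assumptions to be adjusted for your environment:

```
# /etc/logrotate.d/tibagent-logstream (sample; path and limits are assumptions)
/opt/dev/logstream/agent*.log {
    daily
    size 100M
    rotate 7
    compress
    missingok
    notifempty
    copytruncate    # rotate in place so the running socat listener keeps writing
}
```

copytruncate is used here because socat keeps the file open; it copies then truncates the file instead of renaming it, so the listener does not need to be restarted after each rotation.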