To create a low-latency external client application:
- In the StreamBase Server, set the system property:
  streambase.low-latency-mode=true
  For details, see the Help under StreamBase Documentation > StreamBase References > StreamBase Java System Properties.
- In the client API (Java, C++, .NET):
  - Call ClientSettings.setTcpNoDelay(true) (Java, C++) before creating any client objects. There is no equivalent API setting for .NET at this time, so you must set the corresponding environment variable in the application's runtime environment.
  - Do NOT turn on buffering (i.e., do not call the StreamBaseClient.enableBuffering() method).
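The TcpNoDelay setting maps to the standard TCP_NODELAY socket option, which disables Nagle's algorithm so that small messages are transmitted immediately instead of being coalesced into larger packets. A minimal plain-Java sketch of that underlying option (standard java.net classes, not the StreamBase client API):

```java
import java.net.ServerSocket;
import java.net.Socket;

public class TcpNoDelayDemo {
    public static void main(String[] args) throws Exception {
        // A throwaway local socket pair, just to demonstrate the option.
        try (ServerSocket server = new ServerSocket(0);
             Socket client = new Socket("localhost", server.getLocalPort())) {
            // Disable Nagle's algorithm: send small writes right away
            // rather than waiting to batch them with later writes.
            client.setTcpNoDelay(true);
            System.out.println("TCP_NODELAY = " + client.getTcpNoDelay());
        }
    }
}
```

The StreamBase client's ClientSettings call applies the same option to the connection it opens to the server, which is why it must be made before any client objects are created.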
If enqueues are infrequent, send a small message at regular intervals, to be recognized and discarded by the recipient, to keep the channel active.
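That keep-alive pattern can be sketched with a scheduled task. Here sendTuple is a hypothetical stand-in for your client's actual enqueue call, and the interval is an arbitrary example value:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class Heartbeat {
    public static void main(String[] args) throws Exception {
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        // Emit a tiny keep-alive tuple every 100 ms while real traffic
        // is sparse; the recipient discards it on arrival.
        scheduler.scheduleAtFixedRate(
                () -> sendTuple("HEARTBEAT"), 100, 100, TimeUnit.MILLISECONDS);

        Thread.sleep(550);   // stand-in for the application's lifetime
        scheduler.shutdown();
    }

    // Placeholder for the real enqueue call on your client object.
    static void sendTuple(String payload) {
        System.out.println("enqueue " + payload);
    }
}
```

In a real client you would cancel or shut down the scheduler when the connection closes, and pick an interval based on how quickly the channel goes idle.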
If you also need high throughput, note that this goal is at odds with low latency: throughput is best achieved by batching messages to amortize transmission overhead, and batching introduces latency. Identify your latency and throughput targets and do not over-optimize for one, or you will harm the other.
When measuring latency:
- Allow for application startup costs. Data may be blocked by concurrent object instantiation, and CPU time may be consumed by JIT compilation of active code paths. For Java, the JIT typically compiles a code path after it has been exercised about 10,000 times, so be sure to warm up the application and discard latency values recorded before JIT compilation completes.
- Take lots of measurements, as individual measurements are often misleading.
- Find the maximum latency at the 90th percentile (discarding the highest 10% of latencies seen) to gauge how well the application will work most of the time.
- Find the absolute maximum latency and decide whether it occurs at a point in processing that would break your application (e.g., not near startup or shutdown).
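The percentile and maximum checks above can be sketched as follows; the latency values here are synthetic, for illustration only:

```java
import java.util.Arrays;

public class LatencyStats {
    public static void main(String[] args) {
        // Synthetic latency samples in microseconds; in practice,
        // collect many measurements after the JIT warm-up period.
        long[] latencies = {120, 95, 110, 400, 105, 98, 102, 3000, 101, 99};

        long[] sorted = latencies.clone();
        Arrays.sort(sorted);

        // 90th percentile: discard the highest 10% of observations.
        int idx = (int) Math.ceil(0.90 * sorted.length) - 1;
        long p90 = sorted[idx];

        // Absolute worst case, to check against application limits.
        long max = sorted[sorted.length - 1];

        System.out.println("p90 = " + p90 + " us");
        System.out.println("max = " + max + " us");
    }
}
```

A large gap between the 90th percentile and the absolute maximum (as in this synthetic data) usually points to occasional stalls, such as GC pauses or JIT compilation, rather than steady-state behavior.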
If you are beating your latency targets with room to spare, consider turning on some batching/buffering to enable higher throughput, if that is desired.