latency +/- jitter seems to imply you flip a coin and choose latency + jitter if heads or latency - jitter if tails, but I'm guessing it's actually a uniform distribution between latency - jitter and latency + jitter? (Or is it a Gaussian or Laplace distribution? I think Laplace is more typical for modeling this kind of noise)
Also what happens if jitter > latency? Is it uniform in the range (0, latency+jitter)? Or is it uniform in the range (latency-jitter, latency+jitter) but with negative values getting clamped to 0? (In the latter case, the probability of having no latency at all is non-negligible)
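To make the two readings concrete, here is a hedged sketch of the second hypothesis (uniform over `[latency - jitter, latency + jitter]` with negative draws clamped to zero). This is one possible interpretation of the docs, not confirmed Toxiproxy behavior; the function name and signature are made up for illustration.

```python
import random

def sample_delay(latency: float, jitter: float) -> float:
    """Hypothesized model: uniform in [latency - jitter, latency + jitter],
    with negative values clamped to 0. NOT confirmed to match Toxiproxy."""
    return max(0.0, random.uniform(latency - jitter, latency + jitter))
```

Under this model, when `jitter > latency` the clamping puts a point mass at zero: `P(delay == 0) = (jitter - latency) / (2 * jitter)`, which is exactly the non-negligible no-latency case described above.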
A realistic network also has some packet loss, which means that every once in a while a message is much slower (2x-3x) than in the common case. That isn't really captured by the latency +/- jitter model, since the distribution is tightly bounded around a single value. To better simulate this, can I add multiple latency toxics to the same proxy, each with a different toxicity level, and will they be sampled independently with their latencies added together?
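The intended composition can be sketched as follows. This assumes each toxic fires independently with probability equal to its toxicity and that delays sum, which is precisely the behavior being asked about, not something the docs confirm; the uniform-with-clamping per-toxic model is likewise an assumption.

```python
import random

def stacked_delay(toxics):
    """Hypothesized composition of several latency toxics on one proxy.

    toxics: list of (latency, jitter, toxicity) tuples. Each toxic fires
    independently with probability `toxicity`; the delays of the toxics
    that fire are added together. Per-toxic delay is modeled as uniform
    in [latency - jitter, latency + jitter], clamped to 0.
    NOT confirmed to match Toxiproxy's actual semantics.
    """
    total = 0.0
    for latency, jitter, toxicity in toxics:
        if random.random() < toxicity:
            total += max(0.0, random.uniform(latency - jitter, latency + jitter))
    return total

# Example: a common-case 50±10 ms toxic plus a rare (2%) extra ~100 ms
# toxic standing in for a retransmission after packet loss.
toxics = [(50, 10, 1.0), (100, 20, 0.02)]
```

If toxics do compose this way, the 2% toxic adds a heavy tail on top of the tightly bounded common-case distribution, which is the effect a single latency +/- jitter toxic cannot produce.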