Add index pattern / date math support to the `index =>` setting #49
Comments
Some background: while I agree that what you propose would be nice, I'm not sure how to expose it to users. I feel that supporting the sprintf format would be confusing because, for example, a field reference wouldn't work. Instead, what about having your query include your desired time range, like `@timestamp:[now-1d TO now]`? |
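A minimal sketch of that suggestion, assuming the elasticsearch input's `query` option in its query-DSL form; the host and index pattern are illustrative:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]   # illustrative
    index => "logstash-*"         # match all daily indices
    # Let the query, not the index name, bound the time range:
    query => '{ "query": { "query_string": { "query": "@timestamp:[now-1d TO now]" } } }'
  }
}
```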
With newer versions of Elasticsearch, we could add some field_stats API magic to determine which indices to query, couldn't we? |
Hello, I just saw this discussion. From a user's point of view it can be very useful to select the data that was inserted on a specific day, instead of relying on the timestamp of the data in the index. E.g. you send the data (logs) from ES to "other infra" with Logstash2 using `@timestamp:[now-1d/d TO now/d]`, and the plugin has the following parameter: `schedule => "0 12 * * * America/Chicago"`. Let's say that Logstash1 stops inserting for 2 days (or less) and then starts again. In that case you will miss some data, or you will need to change the Logstash2 configuration and play with the timestamp, and either way there is a big risk of duplicated or missing data in "other infra". |
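A sketch of the setup described here, using the input's `schedule` and `query` options (hosts and index are illustrative; the schedule and time range are the ones quoted above). If Logstash1 stops for two days and later backfills, events whose @timestamp falls outside the one-day window are never picked up by Logstash2:

```
input {
  elasticsearch {
    hosts    => ["es-host:9200"]                 # illustrative
    index    => "logstash-*"
    schedule => "0 12 * * * America/Chicago"     # run once a day at noon
    # One-day window relative to "now", based on each event's @timestamp:
    query    => '{ "query": { "query_string": { "query": "@timestamp:[now-1d/d TO now/d]" } } }'
  }
}
```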
In line with the statement here in #92, you should additionally support the date math form for index names (Date math support in index names). |
Can confirm that using date math in the `index` setting works. Given that I am in UTC-5, and I ran this test at 7:25AM on 2019-03-28, with:
… the resulting index was created with the proper date math corrected time:
Add 5 hours to 7:25AM, and UTC at time of execution would be 12:25PM. Add 12 hours to that and it's 2019-03-29, as the created index name indicates. Please understand that this approach will force Elasticsearch to perform the date math calculation on every single event you send. The other way, Logstash does the work. I'm not sure what extra CPU cost this incurs, but it is a calculation, so there is some cost, even if it is negligible. |
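For comparison, a sketch of the two places the date calculation can happen; hosts are illustrative, and the exact escaping of the date math name may vary depending on where it ends up (URL path versus request body):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Date math name: Elasticsearch resolves "now" for every request it receives
    index => "<logstash-{now/d}>"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # sprintf pattern: Logstash expands it from each event's @timestamp before sending
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```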
I know, I shipped to Elasticsearch rather than reading from Elasticsearch. It should work the same, however. I will test reading again right now, now that I've created an index dated in the future. |
Confirmed. I added another document:
…and changed the Logstash config to:
The results were clear:
Logstash configs can and do support date math in the `index` setting. |
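What an input along those lines might look like, with the date math name `<logstash-{now/d}>` URL-encoded (`<` as %3C, `>` as %3E, `{` as %7B, `}` as %7D, `/` as %2F); the host is illustrative:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    # URL-encoded <logstash-{now/d}>; Elasticsearch resolves this to today's daily index
    index => "%3Clogstash-%7Bnow%2Fd%7D%3E"
  }
}
```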
My unsuccessful attempts today have this content. Can you check the multi-index case, please? |
I tried this config:
…and this was the result:
When I try it with hard-coded names:
…this is the result:
And when I try URL-encoding the date math:
…this is the result:
|
It should be noted, though, that multiple date math in the query string doesn't work in Elasticsearch, and gives the exact same error that Logstash does:
…results in:
This suggests to me that what you are asking for is not even supported by Elasticsearch. |
Oh, my mistake: I had the WRONG position. Done CORRECTLY, it works.
Great, this rocks :-) Thanks a lot! |
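For reference, the multi-index form described in the Elasticsearch date math documentation wraps each index in its own `<...>` and URL-encodes the special characters. Whether this is exactly the positioning fix referred to above is an assumption, but a value of this shape targets yesterday's and today's indices:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    # <logstash-{now/d-1d}>,<logstash-{now/d}> with each name in its own <...>, URL-encoded
    # (the comma may also need encoding as %2C, depending on how the client builds the URL)
    index => "%3Clogstash-%7Bnow%2Fd-1d%7D%3E,%3Clogstash-%7Bnow%2Fd%7D%3E"
  }
}
```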
This discussion helped me. Thank you. I appreciate it. |
Currently the `index =>` setting does not seem to support index patterns like `logstash-%{+YYYY.MM.dd}` the way the ES output does. Having index pattern or date math support could be useful in some cases, e.g. if LS is running for batch processing.
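A minimal sketch of the kind of configuration in question, assuming the elasticsearch input plugin; the host is illustrative and the pattern is the one quoted above:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]   # illustrative
    # The elasticsearch *output* expands this sprintf pattern per event;
    # the input does not expand it, which is what this issue asks for.
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```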
Running with this config results in: