It can be 50x slower. See the following benchmark:
```julia
julia> using Dates, NanoDates

julia> using BenchmarkTools: @btime

julia> tstr = "2024-09-23T19:32:06.202111"

julia> @btime NanoDate($tstr)
  3.261 μs (45 allocations: 1.93 KiB)
2024-09-23T19:32:06.202111

julia> tstr23 = tstr[1:23]

julia> @btime DateTime.($tstr23, dateformat"yyyy-mm-ddTHH:MM:SS.sss")
  61.379 ns (0 allocations: 0 bytes)
2024-09-23T19:32:06.202
```
Any way to improve it? It becomes significant when you have a column with millions of timestamp strings to be converted.
For the specific date format you show, `"2024-09-23T19:32:06.202111"`, you can use this:
```julia
using Dates, NanoDates

function MicroDate(str)
    n = length(str)
    n !== 26 && throw(ErrorException("bad NanoDate string (length ($n) != 26)"))
    dt = DateTime(str[1:23])
    NanoDate(dt, parse(Int, str[24:26]) * 1000)
end
```
```julia
using Dates, NanoDates, BenchmarkTools

function MicroDate(str)
    n = length(str)
    n !== 26 && throw(ErrorException("bad NanoDate string (length ($n) != 26)"))
    dt = DateTime(str[1:23])
    NanoDate(dt, parse(Int, str[24:26]) * 1000)
end

microsecs = "2024-09-23T19:32:06.202111"
millisecs = "2024-09-23T19:32:06.202"
```

```julia
# compare
julia> @btime DateTime(millisecs)
  71.914 ns (1 allocation: 16 bytes)
2024-09-23T19:32:06.202

julia> @btime MicroDate(microsecs)
  79.115 ns (3 allocations: 96 bytes)
2024-09-23T19:32:06.202111
```
Handling 0..9 subsecond digits just as quickly is possible. Doing the same for strings with time zone offsets is open for a PR submission.
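A sketch of how handling 0..9 subsecond digits could look, generalizing the `MicroDate` approach above. The function name `parse_nanodate` is hypothetical; the sketch assumes the `NanoDate(::DateTime, nanosecs)` constructor used above, where `nanosecs` is the sub-millisecond remainder in nanoseconds, and it does not handle time zone offsets.

```julia
using Dates, NanoDates

# Hypothetical sketch: parse an ISO-style timestamp with 0..9 subsecond
# digits (no time zone offset) into a NanoDate.
function parse_nanodate(str::AbstractString)
    dotpos = findfirst('.', str)
    # No fractional part, or up to 3 digits: DateTime parses this directly.
    ndigits = dotpos === nothing ? 0 : length(str) - dotpos
    ndigits <= 9 ||
        throw(ArgumentError("expected at most 9 subsecond digits, got $ndigits"))
    ndigits <= 3 && return NanoDate(DateTime(str))
    # Otherwise parse through the millisecond digits, then scale the
    # remaining 1..6 digits to a nanosecond remainder below the millisecond.
    msend = dotpos + 3
    dt = DateTime(str[1:msend])
    subms = parse(Int, str[msend+1:end])
    nanos = subms * 10^(9 - ndigits)
    NanoDate(dt, nanos)
end
```

For the six-digit case this reduces to exactly the `MicroDate` arithmetic: `"...202111"` splits into `DateTime("...202")` plus `111 * 1000` nanoseconds.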
Thank you for noticing this -- I will fold the faster approach into the main code with the next release.
Thank you! I'm currently using a workaround just like your MicroDate. :)