
Millisecond precision by default #207

Open · vassilevsky opened this issue Nov 26, 2017 · 6 comments
@vassilevsky (Contributor)

Hello 🙂

I think that, for any serious application instrumentation, millisecond precision is necessary. It is a good middle ground between second (too big) and nanosecond (too small) precisions.

Making it the default will simplify setup for new apps. It will also help avoid surprises: when an app sends several events per second but only one of them is preserved (the default behavior), it is quite a surprise. It was certainly one for me.

What do you think?
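To illustrate the surprise with a minimal sketch (measurement and database names are hypothetical; this assumes the gem's write_point API with second-precision integer timestamps):

```ruby
require "influxdb"

influxdb = InfluxDB::Client.new("metrics") # hypothetical database name

# Both writes fall into the same wall-clock second, so they carry the same
# timestamp. InfluxDB treats a point with an identical measurement, tag set,
# and timestamp as an update of the earlier point, so only one survives.
2.times do
  influxdb.write_point("events", values:    { value: 1 },
                                 timestamp: Time.now.to_i)
end
```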

@dmke (Contributor) commented Nov 26, 2017

I agree: when I first used this gem, I was surprised as well. I don't believe millisecond precision to be a good middle ground, though, as it would surprise those who expect either a language-native or server-native precision. Because of this, if I were to change the default, I'd change it to nanoseconds.

However, there are currently hundreds of active users of this gem (discounting 95% of the 12k downloads on rubygems.org), so I'm hesitant to change any defaults. We have a config option (time_precision) and a »Note About Time Precision« in the README to counter exactly this pitfall.
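For reference, a minimal sketch of that option (database name hypothetical); note that an explicit timestamp must then be an integer in the configured precision:

```ruby
require "influxdb"

# Opt in to millisecond timestamps instead of the second-precision default.
influxdb = InfluxDB::Client.new("metrics", time_precision: "ms")

influxdb.write_point("events", values:    { value: 1 },
                               timestamp: (Time.now.to_f * 1_000).to_i)
```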

dmke added this to the 1.0 milestone Nov 26, 2017
@dmke (Contributor) commented Nov 26, 2017

Let me mark this for the 1.0 release.

@vassilevsky (Contributor, Author)

Thanks!

What do you mean by "language-native" precision though? Microseconds? That's what I get on macOS.

@dmke (Contributor) commented Nov 27, 2017

Language-native means seconds in the case of Ruby (Time.new.to_i = seconds since the Epoch, albeit natively a »number of seconds with fraction«). For JavaScript, this would be milliseconds (+new Date()), and for Go it's nanoseconds (time.Now().UnixNano()).

Server-native refers to the language-native precision of the InfluxDB server (which is written in Go).
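For reference, a quick sketch of what Ruby itself exposes (return values illustrative):

```ruby
t = Time.now
t.to_i # => 1511712000        -- whole seconds since the Epoch
t.to_f # => 1511712000.123456 -- seconds with fraction (Float)
t.nsec # => 123456789         -- nanoseconds within the current second
```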

Sidenote: I recently dug into the GitLab source code and found this gem of a method for retrieving the current system time without much overhead: Gitlab::Metrics::System#monotonic_time, which uses Process.clock_gettime and seems to be about 2× faster than Time.now.to_f.
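A quick sketch of how one might check that claim with the stdlib Benchmark module (absolute numbers will vary by machine and Ruby version):

```ruby
require "benchmark"

n = 1_000_000
Benchmark.bm(15) do |x|
  # Allocates a Time object on every call:
  x.report("Time.now.to_f") { n.times { Time.now.to_f } }
  # Returns a Float directly, no Time allocation:
  x.report("clock_gettime") { n.times { Process.clock_gettime(Process::CLOCK_REALTIME) } }
  # Monotonic clock, suited for measuring intervals:
  x.report("monotonic")     { n.times { Process.clock_gettime(Process::CLOCK_MONOTONIC) } }
end
```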

@vassilevsky (Contributor, Author)

Yorick delivers, as usual.

@stevebissett

Just adding an extra voice to this. I recently got bitten by inaccurate metrics.

I've been using InfluxDB for a number of years, and most of the metrics captured were at second resolution. After capturing more frequent data, I discovered that points were being dropped.

I'd definitely be in favour of setting the default to milliseconds or microseconds.

Also, if the default is changed, it would be a bonus not to have to specify the timestamp manually for each point written; a sketch of a workaround follows.
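As a sketch of how that could be approximated today, a small hypothetical wrapper that stamps each point at millisecond precision before handing it to the client:

```ruby
require "influxdb"

# Hypothetical helper: fills in a millisecond timestamp so call sites
# don't have to. Expects a client configured with time_precision: "ms".
class StampedWriter
  def initialize(client)
    @client = client
  end

  def write_point(series, data)
    data[:timestamp] ||= Process.clock_gettime(Process::CLOCK_REALTIME, :millisecond)
    @client.write_point(series, data)
  end
end

writer = StampedWriter.new(InfluxDB::Client.new("metrics", time_precision: "ms"))
writer.write_point("events", values: { value: 1 })
```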
