
This thread is about infra monitoring, where any of the potential performance differences probably don't matter at all. If they do, show us.

I don't work for any time series database provider.



Why shouldn't performance matter in this case? The more instances, the more time series and datapoints you have, and the more you care about query speed when you need to visualize, for example, week-over-week (WoW) changes in, say, memory usage.
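For illustration, a minimal Python sketch of the kind of week-over-week comparison described above; the timestamps, values, and function names are all made up:

    from datetime import datetime, timedelta

    def value_at(points, when):
        """Most recent value in `points` at or before `when`."""
        return max((t, v) for t, v in points if t <= when)[1]

    def week_over_week_pct(points, now):
        """Percent change vs. the same metric one week earlier."""
        cur = value_at(points, now)
        prev = value_at(points, now - timedelta(days=7))
        return 100.0 * (cur - prev) / prev

    # Made-up memory-usage samples, one per day, in bytes, slowly growing.
    now = datetime(2023, 6, 15)
    points = [(now - timedelta(days=d), 2e9 + (14 - d) * 1e7)
              for d in range(14, -1, -1)]
    print(f"WoW change: {week_over_week_pct(points, now):+.1f}%")

At the scale of millions of series, the database has to do this windowed lookup per series, which is where query speed starts to bite.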


Consider the case at hand, involving streaming charts and alerts. There will be zero perceptible difference in the streaming charts regardless of which database is used. Alerts won't be affected by whatever millisecond difference there may be, and I don't think it matters to any developer or manager awaiting the 3am call.


Time series database performance is not only about the reads.
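To make that concrete, a toy Python sketch with assumed numbers (instance count, series cardinality, and scrape interval are all invented) showing how the write side alone scales:

    # Assumed numbers for illustration only.
    instances = 5_000          # monitored hosts
    series_per_instance = 200  # cpu, memory, disk, per-container, ...
    scrape_interval_s = 15     # one sample per series every 15s

    # Sustained ingest rate, before any dashboard or alert query runs.
    datapoints_per_s = instances * series_per_instance / scrape_interval_s
    print(f"{datapoints_per_s:,.0f} datapoints/s written continuously")

That's roughly 66,667 datapoints per second of continuous write load in this made-up setup, independent of any read traffic.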



