More plugins - what do you want/need? #6
Diff plugin that would allow us to create a series of CSV/JSON files and display performance change over time.
Thanks for your input @hauleth! What do you mean by "display change over time"? Do you mean just some console output ranking the results by which is fastest, or do you want a graph?
Let's imagine you are creating a library and you want to compare the speed of different versions. It would be great to have some way to display a graph of changes in the same functionality over time. I am trying to write something like rails/rails-perftest for Phoenix/Plug, and it would be great to have the possibility to graph performance changes over time.
It's a nice idea and I know other people want to work on similar things. It's definitely possible. My only caveat is that, to be accurate, the runs would need to be done on the same system, with the same load, the same dependencies, and approximately the same data in the DB etc. It might be more accurate to check out the different revisions and then benchmark them in one go on that system... that would also need some serialization and graphing from there, so the work for a benchee plugin is the same 🎉 Thanks for telling me the use case, it makes developing something to aid your cause much easier. Although, no promises on a when :)
Unless I'm missing an obvious plugin which already exists, it'd be really cool to have an ExUnit-style set of macros that would allow writing multiple benchmark scenarios in a style similar to our tests.
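For comparison, this is roughly what grouping scenarios looks like with plain Benchee today (the module and function names are made up for the example); the idea above would wrap something like this in an ExUnit-style macro DSL:

```elixir
defmodule MyLib.StringBench do
  # Plain Benchee: scenarios are entries in a map passed to Benchee.run/2.
  # An ExUnit-style plugin would replace this with benchmark/scenario macros.
  def run do
    Benchee.run(%{
      "iodata"        => fn -> IO.iodata_to_binary(["foo", "bar", "baz"]) end,
      "concatenation" => fn -> "foo" <> "bar" <> "baz" end
    })
  end
end
```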
Rust's criterion-rs will automatically show you the diff between the current and the last run, making it easy to check whether code changes provide performance improvements. Maybe I'm missing something, but I don't see an equivalent with Benchee.
I have written a little bit of code to compare benchmarks between branches. For now, it is just a dirty alpha version.
This can be added to the benchee config.
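As a rough sketch of how such a comparison formatter could be wired in (the `BranchDiff.Formatter` module and its `compare_to:` option are placeholders standing in for the code mentioned above, not a published package), recent Benchee versions accept formatter modules or `{module, options}` tuples in the `formatters:` config:

```elixir
# Sketch only: BranchDiff.Formatter and compare_to: are placeholder names,
# standing in for the branch-comparison code mentioned above.
Benchee.run(
  %{
    "flat_map"    => fn -> Enum.flat_map(1..1_000, &[&1, &1 * 2]) end,
    "map.flatten" => fn -> 1..1_000 |> Enum.map(&[&1, &1 * 2]) |> List.flatten() end
  },
  formatters: [
    Benchee.Formatters.Console,
    {BranchDiff.Formatter, compare_to: "main"}
  ]
)
```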
@yuhama there is the Saving & Loading feature - you can save the previous results, tag them, and load them back into your benchmarking suite, where they will be part of the comparison. We don't show explicit differences beyond that - I'd be happy to get input on how to do that/what to do. Criterion is on my radar though, and getting all their statistical stuff in is something I want to do, but I might need to brush up on my statistics for it :) Does this help? :) @NickNeck thanks for sharing!
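For reference, a minimal sketch of that Saving & Loading flow (the file path and tag names are made up for the example):

```elixir
# First run (e.g. on the main branch): save the results under a tag.
Benchee.run(
  %{"flat_map" => fn -> Enum.flat_map(1..1_000, &[&1, &1 * 2]) end},
  save: [path: "benchmarks/runs.benchee", tag: "main"]
)

# Later run (e.g. on a feature branch): load the saved results so they
# appear alongside the new measurements in the comparison output.
Benchee.run(
  %{"flat_map" => fn -> Enum.flat_map(1..1_000, &[&1, &1 * 2]) end},
  load: "benchmarks/runs.benchee"
)
```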
If you have ideas for further plugins, open a new issue or share them here :)