Chart throughput on parameterized benchmark page by default? #9
Comments
Hey, thanks for trying Criterion.rs, and thanks for the suggestion. Yeah, that's reasonable. We could probably skip adding configuration (at least for now) and assume that anyone who configures a throughput metric on their benchmarks at all is probably more interested in the throughput than the execution time. I probably won't get around to implementing this right away, but pull requests would be welcome.
I would be willing to try and tackle this since I'd like to have this feature too. Can @bheisler maybe give me a pointer to the relevant function that I need to modify for this?
I think you'd need to modify more than one function... For now, let's scope this to just adding throughput charts to the per-benchmark reports. Reporting throughput on the summary reports raises a lot of complicated questions and edge cases (Would you want to have both throughput and execution time on the summaries? Should the violin plots show throughput instead of execution time? What if some of the benchmarks in a group have no throughput? What if they have different kinds of throughput?).
This will need to work with the custom measurements feature I've added to 0.3.0, so you'll need to build on top of that. Yeah, this isn't a trivial feature to add, partly because Criterion.rs' internal code isn't as clean as I'd like and partly because it interacts with some other features currently in development.
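Custom measurements let a benchmark record something other than wall-clock time, which is why throughput charting can't simply assume the raw values are nanoseconds. The sketch below is a simplified illustration of where a measurement-aware layer comes in; it is not Criterion.rs's actual measurement trait, just a stand-in to show the idea.

```rust
// Illustrative sketch only: a simplified stand-in for a custom measurement
// abstraction. Criterion.rs's real trait has a larger interface; this just
// shows why throughput reporting must go through the measurement instead of
// assuming wall-clock nanoseconds.

/// A measurement produces raw per-iteration values (nanoseconds, cycles, ...).
trait SimpleMeasurement {
    /// Unit label for the raw values.
    fn unit(&self) -> &'static str;
    /// Convert a raw value into seconds, assuming the measurement is time-like.
    fn to_seconds(&self, raw: f64) -> f64;
}

/// Wall-clock time measured in nanoseconds (the default case).
struct WallTime;

impl SimpleMeasurement for WallTime {
    fn unit(&self) -> &'static str {
        "ns"
    }
    fn to_seconds(&self, raw: f64) -> f64 {
        raw * 1e-9
    }
}

/// Elements-per-second throughput only makes sense once the measurement
/// tells us how to interpret its raw values.
fn elements_per_second<M: SimpleMeasurement>(m: &M, elements: u64, raw: f64) -> f64 {
    elements as f64 / m.to_seconds(raw)
}

fn main() {
    // 1_000 elements processed in 2_000_000 ns -> 500_000 elements per second.
    let eps = elements_per_second(&WallTime, 1_000, 2_000_000.0);
    println!("{} elements/s (raw unit: {})", eps, WallTime.unit());
}
```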
Thanks, that will be very helpful. I will try to look into these pointers and see what I can come up with on the 0.3 branch.
This is mostly future-proofing for #149, but it does allow format_throughput to be implemented in terms of scale_throughputs.
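As a rough illustration of that relationship, a format_throughput-style helper can be written in terms of a scale_throughputs-style method. The sketch below uses illustrative types, signatures, and unit scaling rather than Criterion.rs's actual ValueFormatter API.

```rust
// Sketch only: formatting a single throughput value is just the one-element
// special case of scaling a slice of values. Not the real Criterion.rs code.

/// The kinds of throughput a benchmark can declare.
enum Throughput {
    Bytes(u64),
    Elements(u64),
}

/// Scale raw per-iteration times (assumed here to be nanoseconds) into
/// throughput units in place, returning the unit label that was chosen.
fn scale_throughputs(throughput: &Throughput, values: &mut [f64]) -> &'static str {
    match *throughput {
        Throughput::Bytes(bytes) => {
            for v in values.iter_mut() {
                *v = bytes as f64 / *v; // bytes per nanosecond == GB per second
            }
            "GB/s"
        }
        Throughput::Elements(elements) => {
            for v in values.iter_mut() {
                *v = elements as f64 / (*v * 1e-9); // elements per second
            }
            "elem/s"
        }
    }
}

/// Format one value by delegating to the slice-scaling helper.
fn format_throughput(throughput: &Throughput, nanos: f64) -> String {
    let mut values = [nanos];
    let unit = scale_throughputs(throughput, &mut values);
    format!("{:.4} {}", values[0], unit)
}

fn main() {
    // 4096 bytes processed in 2048 ns -> 2 GB/s.
    println!("{}", format_throughput(&Throughput::Bytes(4096), 2048.0));
    // 1_000 elements processed in 1_000_000 ns -> 1_000_000 elem/s.
    println!("{}", format_throughput(&Throughput::Elements(1_000), 1_000_000.0));
}
```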
I am using Criterion to benchmark a function that operates on a vector of elements, using something like:
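A representative sketch of that kind of benchmark (not the original snippet; the process function and the input sizes are placeholders) using Criterion's benchmark-group API with a per-input Throughput::Elements value:

```rust
use criterion::{criterion_group, criterion_main, BenchmarkId, Criterion, Throughput};

// Placeholder for the function under test: operates on a vector of elements.
fn process(data: &[u64]) -> u64 {
    data.iter().sum()
}

fn bench_process(c: &mut Criterion) {
    let mut group = c.benchmark_group("process");
    for &size in &[1_024u64, 16_384, 262_144] {
        let data: Vec<u64> = (0..size).collect();
        // Tell Criterion how many elements each iteration handles, so it can
        // report throughput (elements/second) in addition to execution time.
        group.throughput(Throughput::Elements(size));
        group.bench_with_input(BenchmarkId::from_parameter(size), &data, |b, data| {
            b.iter(|| process(data))
        });
    }
    group.finish();
}

criterion_group!(benches, bench_process);
criterion_main!(benches);
```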
The benchmark overview page shows the duration that each iteration took, but that number is a bit useless on its own. The thing I'm really interested in over time is the throughput of that function, which is only given under "Additional Statistics" on the details page.
It would be really nice if the benchmark could be configured to show that throughput on the parameterized benchmark's overview page by default.